110 research outputs found

    Influence, information and team outcomes in large scale software development


    Social aspects of collaboration in online software communities


    Progress in operational modeling in support of oil spill response

    Following the 2010 Deepwater Horizon accident, a massive blow-out in the Gulf of Mexico, scientists from government, industry, and academia collaborated to advance oil spill modeling and share best practices in model algorithms, parameterizations, and application protocols. This synergy was greatly enhanced by research funded under the Gulf of Mexico Research Initiative (GoMRI), a 10-year enterprise that allowed unprecedented collection of observations and data products, novel experiments, and international collaborations that focused on the Gulf of Mexico but produced scientific findings and tools of broader value. Operational oil spill modeling benefited greatly from research during the GoMRI decade. This paper provides a comprehensive synthesis of the related scientific advances, remaining challenges, and future outlook. Two main modeling components are discussed, ocean circulation models and oil spill models, covering the attributes that contribute to the success and limitations of the integrated oil spill forecasts. These forecasts are discussed in tandem with uncertainty factors and methods to mitigate them. The paper focuses on operational aspects of oil spill modeling and forecasting, including examples of international operational center practices, observational needs, communication protocols, and promising new methodologies.
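
    The integrated forecasts described above couple an ocean circulation model, which supplies the current fields, with an oil spill model that advects and weathers the oil. As a rough, hypothetical illustration only (not code from any GoMRI model), the sketch below shows the core of a Lagrangian trajectory step: advection by a surface current plus a random-walk term standing in for unresolved turbulent diffusion. The current field, diffusivity, time step, and particle count are all assumed placeholder values.

    import numpy as np

    # Minimal Lagrangian trajectory sketch (illustrative only).
    # Each oil "particle" is advected by a surface current and spread by a
    # random walk that stands in for unresolved turbulent diffusion.

    def current_at(x, y, t):
        # Hypothetical surface current (m/s); a real forecast would
        # interpolate fields supplied by an ocean circulation model.
        u = 0.3 + 0.1 * np.sin(2.0 * np.pi * t / 86400.0)   # eastward
        v = 0.05 * np.cos(2.0 * np.pi * t / 86400.0)        # northward
        return u, v

    def step(x, y, t, dt=600.0, kh=10.0, rng=np.random.default_rng(0)):
        # Advance particle positions (metres) by one time step.
        # kh is a horizontal diffusivity (m^2/s), chosen arbitrarily here.
        u, v = current_at(x, y, t)
        x = x + u * dt                      # advection by the resolved current
        y = y + v * dt
        sigma = np.sqrt(2.0 * kh * dt)      # random-walk spread for sub-grid mixing
        x = x + rng.normal(0.0, sigma, size=x.shape)
        y = y + rng.normal(0.0, sigma, size=y.shape)
        return x, y

    # Release 1,000 particles at the origin and track them for 24 hours.
    x = np.zeros(1000)
    y = np.zeros(1000)
    for n in range(144):                    # 144 steps of 10 minutes
        x, y = step(x, y, t=n * 600.0)
    print("mean drift after 24 h (km):", x.mean() / 1e3, y.mean() / 1e3)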

    2015 Oil Observing Tools: A Workshop Report

    Since 2010, the National Oceanic and Atmospheric Administration (NOAA) and the National Aeronautics and Space Administration (NASA) have provided satellite-based pollution surveillance in United States waters to regulatory agencies such as the United States Coast Guard (USCG). These technologies provide agencies with useful information regarding possible oil discharges. Unfortunately, there has been confusion as to how to interpret the images collected by these satellites and other aerial platforms, which can generate misunderstandings during spill events. Remote sensor packages on aircraft and satellites have advantages and disadvantages vis-à-vis human observers, because they do not “see” features or surface oil the same way. In order to improve observation capabilities during oil spills, applicable technologies must be identified and then evaluated with respect to their advantages and disadvantages for the incident. In addition, differences between sensors (e.g., visual, IR, multispectral sensors, radar) and platform packages (e.g., manned/unmanned aircraft, satellites) must be understood so that reasonable approaches can be taken where applicable, and any data must be correctly interpreted for decision support. NOAA convened an Oil Observing Tools Workshop to focus on the above actions and to identify training gaps in oil spill observation and remote sensing interpretation, in order to improve future oil surveillance, observation, and mapping during spills. The Coastal Response Research Center (CRRC) assisted NOAA’s Office of Response and Restoration (ORR) with this effort. The workshop was held on October 20-22, 2015 at NOAA’s Gulf of Mexico Disaster Response Center in Mobile, AL. The expected outcome of the workshop was an improved understanding, and greater use, of technology to map and assess oil slicks during actual spill events. Specific workshop objectives included:
    • Identify new developments in oil observing technologies useful for real-time (or near real-time) mapping of spilled oil during emergency events.
    • Identify merits and limitations of current technologies and their usefulness to emergency response mapping of oil and reliable prediction of oil surface transport and trajectory forecasts. Current technologies include the traditional human aerial observer, unmanned aircraft surveillance systems, aircraft with specialized sensor packages, and satellite earth observing systems.
    • Assess training needs for visual observation (human observers with cameras) and sensor technologies (including satellites) to build skills and enhance proper interpretation for decision support during actual events.

    Proceedings of the ECCS 2005 satellite workshop: embracing complexity in design - Paris 17 November 2005

    Embracing complexity in design is one of the critical issues and challenges of the 21st century. As the realization grows that design activities and artefacts display properties associated with complex adaptive systems, so grows the need to use complexity concepts and methods to understand these properties and inform the design of better artefacts. It is a great challenge because complexity science represents an epistemological and methodological shift that promises a holistic approach to the understanding and operational support of design. But design is also a major contributor to complexity research. Design science is concerned with problems that are fundamental in the sciences in general and the complexity sciences in particular. For instance, design has been perceived and studied as a ubiquitous activity inherent in every human activity, as the art of generating hypotheses, as a type of experiment, or as a creative co-evolutionary process. Design science and its established approaches and practices can be a great source of advancement and innovation in complexity science. These proceedings are the result of a workshop organized as part of the activities of a UK government AHRB/EPSRC funded research cluster called Embracing Complexity in Design (www.complexityanddesign.net) and the European Conference on Complex Systems (complexsystems.lri.fr).

    Meta-standardisation of Interoperability Protocols

    The current medley of interoperability protocols is potentially problematic. Each protocol is designed by a different group, each provides a single service, and each has its own syntax and vocabulary. Popular protocols such as RSS are designed with simple and easy-to-understand documentation, which is a key factor in their high adoption levels. But the majority of protocols are complex, making them relatively difficult for programmers to understand and implement. This research proposes a possible new direction for the design of high-level interoperability protocols. The High-level Interoperability Protocol - Common Framework (HIP-CF) is designed and evaluated as a proof of concept that if interoperability is made simpler, adoption levels can increase, making it easier for programmers to understand and implement protocols and therefore leading to more interoperable systems. HIP-CF is not suggested as an alternative to current production protocols. Rather, it is suggested that the design approach taken by HIP-CF can be applied to other protocols, and that a suite of simpler protocols is a better solution than various simple individual protocols. Evaluation results show that current protocols can be substantially improved upon. These improvements could, and perhaps should, be the result of a deeper analysis of the goals of today’s protocols and of collaboration amongst the different groups that design high-level interoperability protocols. This research presents a new approach and suggests future experimental research options for the field of high-level interoperability protocol design.
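
    The point the abstract makes about RSS is easy to see in practice: a feed is shallow XML with a small, fixed vocabulary, so a consumer needs nothing beyond a standard XML parser. The snippet below is a minimal illustration of that simplicity; the feed URL is a placeholder, and HIP-CF itself is not shown because its syntax is defined in the thesis rather than in this abstract.

    import urllib.request
    import xml.etree.ElementTree as ET

    # Minimal RSS 2.0 consumer: the whole "protocol" is a handful of
    # well-known element names, which is the simplicity the abstract
    # credits for RSS's high adoption.

    FEED_URL = "https://example.org/feed.xml"   # placeholder feed URL

    with urllib.request.urlopen(FEED_URL) as resp:
        root = ET.fromstring(resp.read())

    # RSS 2.0 nests entries under <rss><channel><item>.
    for item in root.findall("./channel/item"):
        title = item.findtext("title", default="(no title)")
        link = item.findtext("link", default="")
        print(f"{title}\n  {link}")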

    On the Use of Process Trails to Understand Software Development

    Full text link

    VideoWall Bench: A Benchmark for Evaluating Hardware Accelerated Video Decoding on Linux

    VideoWall Bench is a benchmark script for evaluating video decoding capabilities using hardware acceleration on Linux. Intel introduced the Video Acceleration API (VA-API), which gives applications access to graphics hardware for accelerated decoding, and VA-API exposes a set of video decoders (codecs) for the H.264 video standard. The script implements a video wall methodology, in which multiple videos are decoded simultaneously. Using this method, users can stress the multiple-stream decoding capabilities of a platform while measuring processor usage for the decoding process. VideoWall Bench benchmarks video decoding performance by measuring processor utilization, memory utilization, total frame rate per second (FPS), and timing fluctuation in the video decoding process. Additionally, VideoWall Bench also includes a set …
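
    The abstract does not reproduce the script itself, but the video wall methodology it describes (launching several hardware-accelerated decodes in parallel and sampling system load while they run) can be sketched roughly as below. This is an illustrative stand-in rather than the actual VideoWall Bench code: the ffmpeg VA-API options are standard, but the clip name, stream count, and render node path are assumptions.

    import subprocess
    import time

    import psutil   # third-party package, used here for CPU/memory sampling

    # Rough sketch of the video wall idea: decode the same H.264 clip in
    # several parallel ffmpeg processes via VA-API and sample system load
    # while they run.  Clip, stream count, and render node are placeholders.

    CLIP = "sample_h264_1080p.mp4"           # assumed test clip
    RENDER_NODE = "/dev/dri/renderD128"      # typical VA-API render node
    N_STREAMS = 4                            # size of the "wall"

    cmd = [
        "ffmpeg", "-hwaccel", "vaapi", "-hwaccel_device", RENDER_NODE,
        "-i", CLIP, "-f", "null", "-",       # decode only, discard output
    ]

    procs = [
        subprocess.Popen(cmd, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
        for _ in range(N_STREAMS)
    ]

    start = time.time()
    cpu_samples = []
    while any(p.poll() is None for p in procs):
        cpu_samples.append(psutil.cpu_percent(interval=1.0))

    elapsed = time.time() - start
    mean_cpu = sum(cpu_samples) / max(len(cpu_samples), 1)
    print(f"{N_STREAMS} streams decoded in {elapsed:.1f} s, "
          f"mean CPU {mean_cpu:.1f}%, "
          f"memory in use {psutil.virtual_memory().percent:.1f}%")

    A fuller sketch would also parse ffmpeg's progress output to recover the per-stream frame rate, which the abstract lists among the measured quantities.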