
    OIL SPILL MODELING FOR IMPROVED RESPONSE TO ARCTIC MARITIME SPILLS: THE PATH FORWARD

    Maritime shipping and natural resource development in the Arctic are projected to increase as sea ice coverage decreases, resulting in a greater probability of more and larger oil spills. The increasing risk of Arctic spills emphasizes the need to identify the state-of-the-art oil trajectory and sea ice models and the potential for their integration. The Oil Spill Modeling for Improved Response to Arctic Maritime Spills: The Path Forward (AMSM) project, funded by the Arctic Domain Awareness Center (ADAC), provides a structured approach to gather expert advice to address U.S. Coast Guard (USCG) Federal On-Scene Coordinator (FOSC) core needs for decision-making. The National Oceanic & Atmospheric Administration (NOAA) Office of Response & Restoration (OR&R) provides scientific support to the USCG FOSC during oil spill response. As part of this scientific support, NOAA OR&R supplies decision support models that predict the fate (including chemical and physical weathering) and transport of spilled oil. Oil spill modeling in the Arctic faces many unique challenges, including the limited availability of environmental data (e.g., currents, wind, ice characteristics) at fine spatial and temporal resolution to feed models. Despite these challenges, OR&R's modeling products must provide adequate spill trajectory predictions so that response efforts minimize economic, cultural, and environmental impacts, including those to species, habitats, and food supplies. The AMSM project addressed the unique needs and challenges associated with Arctic spill response by: (1) identifying state-of-the-art oil spill and sea ice models, (2) recommending new components and algorithms for oil and ice interactions, (3) proposing methods for improving communication of model output uncertainty, and (4) developing methods for coordinating oil and ice modeling efforts.
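The trajectory models described here are typically Lagrangian: oil parcels are advected by currents plus a fraction of the wind, with ice concentration modifying the drift. A minimal sketch of one such advection step, where the wind factor and the ice-coverage threshold are illustrative assumptions rather than the actual OR&R algorithms:

```python
def drift_step(pos, current, wind, ice_frac, dt, wind_factor=0.03):
    """Advance one oil parcel by one time step (minimal sketch).

    pos, current, wind: (x, y) tuples in metres and m/s.
    ice_frac: fractional ice coverage, 0..1. The rule used here
    (above 0.8 coverage the oil moves with the ice, approximated
    by suppressing direct wind drift) is a simplified assumption,
    not the models' actual oil-ice interaction algorithm.
    """
    wf = 0.0 if ice_frac > 0.8 else wind_factor
    u = current[0] + wf * wind[0]
    v = current[1] + wf * wind[1]
    return (pos[0] + u * dt, pos[1] + v * dt)

# One hour of drift in open water vs. heavy ice, same forcing
open_water = drift_step((0.0, 0.0), (0.1, 0.0), (10.0, 0.0), 0.1, 3600)
heavy_ice = drift_step((0.0, 0.0), (0.1, 0.0), (10.0, 0.0), 0.9, 3600)
```

Under this assumption, the same forcing moves the parcel four times farther in open water than under heavy ice, which is the kind of qualitative behaviour the oil-ice coupling work targets.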

    A real-time distributed analysis automation for hurricane surface wind observations

    From 1993 until 1999, the Hurricane Research Division of the National Oceanic and Atmospheric Administration (NOAA) produced real-time analyses of surface wind observations to help determine a storm's wind intensity and extent. Limitations of the real-time analysis system included platform and filesystem dependency, a lack of data integrity, and unsuitability for Internet deployment. In 2000, a new system was developed, built upon a Java prototype of a quality control graphical client interface for wind observations and an object-relational database. The objective was to integrate them in a distributed object approach with the legacy code responsible for the actual real-time wind analysis and image product generation. Common Object Request Broker Architecture (CORBA) was evaluated, but Java Remote Method Invocation (RMI) offered important advantages in terms of reuse and deployment. Even more substantial, though, were the efforts towards object-oriented redesign, implementation, and testing of the quality control interface and its database interaction performance. As a result, a full-featured application can now be launched from the Web, potentially accessible by tropical cyclone forecast and warning centers worldwide.
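The quality control step at the heart of such a system can be illustrated with a simple gross-range check over incoming observations. The thresholds and record fields below are assumptions for illustration, not HRD's actual criteria:

```python
def qc_wind_obs(obs, max_speed=90.0):
    """Flag wind observations that fail simple gross-range checks.

    obs: list of dicts with 'speed' (m/s) and 'dir' (degrees).
    The thresholds are illustrative, not NOAA/HRD's real criteria;
    an operational QC pass would also check timestamps, platform
    bias, and spatial consistency.
    """
    flagged = []
    for o in obs:
        ok = 0.0 <= o["speed"] <= max_speed and 0.0 <= o["dir"] < 360.0
        flagged.append({**o, "qc_pass": ok})
    return flagged

obs = [
    {"speed": 35.0, "dir": 120.0},  # plausible
    {"speed": -5.0, "dir": 90.0},   # negative speed: reject
    {"speed": 40.0, "dir": 400.0},  # direction out of range: reject
]
result = qc_wind_obs(obs)
```

In the distributed design described above, a check like this would run in the graphical client, with accepted observations committed to the object-relational database before the analysis is triggered.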

    National freight transport planning: towards a Strategic Planning Extranet Decision Support System (SPEDSS)

    This thesis provides a `proof-of-concept' prototype and a design architecture for an Object-Oriented (OO) database towards the development of a Decision Support System (DSS) for the national freight transport planning problem. Both governments and industry require a Strategic Planning Extranet Decision Support System (SPEDSS) for the effective management of national Freight Transport Networks (FTN). This thesis addresses the three key problems for the development of a SPEDSS to facilitate national strategic freight planning: 1) the scope and scale of data available and required; 2) the scope and scale of existing models; and 3) the construction of the software. The research approach taken embodies systems thinking and includes the use of: Object-Oriented Analysis and Design (OOA/D) for problem encapsulation and database design; artificial neural networks (and proposed rule extraction) for knowledge acquisition from the United States FTN data set; and an iterative Object-Oriented (OO) software design for the development of a `proof-of-concept' prototype. The research findings demonstrate that an OO approach, along with the use of OO methodologies and technologies coupled with artificial neural networks (ANNs), offers a robust and flexible methodology for the analysis of the FTN problem domain and the design architecture of an Extranet-based SPEDSS.
The objectives of this research were to: 1) identify and analyse current problems and proposed solutions facing industry and governments in strategic transportation planning; 2) determine the functional requirements of an FTN SPEDSS; 3) perform a feasibility analysis for building an FTN SPEDSS `proof-of-concept' prototype and OO database design; 4) develop a methodology for a national `internet-enabled' SPEDSS model and database; 5) construct a `proof-of-concept' prototype for a SPEDSS encapsulating identified user requirements; 6) develop a methodology to resolve the issue of the scale of data and data knowledge acquisition, which would act as the `intelligence' within a SPEDSS; 7) implement the data methodology using Artificial Neural Networks (ANNs) towards its validation; and 8) make recommendations for national freight transportation strategic planning and for further research required to fulfil the needs of governments and industry. This thesis includes: an OO database design for encapsulation of the FTN; an `internet-enabled' Dynamic Modelling Methodology (DMM) for the virtual modelling of FTNs; a Unified Modelling Language (UML) `proof-of-concept' prototype; and conclusions and recommendations for further collaborative research.
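The OO encapsulation of a freight network that the thesis argues for can be sketched with a few classes. The class names, attributes, and capacity query below are hypothetical illustrations of the modelling style, not the thesis's actual database schema:

```python
class Node:
    """A terminal, port, or junction in the freight network."""
    def __init__(self, name):
        self.name = name


class Link:
    """A directed freight link; capacity_tons is an illustrative
    attribute, not a field from the thesis's design."""
    def __init__(self, src, dst, capacity_tons):
        self.src, self.dst, self.capacity_tons = src, dst, capacity_tons


class FreightNetwork:
    """Minimal OO encapsulation of a Freight Transport Network (FTN)."""
    def __init__(self):
        self.nodes, self.links = {}, []

    def add_node(self, name):
        self.nodes[name] = Node(name)

    def add_link(self, src, dst, capacity_tons):
        self.links.append(Link(self.nodes[src], self.nodes[dst],
                               capacity_tons))

    def total_capacity_from(self, name):
        """Sum outbound link capacity from a node (a sample query)."""
        return sum(l.capacity_tons for l in self.links
                   if l.src.name == name)


net = FreightNetwork()
for n in ("Chicago", "Memphis", "Atlanta"):
    net.add_node(n)
net.add_link("Chicago", "Memphis", 5000)
net.add_link("Chicago", "Atlanta", 3000)
```

The point of the OO design is that queries such as `total_capacity_from` stay attached to the network objects themselves, so the same model can back both the Extranet interface and the ANN-based knowledge acquisition.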

    Modeling operational forestry problems in central Appalachian hardwood forests

    Because of their species diversity, varied site conditions, and variable growth rates, central Appalachian hardwoods are challenging to manage. Examining harvesting techniques and the interactions among stand, harvest, and machines has become a concern for researchers in the region. A simulation system was developed to aid these efforts by estimating the productivity, cost, and traffic intensity of different harvesting configurations under a variety of harvesting prescriptions and stand conditions.

Stands used in the simulation were generated using a stand generator that was validated by statistically comparing the generated stands with actual mapped stands. Results indicated its validity and showed that it can be used to visualize the stand structure and composition of hardwood stands and to perform dynamic analyses of various management prescriptions.

Three harvesting systems, chainsaw (CS)/cable skidder (CD), feller-buncher (FB)/grapple skidder (GD), and harvester (HV)/forwarder (FW), were modeled and simulated on five generated stands of different ages. Five harvest methods were examined: clearcut, shelterwood cut, crop tree release cut, diameter limit cut, and selective cut. Simulation results showed that felling production and cost were primarily affected by the size of trees removed, removal intensity, distance traveled between harvested trees, and the felling machine. The feller-buncher was the most cost-effective and productive machine, and the harvester was more sensitive to individual tree size (DBH). Clearcutting always presented the highest productivity, while the shelterwood cut was the least productive method. The unit cost of the harvester was higher than that of the feller-buncher or chainsaw. Extraction operations were sensitive to payload size, average extraction distance, bunch size, extraction pattern, and extraction machine. The forwarder was the most productive machine under the simulated extraction prescriptions. The cable skidder resulted in a higher unit cost than the grapple skidder or forwarder.

System productivity increased from the chainsaw/cable skidder system to the harvester/forwarder system, and again to the feller-buncher/grapple skidder system. The feller-buncher/grapple skidder system could produce 28484 ft3 or 177 thousand board feet (MBF) per week at a unit cost of $27 per hundred cubic feet (cunit) or $44/MBF. For the chainsaw/cable skidder and harvester/forwarder systems, the weekly production rates were 12146 ft3 (76 MBF) and 16714 ft3 (104 MBF), with unit costs of $35/cunit ($57/MBF) and $44/cunit ($70/MBF), respectively.

Traffic intensity levels TI3 and TI4 are the major concerns since they caused the most soil compaction. The harvester/forwarder system was associated with more unaffected area, while the feller-buncher/grapple skidder system affected more area. The TI3 and TI4 share was 20% of the total affected area with the harvester/forwarder system, 23% with the chainsaw/cable skidder system, and 44% with the feller-buncher/grapple skidder system. A total of 49% of the extraction site was recorded at the TI3 and TI4 level for SP1, more than two times higher than that recorded for SP5.
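The reported weekly volumes and unit costs are linked by simple conversions (a cunit is 100 cubic feet). A minimal sketch of that arithmetic, with the weekly system cost inferred from the reported rates (an assumption, since the abstract does not state it directly):

```python
def unit_costs(weekly_volume_ft3, weekly_mbf, weekly_cost_dollars):
    """Convert a weekly machine-system cost into $/cunit and $/MBF.

    A 'cunit' is 100 cubic feet. The weekly cost passed in below is
    inferred from the abstract's reported $/MBF rate, purely for
    illustration of the conversion.
    """
    cost_per_cunit = weekly_cost_dollars / (weekly_volume_ft3 / 100.0)
    cost_per_mbf = weekly_cost_dollars / weekly_mbf
    return round(cost_per_cunit), round(cost_per_mbf)

# Feller-buncher/grapple skidder: 28484 ft3 and 177 MBF per week;
# an implied weekly cost of 177 MBF * $44/MBF = $7788 reproduces
# both reported unit costs.
fb_gd = unit_costs(28484, 177, 7788)  # → (27, 44)
```

The same conversion applied to the other two systems' volumes and costs recovers their reported $/cunit and $/MBF pairs to within rounding.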

    Software framework for geophysical data processing, visualization and code development

    IGeoS is an integrated open-source software framework for geophysical data processing under development by the UofS seismology group. Unlike other systems, this processing monitor supports structured multicomponent seismic data streams and multidimensional data traces, and employs a unique backpropagation execution logic. This results in an unusual flexibility of processing, allowing the system to handle nearly any geophysical data. In this project, a modern and feature-rich Graphical User Interface (GUI) was developed for the system, allowing editing and submission of processing flows and interaction with running jobs. Multiple jobs can be executed on distributed multi-processor networks and controlled from the same GUI. Jobs, in turn, can also be parallelized to take advantage of parallel processing environments such as local area networks and Beowulf clusters. A 3D/2D interactive display server was created and integrated with the IGeoS geophysical data processing framework. With the introduction of this major component, the IGeoS system becomes conceptually complete and potentially bridges the gap between traditional processing and interpretation software. Finally, in a specialized application, network acquisition and relay components were written, allowing IGeoS to be used for real-time applications. The completion of this functionality makes the processing and display capabilities of IGeoS available to multiple streams of seismic data from potentially remote sites. Seismic data can be acquired, transferred to the central server, processed, archived, and events picked and placed in a database completely automatically.
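The "backpropagation execution logic" mentioned above is a demand-driven design: the last module in a flow requests data, and the request propagates backwards to the source. A generic sketch of that idea (class names and API are illustrative, not IGeoS's actual interfaces):

```python
class Module:
    """One module in a pull-driven processing flow: calling pull()
    on the last module recursively pulls traces from upstream,
    so execution is driven from the end of the flow backwards.
    This is a generic sketch of the idea, not the IGeoS API."""

    def __init__(self, func, upstream=None):
        self.func, self.upstream = func, upstream

    def pull(self):
        data = self.upstream.pull() if self.upstream else []
        return self.func(data)


# A three-module flow: read traces, apply gain, rectify.
source = Module(lambda _: [1.0, -2.0, 3.0])            # trace input
scale = Module(lambda d: [2 * x for x in d], source)   # gain
absval = Module(lambda d: [abs(x) for x in d], scale)  # rectify

result = absval.pull()  # the request propagates back to the source
```

Because only requested data is computed, a display module at the end of such a flow can drive exactly the processing it needs, which is what makes the scheme flexible for interactive use.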

    Efficient integration of software components for scientific simulations

    Abstract unavailable; please refer to the PDF.

    Designing a web-based adaptive learning environment: DISTILS as an example

    In this study, two components are developed for Web-based adaptive learning: an online Intelligent Tutoring Tool (ITT) and an Adaptive Lecture Guidance (ALG). The ITT provides students timely problem-solving help in a dynamic Web environment. The ALG prevents students from becoming disoriented when a new domain is presented using Web technology. A prototype, the Distributed Intelligent Learning System (DISTILS), has been implemented in a general chemistry laboratory domain. In DISTILS, students interact with the ITT through a Web browser. When a student selects a problem, the problem is formatted and displayed in the user interface for the student to solve. On the other side, the ITT begins to solve the problem simultaneously. The student can then request help from the ITT through the interface. The ITT interacts with the student, verifying solution activities in ascending order of the student's knowledge status. In DISTILS, a Web page is associated with an HTML Learning Model (HLM) to describe its knowledge content. The ALG extracts the HLM, collects the status of the student's knowledge in the HLM, and presents a knowledge map illustrating where the student is, how much proficiency he/she already has, and where he/she is encouraged to explore. In this way, the ALG helps students navigate the Web-based course material, protecting them from becoming disoriented and giving them guidance when needed. Both the ITT and ALG components are developed under a generic Common Object Request Broker Architecture (CORBA)-driven framework. Under this framework, knowledge objects model domain expertise, a student modeler assesses the student's knowledge progress, an instruction engine includes the two tutoring components (the ITT and the ALG), and CORBA-compatible middleware serves as the communication infrastructure. The advantage of such a framework is that it promotes the development of modular and reusable intelligent educational objects.
In DISTILS, a collection of knowledge objects was developed under CORBA to model general chemistry laboratory domain expertise. It was shown that these objects can be easily assembled in a plug-and-play manner to produce several exercises for different laboratory experiments. Given the platform independence of CORBA, tutoring objects developed under such a framework have the potential to be easily reused in different applications. Preliminary results showed that DISTILS effectively enhanced learning in a Web environment. Three high school students and twenty-two NJIT students participated in the evaluation of DISTILS. In the final quiz of seven questions, the average number of correct answers for students who studied in a Web environment with DISTILS (DISTILS group) was 5.3, while the average for those who studied in the same Web environment without DISTILS (NoDISTILS group) was 2.75. A t-test conducted on this small sample showed that the DISTILS group students scored significantly better than the NoDISTILS group students.
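The significance claim rests on a two-sample t-test. The raw quiz scores are not given in the abstract, so the samples below are hypothetical (chosen only so the group means sit near the reported 5.3 and 2.75); the sketch shows the Welch t statistic such a comparison would compute:

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t statistic (unequal variances).

    Illustrative only: the abstract's raw quiz scores are not
    given, so callers below use hypothetical samples.
    """
    se = math.sqrt(variance(a) / len(a) + variance(b) / len(b))
    return (mean(a) - mean(b)) / se

# Hypothetical per-student scores (out of 7) with means near the
# reported group averages; not the study's actual data.
distils = [5, 6, 5, 6, 5, 5, 6, 4, 6, 5]     # mean 5.3
nodistils = [3, 2, 3, 3, 2, 3, 3, 2, 3, 3]   # mean 2.7

t = welch_t(distils, nodistils)
```

With a gap of roughly 2.5 points on a 7-question quiz and modest within-group spread, the statistic comes out far above the usual critical values, which is consistent with the significance the study reports.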

    Satellite Communications

    This study is motivated by the need to give the reader a broad view of the developments, key concepts, and technologies related to the evolution of the information society, with a focus on wireless communications and geoinformation technologies and their role in the environment. It aims to assist people active in industry, the public sector, and the Earth sciences by providing a base for their continued work and thinking.