
    Sensor to User - NASA/EOS Data for Coastal Zone Management Applications Developed from Integrated Analyses: Verification, Validation and Benchmark Report

    The NASA Applied Sciences Program seeks to transfer NASA data, models, and knowledge into the hands of end-users by forming links with partner agencies and associated decision support tools (DSTs). Through the NASA REASoN (Research, Education and Applications Solutions Network) Cooperative Agreement, the Oceanography Division of the Naval Research Laboratory (NRLSSC) is developing new products through the integration of data from NASA Earth-Sun System assets with coastal ocean forecast models and other available data to enhance coastal management in the Gulf of Mexico. The recipient federal agency for this research effort is the National Oceanic and Atmospheric Administration (NOAA). This report details the effort to further the goals of the NASA Applied Sciences Program by demonstrating the use of NASA satellite products combined with data-assimilating ocean models to provide near real-time information to maritime users and coastal managers of the Gulf of Mexico. This effort provides new and improved capabilities for monitoring, assessing, and predicting the coastal environment. Coastal managers can exploit these capabilities through enhanced DSTs at federal, state, and local agencies. The project addresses three major issues facing coastal managers: 1) Harmful Algal Blooms (HABs); 2) hypoxia; and 3) freshwater fluxes to the coastal ocean. A suite of ocean products capable of describing Ocean Weather is assembled on a daily basis as the foundation for this semi-operational, multiyear effort. This continuous real-time capability gives decision makers a new ability to monitor both normal and anomalous coastal ocean conditions through a steady flow of satellite observations and ocean model output. Furthermore, as the baseline data sets are used more extensively and the customer list grows, customer feedback is gathered and additional customized products are developed and provided to decision makers. This cycle of customer feedback and response with new, improved products must be sustained between researcher and customer. This document details the methods by which these coastal ocean products are produced, including the data flow, distribution, and verification. Product applications, and the degree to which these products are used successfully within NOAA and coordinated with the Mississippi Department of Marine Resources (MDMR), are benchmarked.

    Integrating biogeochemistry and ecology into ocean data assimilation systems

    Monitoring and predicting the biogeochemical state of the ocean and marine ecosystems is an important application of operational oceanography that needs to be expanded. The accurate depiction of the ocean's physical environment enabled by Global Ocean Data Assimilation Experiment (GODAE) systems, in both real-time and reanalysis modes, is already valuable for various applications, such as the fishing industry and fisheries management. However, most of these applications require accurate estimates of both physical and biogeochemical ocean conditions over a wide range of spatial and temporal scales. In this paper, we discuss recent developments that enable coupling new biogeochemical models and assimilation components with the existing GODAE systems, and we examine the potential of such systems in several areas of interest: phytoplankton biomass monitoring in the open ocean, ocean carbon cycle monitoring and assessment, marine ecosystem management at seasonal and longer time scales, and downscaling in coastal areas. A number of key requirements and research priorities are then identified for the future. GODAE systems will need to improve their representation of physical variables that are not yet considered essential, such as upper-ocean vertical fluxes that are critically important to biological activity. Further, the observing systems will need to be expanded in terms of in situ platforms (with intensified deployments of sensors for O-2 and chlorophyll, and inclusion of new sensors for nutrients, zooplankton, micronekton biomass, and others), satellite missions (e.g., hyperspectral instruments for ocean color, lidar systems for mixed-layer depths, and wide-swath altimeters for coastal sea level), and improved methods to assimilate these new measurements.

    Verification and Validation of NASA-Supported Enhancements to the Near Real Time Harmful Algal Blooms Observing System (HABSOS)

    This report discusses the verification and validation (V&V) assessment of Moderate Resolution Imaging Spectroradiometer (MODIS) ocean data products contributed by the Naval Research Laboratory (NRL) and Applied Coherent Technologies (ACT) Corporation to the National Oceanic and Atmospheric Administration's (NOAA) Near Real Time (NRT) Harmful Algal Blooms Observing System (HABSOS). HABSOS is a maturing decision support tool (DST) used by NOAA and its partners involved with coastal and public health management.

    Marine climatological datasets for the Maltese Islands

    During the last 25 years of activity, the Physical Oceanography Research Group, previously known as the PO-Unit and currently established within the Department of Geosciences of the University of Malta, has been promoting the downscaling of broad-scope marine core services to higher-resolution local-scale domains for the Maltese Islands. Several services are delivered either by intrinsic data elaboration or by integrating COPERNICUS Marine Environment Monitoring Service (CMEMS) data with local marine data streams. Local observations are also integrated with higher-resolution forecasts for the preparation and provision of dedicated services that address the specific needs of sub-regional and coastal users. This effort has yielded valuable climatological datasets covering the Maltese coastal waters and spanning several years. This work focuses on the climatologies derived from numerical models and satellites and compiled within the Interreg MED programme AMAre (Actions for Marine Protected Areas) project.

    Data Analytics for Automated Near Real Time Detection of Blockages in Smart Wastewater Systems

    Blockage events account for a substantial portion of the reported failures in the wastewater network, causing flooding, loss of service, environmental pollution and significant clean-up costs. Increasing telemetry in Combined Sewer Overflows (CSOs) provides the opportunity for near real-time data-driven modelling of the sewer network. The research work presented in this thesis describes the development and testing of a novel system designed for the automatic detection of blockages and other unusual events in near real-time. The methodology utilises an Evolutionary Artificial Neural Network (EANN) model for short-term CSO level predictions and Statistical Process Control (SPC) techniques to analyse unusual CSO level behaviour. The system is designed to mimic the work of a trained, experienced human technician in determining whether a blockage event has occurred. The detection system has been applied to real blockage events from a UK wastewater network. The results obtained illustrate that the methodology can identify different types of blockage events in a reliable and timely manner, and with a low number of false alarms. In addition, a model has been developed for the prediction of water levels in a CSO chamber and the generation of alerts for upcoming spill events. The model consists of a committee evolutionary artificial neural network (CEANN) composed of two EANN models optimised for wet and dry weather, respectively. The models are combined using a non-linear weighted averaging approach to overcome bias arising from imbalanced data. Both methodologies are designed to be generic and self-learning, so they can be applied to any CSO location without requiring input from a human operator. It is envisioned that the technology will allow utilities to respond proactively to developing blockage events, thus reducing potential harm to the sewer network and the surrounding environment.
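
The detection scheme described in this abstract (a level predictor whose residuals are screened with SPC control limits) can be sketched roughly as follows. This is a minimal illustration with invented data, a generic 3-sigma Shewhart-style rule, and an arbitrary run-length criterion; it stands in for, and does not reproduce, the thesis's EANN/SPC implementation.

```python
# Sketch of SPC-style anomaly flagging on prediction residuals.
# A predictor supplies the expected CSO level; 3-sigma control limits
# fitted on healthy residuals flag sustained deviations as a possible
# blockage. All names, data, and thresholds here are illustrative.

def control_limits(residuals, k=3.0):
    """Mean +/- k*sigma limits from a calibration window of residuals."""
    n = len(residuals)
    mean = sum(residuals) / n
    var = sum((r - mean) ** 2 for r in residuals) / n
    return mean - k * var ** 0.5, mean + k * var ** 0.5

def flag_blockage(observed, predicted, lo, hi, run_length=3):
    """Alert when `run_length` consecutive residuals leave the control
    limits (a run requirement reduces false alarms from single spikes)."""
    run = 0
    for obs, pred in zip(observed, predicted):
        r = obs - pred
        run = run + 1 if not (lo <= r <= hi) else 0
        if run >= run_length:
            return True
    return False

# Calibration: residuals from a period of normal operation
calib = [0.02, -0.01, 0.03, 0.00, -0.02, 0.01, -0.03, 0.02]
lo, hi = control_limits(calib)

# A developing blockage shows as levels persistently above prediction
observed  = [1.10, 1.12, 1.30, 1.35, 1.40]
predicted = [1.10, 1.11, 1.12, 1.12, 1.13]
print(flag_blockage(observed, predicted, lo, hi))  # True for this series
```

In practice the predictor would be the trained (E)ANN and the limits would be recalibrated as the model self-learns; the run-length rule mirrors the abstract's emphasis on a low false-alarm rate.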

    A dependability framework for WSN-based aquatic monitoring systems

    Wireless Sensor Networks (WSNs) are progressively being used in several application areas, particularly to collect data and monitor physical processes. Sensor nodes used in environmental monitoring applications, such as aquatic sensor networks, are often subject to harsh environmental conditions while monitoring complex phenomena. Non-functional requirements, like reliability, security or availability, are increasingly important and must be accounted for in application development. For that purpose, there is a large body of knowledge on dependability techniques for distributed systems, which provides a good basis for understanding how to satisfy these non-functional requirements of WSN-based monitoring applications. Given the data-centric nature of monitoring applications, it is particularly important to ensure that data is reliable or, more generically, that it has the necessary quality. The problem of ensuring the desired quality of data for dependable monitoring using WSNs is studied herein. From a dependability-oriented perspective, the possible impairments to dependability are reviewed, together with the prominent existing solutions to remove or mitigate them. Despite the variety of components that may form a WSN-based monitoring system, particular attention is given to understanding which faults can affect sensors, how they can affect the quality of the information, and how this quality can be improved and quantified. Open research issues for the specific case of aquatic monitoring applications are also discussed. One of the challenges in achieving dependable system behavior is to overcome the external disturbances affecting sensor measurements and to detect failure patterns in sensor data. This is a particular problem in environmental monitoring, due to the difficulty of distinguishing faulty behavior from the representation of a natural phenomenon.
Existing solutions for failure detection assume that physical processes can be accurately modeled, or that deviations are large enough to be detected with coarse techniques, or, most commonly, that the sensor network is dense enough to provide value-redundant sensors. This thesis defines a new methodology for dependable data quality in environmental monitoring systems, aiming to detect faulty measurements and increase the quality of sensor data. The framework of the methodology is presented through a generically applicable design that can be employed with any environmental sensor network dataset. The methodology is evaluated on several datasets from different WSNs, using machine learning to model each sensor's behavior and exploiting the correlated data provided by neighboring sensors. The aim is to explore data fusion strategies that effectively detect potential failures in each sensor and, simultaneously, distinguish truly abnormal measurements from deviations due to natural phenomena. This is accomplished through the successful application of the methodology to detect and correct outlier, offset, and drift failures in datasets from real monitoring networks. In the future, the methodology can be applied to optimize the data quality control processes of new and already operating monitoring networks, and to assist in network maintenance operations.
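
The neighbor-correlation idea in this abstract (model each sensor from correlated neighbors, then flag measurements whose residual is too large) can be sketched as below. A plain least-squares fit stands in for the thesis's machine-learning models, and all data and thresholds are invented for illustration.

```python
# Sketch of neighbor-correlation fault detection: fit one sensor's
# reading as a linear function of a correlated neighbor on healthy
# data, then flag new measurements that deviate from the prediction.
# The linear model, data, and threshold are illustrative assumptions.

def fit_linear(x, y):
    """Ordinary least squares for y ~ a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    a = sxy / sxx
    return a, my - a * mx

def flag_faults(neighbor, target, a, b, threshold):
    """Indices where the target deviates from its neighbor-based prediction."""
    return [i for i, (xi, yi) in enumerate(zip(neighbor, target))
            if abs(yi - (a * xi + b)) > threshold]

# Calibration on healthy data: two temperature probes that track each other
neighbor = [14.0, 14.5, 15.0, 15.5, 16.0]
target   = [14.2, 14.7, 15.2, 15.7, 16.2]
a, b = fit_linear(neighbor, target)

# New data with an offset failure at index 2
new_n = [16.5, 17.0, 17.5, 18.0]
new_t = [16.7, 17.2, 20.9, 18.2]
print(flag_faults(new_n, new_t, a, b, threshold=0.5))  # [2]
```

Because the reference comes from a neighboring sensor rather than a physical model, a real environmental event (which both sensors see) leaves small residuals, while a sensor fault (which only one sees) produces the large residuals flagged here; this is the distinction the methodology relies on.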

    Development, Operation, and Results From the Texas Automated Buoy System

    The Texas Automated Buoy System (TABS) is a coastal network of moored buoys that report near-real-time observations of currents and winds along the Texas coast. Established in 1995, the primary mission of TABS is ocean observation in the service of oil spill preparedness and response. The state of Texas funded the system with the intent of improving the data available to oil spill trajectory modelers. In its 12 years of operation, TABS has proven its usefulness during realistic oil spill drills and actual spills. The original capabilities of TABS, i.e., measurement of surface currents and temperatures, have been extended to the marine surface layer, the entire water column, and the sea floor. In addition to observations, a modeling component has been integrated into the TABS program. The goal is to form the core of a complete ocean observing system for Texas waters. As the nation embarks on the development of an integrated ocean observing system, TABS will continue to be an active participant in the Gulf of Mexico Coastal Ocean Observing System (GCOOS) regional association and the primary source of near-surface current measurements in the northwestern Gulf of Mexico. This article describes the origin of TABS, the philosophy behind the operation and development of the system, the resulting modifications to improve the system, the expansion of the system to include new sensors, the development of TABS forecasting models and real-time analysis tools, and how TABS has met many of the societal goals envisioned for GCOOS.

    Ensuring Reliable Measurements In Remote Aquatic Sensor Networks

    A flood monitoring system comprises an extensive network of water sensors, a bundle of forecast simulation models, and a decision-support information system. A cascade of uncertainties present in each part of the system affects reliable flood alerting and response. The timeliness and quality of the gathered data, used subsequently in forecasting models, is affected by the pervasive nature of the monitoring network, where aquatic sensors are vulnerable to external disturbances that degrade the accuracy of data acquisition. Existing solutions for aquatic monitoring are composed of heterogeneous sensors that are usually unable to ensure reliable measurements in complex scenarios, due to technology-specific effects such as transient loss of availability, errors, and limits of coverage. In this paper, we introduce a more general study of all aspects of the criticality of sensor networks in the aquatic monitoring process, and we motivate the need for reliable data collection in harsh coastal and marine environments. We present an overview of the main challenges, such as sensor power life, sensor hardware compatibility, reliability, and long-range communication; these issues need to be addressed to improve the robustness of sensor measurements. Developing solutions that automatically adjust sensor measurements to each disturbance would substantially increase the quality of the measurements, thus supplying other parts of a flood monitoring system with dependable monitoring data. With the purpose of providing software solutions to hardware failures, we also introduce context-awareness techniques, such as data processing, filtering, and sensor fusion methods, that were applied to a real working monitoring network with several proprietary probes (measuring conductivity, temperature, depth, and various water quality parameters) at distant sites in Portugal. The goal is to assess the best technique for overcoming each detected faulty measurement without compromising the time frame of the monitoring process.
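
One of the filtering-and-fusion combinations mentioned in the abstract can be sketched as follows: a plausibility filter discards physically impossible readings, and a median across redundant probes masks a single faulty sensor. The parameter values and units are illustrative assumptions, not taken from the paper.

```python
# Sketch of a fault-masking fusion step: a range (plausibility) filter
# drops physically impossible readings, then the median of the surviving
# redundant probes masks a single faulty sensor. Bounds are illustrative.

def plausible(readings, lo, hi):
    """Keep only readings within the physically plausible range."""
    return [r for r in readings if lo <= r <= hi]

def fused_value(readings, lo, hi):
    """Median of plausible readings; None when no reading survives."""
    ok = sorted(plausible(readings, lo, hi))
    if not ok:
        return None
    n, mid = len(ok), len(ok) // 2
    return ok[mid] if n % 2 else (ok[mid - 1] + ok[mid]) / 2

# Three redundant conductivity probes; one is stuck at a fault value
samples = [42.1, 999.0, 43.0]  # hypothetical mS/cm readings
print(fused_value(samples, lo=0.0, hi=80.0))
```

The median is a natural choice here because it tolerates one arbitrary failure among three probes without any model of the fault, which matches the goal of overcoming a faulty measurement without delaying the monitoring process.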