
    A sensor web architecture for integrating smart oceanographic sensors into the semantic sensor web

    Effective ocean and coastal data management is needed to manage marine ecosystem health. Past ocean and coastal data management systems were often very specific to a particular application and region, but this focused approach often lacks real-time data and sharing/interoperating capability. The challenge for the ocean observing community is to devise standards and practices that enable integration of data from sensors across devices, manufacturers, users, and domains, enabling new types of applications and services that facilitate much more comprehensive understanding and analyses of marine ecosystems. A given kind of sensor may be deployed on various platforms such as floats, gliders or moorings, and thus must be integrated with different operation and data management systems. Simplifying the integration process in existing or newly established observing systems would benefit system operators and is important for the broader application of diverse sensors. This paper describes a geospatial “sensor web” architecture developed by the “NeXOS” project for ocean and coastal data management, based on the concepts of spatial data infrastructure and the Sensor Web Enablement framework of the Open Geospatial Consortium. This approach reduces the effort required to propagate data from deployed sensors to users. To support the realization of the proposed Next generation Ocean Sensors (NeXOS) architecture, hardware and software specifications for a Smart Electronic Interface for Sensors and Instruments (SEISI) are described. SEISI specifies small low-power electronics, a minimal operating system, and standards-based software to enable web-based sharing, discovery, exchange, and processing of sensor observations, as well as operation of sensor devices. An experimental scenario is presented in which sensor data from a low-power glider with low-bandwidth, intermittent satellite communications are integrated into the geospatial sensor web using the NeXOS architecture.
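    As a rough illustration of the kind of data flow the abstract describes, the sketch below pushes a single observation from a SEISI-style smart sensor to an OGC Sensor Observation Service endpoint. The endpoint URL, the procedure and property identifiers, and the payload are illustrative assumptions; the XML is simplified and not a schema-complete SOS 2.0 InsertObservation document, nor the NeXOS implementation itself.

```python
# Sketch: pushing one observation from a SEISI-style smart sensor to an
# OGC Sensor Observation Service (SOS) endpoint. The URL, offering and
# identifiers below are hypothetical; the XML is simplified and not a
# schema-complete SOS 2.0 InsertObservation request.
import urllib.request

SOS_ENDPOINT = "http://example.org/sos/service"  # assumed endpoint

def insert_observation(procedure: str, observed_property: str,
                       value: float, uom: str, phenomenon_time: str) -> bytes:
    body = f"""<?xml version="1.0" encoding="UTF-8"?>
<sos:InsertObservation service="SOS" version="2.0.0"
    xmlns:sos="http://www.opengis.net/sos/2.0"
    xmlns:om="http://www.opengis.net/om/2.0"
    xmlns:xlink="http://www.w3.org/1999/xlink">
  <sos:offering>{procedure}/observations</sos:offering>
  <sos:observation>
    <om:OM_Observation>
      <om:phenomenonTime>{phenomenon_time}</om:phenomenonTime>
      <om:procedure xlink:href="{procedure}"/>
      <om:observedProperty xlink:href="{observed_property}"/>
      <om:result uom="{uom}">{value}</om:result>
    </om:OM_Observation>
  </sos:observation>
</sos:InsertObservation>"""
    req = urllib.request.Request(
        SOS_ENDPOINT, data=body.encode("utf-8"),
        headers={"Content-Type": "application/xml"})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

# e.g. a glider CTD temperature reading relayed after a satellite uplink window
insert_observation("http://example.org/procedure/glider-ctd-1",
                   "http://vocab.example.org/sea_water_temperature",
                   12.7, "Cel", "2015-06-01T12:00:00Z")
```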

    Applying OGC sensor web enablement to ocean observing systems

    The complexity of marine installations for ocean observing systems has grown significantly in recent years. In a network consisting of tens, hundreds or thousands of marine instruments, manual configuration and integration becomes very challenging. Simplifying the integration process in existing or newly established observing systems would benefit system operators and is important for the broader application of different sensors. This article presents an approach for the automatic configuration and integration of sensors into an interoperable Sensor Web infrastructure. First, a sensor communication model based on OGC's SensorML standard is utilized. It serves as a generic driver mechanism, since it enables the declarative and detailed description of a sensor's protocol. We then present a data acquisition architecture based on the OGC PUCK protocol that enables storage and retrieval of the SensorML document on the sensor itself, and automatic integration of sensors into an interoperable Sensor Web infrastructure. Our approach adopts Efficient XML Interchange (EXI) as an alternative serialization format to plain XML or JSON, mitigating the bandwidth problem those verbose encodings pose on constrained links.
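    To make the "declarative description as generic driver" idea concrete, the toy sketch below uses a plain dictionary standing in for a SensorML document (which the real approach would retrieve from the instrument via OGC PUCK) to decode raw output sentences. The field names, delimiter and sample sentence are invented for this sketch.

```python
# Toy illustration of the declarative-driver idea: a protocol description
# (a dict standing in for a SensorML document fetched via PUCK) tells one
# generic driver how to split and type a sensor's output sentence.
# Field names, delimiter and the sample sentence are invented.

PROTOCOL_DESCRIPTION = {
    "delimiter": ",",
    "fields": [
        {"name": "temperature",  "type": float, "uom": "Cel"},
        {"name": "conductivity", "type": float, "uom": "S/m"},
        {"name": "pressure",     "type": float, "uom": "dbar"},
    ],
}

def decode(raw_line: str, description: dict) -> dict:
    """Generic driver: decodes any delimited sentence the description covers."""
    tokens = raw_line.strip().split(description["delimiter"])
    return {
        field["name"]: {"value": field["type"](token), "uom": field["uom"]}
        for field, token in zip(description["fields"], tokens)
    }

# One driver, many instruments: swapping in a new description integrates a
# new sensor without writing instrument-specific parsing code.
print(decode("12.70,4.21,103.5", PROTOCOL_DESCRIPTION))
```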

    Oceans of Tomorrow sensor interoperability for in-situ ocean monitoring

    The Oceans of Tomorrow (OoT) projects, funded by the European Commission’s FP7 program, are developing a new generation of sensors supporting physical, biogeochemical and biological oceanographic monitoring. The sensors range from acoustic sensors to optical fluorometers to labs on a chip. As a result, their outputs are diverse, in a variety of formats and communication methodologies, and the interfaces with platforms such as floats, gliders and cabled observatories each differ. Thus, sensor …
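    The format diversity this (truncated) abstract describes can be pictured with a small adapter sketch: two invented instrument output formats, a CSV line and an NMEA-like sentence, are normalised into one common observation record. Formats, field layouts and units are illustrative assumptions only.

```python
# Sketch of the interoperability problem: two invented instrument output
# formats normalised into one common observation record.

def parse_csv_fluorometer(line: str) -> dict:
    # e.g. "2015-06-01T12:00:00Z,3.42" -> chlorophyll in ug/L (assumed layout)
    timestamp, value = line.strip().split(",")
    return {"time": timestamp, "property": "chlorophyll",
            "value": float(value), "uom": "ug/L"}

def parse_nmea_like(sentence: str) -> dict:
    # e.g. "$XXTMP,20150601120000,12.70*4A" -> temperature in deg C (assumed layout)
    payload = sentence.strip().lstrip("$").split("*")[0]
    _, stamp, value = payload.split(",")
    iso = (f"{stamp[0:4]}-{stamp[4:6]}-{stamp[6:8]}"
           f"T{stamp[8:10]}:{stamp[10:12]}:{stamp[12:14]}Z")
    return {"time": iso, "property": "sea_water_temperature",
            "value": float(value), "uom": "Cel"}

for record in (parse_csv_fluorometer("2015-06-01T12:00:00Z,3.42"),
               parse_nmea_like("$XXTMP,20150601120000,12.70*4A")):
    print(record)
```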

    Views from the coalface: chemo-sensors, sensor networks and the semantic sensor web

    Currently millions of sensors are being deployed in sensor networks across the world. These networks generate vast quantities of heterogeneous data across various levels of spatial and temporal granularity. Sensors range from single-point in situ sensors to remote satellite sensors that can cover the globe. The semantic sensor web should, in principle, allow for the unification of the web with the real world. In this position paper, we discuss the major challenges to this unification from the perspective of sensor developers (especially of chemo-sensors) and of integrating sensor data in real-world deployments. These challenges include: (1) identifying the quality of the data; (2) heterogeneity of data sources and data transport methods; (3) integrating data streams from different sources and modalities (especially contextual information); and (4) pushing intelligence to the sensor level.
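    Challenge (1), attaching data quality to sensor output on the semantic sensor web, can be sketched with the W3C SOSA/SSN vocabulary and rdflib. The sensor and observation URIs and the qualityFlag property are invented for this illustration; a real deployment would use an agreed quality vocabulary.

```python
# Sketch: describing a chemo-sensor observation with the W3C SOSA vocabulary
# (via rdflib) and attaching a simple quality flag. The URIs under
# example.org and the "qualityFlag" property are invented.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, XSD

SOSA = Namespace("http://www.w3.org/ns/sosa/")
EX = Namespace("http://example.org/")  # hypothetical deployment namespace

g = Graph()
g.bind("sosa", SOSA)

obs = EX["obs/1"]
g.add((obs, RDF.type, SOSA.Observation))
g.add((obs, SOSA.madeBySensor, EX["sensor/phosphate-42"]))
g.add((obs, SOSA.observedProperty, EX["property/phosphate_concentration"]))
g.add((obs, SOSA.hasSimpleResult, Literal(0.31, datatype=XSD.double)))
g.add((obs, SOSA.resultTime,
       Literal("2015-06-01T12:00:00Z", datatype=XSD.dateTime)))
# Invented quality annotation: challenge (1) expressed as one extra triple.
g.add((obs, EX.qualityFlag, Literal("probably_good")))

print(g.serialize(format="turtle"))
```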

    Report of the SNOMS Project 2006 to 2012, SNOMS SWIRE NOCS Ocean Monitoring System. Part 1: Narrative description

    The ocean plays a major role in controlling the concentration of carbon dioxide (CO2) in the atmosphere. Increasing concentrations of CO2 in the atmosphere are a threat to the stability of the Earth’s climate. A better understanding of the controlling role of the ocean will improve predictions of likely future changes in climate, and of the impact on marine ecosystems of the uptake of CO2 itself through the associated acidification of ocean waters. The SNOMS Project (SWIRE NOCS Ocean Monitoring System) is a ground-breaking joint research project supported by the Swire Group Trust, the Swire Educational Trust, the China Navigation Company (CNCo) and the Natural Environment Research Council. It collects high-quality data on concentrations of CO2 in the surface layer of the ocean, contributing to the international effort to better quantify, and understand the processes driving, the exchanges of CO2 between the ocean and the atmosphere. In 2006 and 2007 a system that could be used on a commercial ship to provide data over periods of several months, with only limited maintenance by the ship’s crew, was designed and assembled by NOCS. The system was fitted to the CNCo ship the MV Pacific Celebes in May 2007. The onboard system was supported by web pages that monitored the progress of the ship and the functioning of the data collection system. To support the flow of data from the ship through to archiving at the Carbon Dioxide Information Analysis Center (CDIAC, in the USA), data processing procedures were developed for the quality control and systematic handling of the data. Data from samples of seawater collected by the ship’s crew and analysed at NOCS (730 samples) have been used to confirm the consistency of the data from the automated measurement system on the ship. To examine the data collected between 2007 and 2012, the movements of the ship are divided into 16 voyages. Initially the Celebes traded on a route circumnavigating the globe via the Panama and Suez Canals; in 2009 the route shifted to one between Australia and New Zealand and the USA and Canada. Analysis of the data is an ongoing process. It has demonstrated that the system produces reliable data, capable of improving existing estimates of seasonal variability. The work has improved knowledge of gas exchange processes. Data from the crew-collected samples are helping improve our ability to estimate alkalinity in different areas, which helps with the study of ocean acidification. Data from the 9 round trips in the Pacific are currently being examined along with data made available by the NOAA-PMEL laboratory, forming a time series from 2004 to 2012. The data from the Pacific route are of considerable interest, in part because they track variations in the fluxes of CO2 associated with the current that flows westwards along the equator, one of the major natural sources of CO2 from the ocean to the atmosphere.
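    The quality-control step the report mentions can be pictured as a simple range-and-spike check over underway surface-ocean pCO2 readings. The acceptance range and spike threshold below are invented examples, not the actual SNOMS criteria.

```python
# Illustrative quality-control pass over underway surface-ocean pCO2 data.
# The acceptance range and spike threshold are assumed for this sketch,
# not the SNOMS project's actual criteria.
PCO2_RANGE = (150.0, 750.0)   # plausible surface-ocean bounds, uatm (assumed)
SPIKE_LIMIT = 50.0            # max jump between consecutive readings (assumed)

def qc_flags(series):
    flags = []
    for i, value in enumerate(series):
        if not PCO2_RANGE[0] <= value <= PCO2_RANGE[1]:
            flags.append("bad_range")
        elif i > 0 and abs(value - series[i - 1]) > SPIKE_LIMIT:
            flags.append("suspect_spike")
        else:
            flags.append("good")
    return flags

print(qc_flags([380.2, 381.0, 512.9, 382.1, 90.0]))
# -> ['good', 'good', 'suspect_spike', 'suspect_spike', 'bad_range']
```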

    Science Briefing Paper and Event 3


    Workshop Sensing a Changing World: proceedings, workshop November 19-21, 2008


    A Two-Level Information Modelling Translation Methodology and Framework to Achieve Semantic Interoperability in Constrained GeoObservational Sensor Systems

    Get PDF
    As geographical observational data capture, storage and sharing technologies such as in situ remote monitoring systems and spatial data infrastructures evolve, the vision of a Digital Earth, first articulated by Al Gore in 1998, is getting ever closer. However, there are still many challenges and open research questions. For example, data quality, provenance and heterogeneity remain an issue due to the complexity of geo-spatial data and information representation. Observational data are often inadequately semantically enriched by geo-observational information systems or spatial data infrastructures, and so they often do not fully capture the true meaning of the associated datasets. Furthermore, the data models underpinning these information systems are typically too rigid in their data representation to allow for the ever-changing and evolving nature of geo-spatial domain concepts. This impoverished approach to observational data representation reduces the ability of multi-disciplinary practitioners to share information in an interoperable and computable way. The health domain experiences similar challenges with representing complex and evolving domain information concepts. Within any complex domain (such as Earth system science or health) two categories or levels of domain concepts exist: those concepts that remain stable over a long period of time, and those concepts that are prone to change as the domain knowledge evolves and new discoveries are made. Health informaticians have developed a sophisticated two-level modelling systems design approach for electronic health documentation over many years and, with the use of archetypes, have shown how data, information, and knowledge interoperability among heterogeneous systems can be achieved. This research investigates whether two-level modelling can be translated from the health domain to the geo-spatial domain and applied to observing scenarios to achieve semantic interoperability within and between spatial data infrastructures, beyond what is possible with current state-of-the-art approaches. A detailed review of state-of-the-art SDIs, geo-spatial standards and the two-level modelling methodology was performed. A cross-domain translation methodology was developed, and a proof-of-concept geo-spatial two-level modelling framework was defined and implemented. The Open Geospatial Consortium’s (OGC) Observations & Measurements (O&M) standard was re-profiled to aid investigation of the two-level information modelling approach. An evaluation of the method was undertaken using two specific use-case scenarios. Information modelling was performed using the two-level modelling method to show how existing historical ocean observing datasets can be expressed semantically and harmonized using two-level modelling. Also, the flexibility of the approach was investigated by applying the method to an air quality monitoring scenario using a technologically constrained monitoring sensor system. This work has demonstrated that two-level modelling can be translated to the geo-spatial domain and then further developed to be used within a constrained technological sensor system, using traditional wireless sensor networks, semantic web technologies and Internet of Things based technologies. Domain-specific evaluation results show that two-level modelling presents a viable approach to achieve semantic interoperability between constrained geo-observational sensor systems and spatial data infrastructures for ocean observing and city-based air quality observing scenarios.
This has been demonstrated through the re-purposing of selected, existing geo-spatial data models and standards. However, it was found that re-using existing standards requires careful ontological analysis per domain concept, and so caution is recommended in assuming the wider applicability of the approach. While the benefits of adopting a two-level information modelling approach to geo-spatial information modelling are potentially great, it was found that translation to a new domain is complex. The complexity of the approach was found to be a barrier to adoption, especially in commercial projects where standards implementation is low on implementation roadmaps and the perceived benefits of standards adherence are low. Arising from this work, a novel set of base software components, methods and fundamental geo-archetypes has been developed. However, during this work it was not possible to form the rich community of supporters required to fully validate geo-archetypes. Therefore, the findings of this work are not exhaustive, and the archetype models produced are only indicative. The findings of this work can be used as the basis to encourage further investigation and uptake of two-level modelling within the Earth system science and geo-spatial domains. Ultimately, this work recommends further development and evaluation of the approach, building on the positive results thus far and on the base software artefacts developed to support it.
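    As a miniature illustration of the two-level approach this abstract describes, the sketch below separates a stable reference model (a generic observation class, level 1) from an archetype (a set of domain constraints that can evolve independently, level 2) applied at validation time. All class, field and archetype names are invented; this is not the thesis's geo-archetypes or the openEHR archetype formalism.

```python
# Two-level modelling in miniature: a stable reference model (level 1)
# plus an archetype of evolving domain constraints (level 2) applied at
# validation time. Names and bounds are invented illustrations.
from dataclasses import dataclass

@dataclass
class Observation:            # level 1: stable reference model
    observed_property: str
    value: float
    uom: str

# level 2: archetype constraints can change without touching level 1
SEA_TEMP_ARCHETYPE = {
    "observed_property": "sea_water_temperature",
    "uom": "Cel",
    "value_range": (-2.5, 40.0),   # assumed plausible bounds
}

def conforms(obs: Observation, archetype: dict) -> bool:
    lo, hi = archetype["value_range"]
    return (obs.observed_property == archetype["observed_property"]
            and obs.uom == archetype["uom"]
            and lo <= obs.value <= hi)

obs = Observation("sea_water_temperature", 12.7, "Cel")
print(conforms(obs, SEA_TEMP_ARCHETYPE))  # True
```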