Oceans of Tomorrow sensor interoperability for in-situ ocean monitoring
The Oceans of Tomorrow (OoT) projects, funded by the European Commission's FP7 programme, are developing a new generation of sensors supporting physical, biogeochemical and biological oceanographic monitoring. The sensors range from acoustic instruments to optical fluorometers to labs on a chip. As a result, their outputs are diverse in both format and communication methodology, and the interfaces with platforms such as floats, gliders and cabled observatories each differ. Thus, sensor…
Defining the intelligent public sector construction client
Recent efforts and aspirations to transform the delivery of major capital programmes and projects in UK public sector construction by focussing on achievement of value for money, whole life asset management and sustainable procurement have led to the adoption of integrated procurement routes characterised by a multiplicity of stakeholders with differing and often competing requirements. A study of the challenges faced by the public sector in delivering present and future major capital programmes and projects gravitates to the role of the intelligent client, and its concomitant skills and capabilities. The results of the multiple-case-study research show that the challenges of this role are especially evident at the interface between the internal organisation and the external suppliers and advisors from the private sector. The research concludes that the intelligent client role requires an individual champion with a unique set of skills, working within a supporting team and a capable organisation.
SWE bridge: software interface for plug & work instrument integration into marine observation platforms
The integration of sensor systems into marine observation platforms such as gliders, cabled observatories and smart buoys requires a great deal of effort due to the diversity of architectures present in marine acquisition systems. In recent years, important steps have been taken to improve both standardization and interoperability, i.e. the Open Geospatial Consortium's Sensor Web Enablement. This set of standards and protocols provides a well-defined framework for achieving standardized data chains. However, a significant gap remains at the lower end of the data chain, between the sensor systems and the acquisition platforms. In this work, a standards-based architecture to bridge this gap is proposed in order to achieve plug & work, standardized and interoperable acquisition systems.
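The plug & work idea above can be sketched in miniature: the platform speaks one standardized observation record, and per-instrument drivers translate vendor-specific payloads into it. This is an illustrative sketch only, not the SWE Bridge implementation; all names and the payload format are assumptions.

```python
# Illustrative sketch of a plug & work bridge: a driver registry maps
# sensor-specific payloads onto one standardized observation record,
# so the platform side is identical whatever instrument is plugged in.
# All names and formats here are hypothetical.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Observation:                 # hypothetical standardized record
    sensor_id: str
    observed_property: str         # e.g. "sea_water_temperature"
    value: float
    unit: str

# Each driver knows one vendor format and emits the common record.
DRIVERS: Dict[str, Callable[[str, str], Observation]] = {}

def register(interface: str):
    def wrap(fn):
        DRIVERS[interface] = fn
        return fn
    return wrap

@register("ctd-ascii")             # hypothetical instrument interface name
def parse_ctd(sensor_id: str, raw: str) -> Observation:
    # raw payload like "TEMP,18.42,degC"
    prop, value, unit = raw.split(",")
    names = {"TEMP": "sea_water_temperature"}
    return Observation(sensor_id, names[prop], float(value), unit)

def bridge(interface: str, sensor_id: str, raw: str) -> Observation:
    # The acquisition platform only ever calls this one entry point.
    return DRIVERS[interface](sensor_id, raw)
```

Adding a new instrument then means registering one more driver, leaving the platform code untouched, which is the essence of plug & work integration.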
Designing Web-enabled services to provide damage estimation maps caused by natural hazards
The availability of building stock inventory data and demographic information is an important requirement for risk assessment studies when attempting to predict and estimate losses due to natural hazards such as earthquakes, storms, floods or tsunamis. The better this information is provided, the more accurately damage to structures and lifelines can be predicted and the expected impacts on the population estimated. When a disaster strikes, a map is often one of the first requirements for answering questions related to location, casualties and damage zones caused by the event. Maps of appropriate scale that represent relative and absolute damage distributions may be of great importance for rescuing lives and properties, and for providing relief. However, maps of this type are often difficult to obtain during the first hours or even days after the occurrence of a natural disaster. The Open Geospatial Consortium Web Services (OWS) specifications enable access to datasets and services in shared, distributed and interoperable environments through web-enabled services. Given these advantages, in this paper we propose the use of OWS as a possible solution for issues related to suitable dataset acquisition for risk assessment studies. The design of web-enabled services was carried out using the municipality of Managua (Nicaragua) and the development of damage and loss estimation maps caused by earthquakes as a first case study. Four organizations located in different places are involved in this proposal and connected through web services, each with a specific role.
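Since the OWS specifications define map access as standardized HTTP requests, a damage-distribution map can be fetched with a plain WMS GetMap call. The sketch below builds such a request URL; the endpoint and layer name are hypothetical placeholders, and note that WMS 1.3.0 with EPSG:4326 uses latitude/longitude axis order in the BBOX.

```python
# Sketch of an OWS client request: a WMS 1.3.0 GetMap call is an HTTP
# GET with standardized query parameters. Endpoint and layer name are
# invented for illustration.
from urllib.parse import urlencode

def wms_getmap_url(endpoint: str, layer: str, bbox, width: int, height: int,
                   crs: str = "EPSG:4326", fmt: str = "image/png") -> str:
    """Build a WMS 1.3.0 GetMap request URL."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        # WMS 1.3.0 + EPSG:4326: axis order is lat,lon (min lat, min lon,
        # max lat, max lon).
        "BBOX": ",".join(str(c) for c in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return endpoint + "?" + urlencode(params)

# Hypothetical damage-estimate layer around Managua.
url = wms_getmap_url("https://example.org/ows", "managua:damage_estimate",
                     (11.9, -86.4, 12.3, -86.1), 800, 600)
```

Because the request is standard, any WMS-capable viewer can render the resulting damage map without knowing how or where the underlying datasets are stored.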
Moving Object Trajectories Meta-Model And Spatio-Temporal Queries
In this paper, a general framework for moving-object trajectories is put forward to allow independent applications that process trajectory data to benefit from a high level of interoperability and information sharing, as well as efficient answers to a wide range of complex trajectory queries. Our proposed meta-model is based on an ontology and event approach, incorporates existing representations of trajectories, and integrates new patterns such as the space-time path to describe activities in geographical space-time. We introduce recursive Region of Interest concepts and handle mobile-object trajectories with diverse spatio-temporal sampling protocols and different available sensors, for which traditional data models alone are inadequate.

Comment: International Journal of Database Management Systems (IJDMS) Vol. 4, No. 2, April 201
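The Region-of-Interest idea can be made concrete with a minimal model: a trajectory as a time-ordered sequence of space-time samples, queried against a region that is valid over a time window. This is a hedged sketch under assumed names and a simple rectangular geometry, not the paper's meta-model.

```python
# Minimal sketch of a spatio-temporal query against a Region of Interest.
# A trajectory is a time-ordered list of space-time samples; the query
# returns the samples falling inside the region during its validity
# window. Geometry and names are assumptions for illustration.
from dataclasses import dataclass
from typing import List

@dataclass
class STPoint:
    t: float   # timestamp (e.g. seconds since epoch)
    x: float
    y: float

@dataclass
class RegionOfInterest:
    xmin: float
    ymin: float
    xmax: float
    ymax: float
    t_start: float   # the region is only "of interest" in this window
    t_end: float

    def contains(self, p: STPoint) -> bool:
        return (self.xmin <= p.x <= self.xmax and
                self.ymin <= p.y <= self.ymax and
                self.t_start <= p.t <= self.t_end)

def visits(trajectory: List[STPoint], roi: RegionOfInterest) -> List[STPoint]:
    """Spatio-temporal query: samples of the trajectory inside the ROI."""
    return [p for p in trajectory if roi.contains(p)]
```

A recursive variant, as in the paper, would let a RegionOfInterest contain sub-regions queried the same way.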
An ECOOP web portal for visualising and comparing distributed coastal oceanography model and in situ data
As part of a large European coastal operational oceanography project (ECOOP), we have developed a web portal for the display and comparison of model and in situ marine data. The distributed model and in situ datasets are accessed via an Open Geospatial Consortium Web Map Service (WMS) and Web Feature Service (WFS) respectively. These services were developed independently and readily integrated for the purposes of the ECOOP project, illustrating the ease of interoperability resulting from adherence to international standards. The key feature of the portal is the ability to display co-plotted timeseries of the in situ and model data and the quantification of misfits between the two. By using standards-based web technology we allow the user to quickly and easily explore over twenty model data feeds and compare these with dozens of in situ data feeds without being concerned with the low-level details of differing file formats or the physical location of the data. Scientific and operational benefits of this work include model validation, quality control of observations, data assimilation and decision support in near real time. In these areas it is essential to be able to bring together different data streams from often disparate locations.
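The misfit quantification described above reduces, at its core, to comparing the two feeds at common timestamps. A minimal sketch of such a computation, assuming both feeds have already been parsed (via WMS/WFS) into dicts keyed by timestamp; the root-mean-square metric here is one plausible choice, not necessarily the portal's:

```python
# Sketch of a model vs. in situ misfit computation on co-located
# timeseries. Both feeds are assumed to be dicts mapping timestamp ->
# value; WMS/WFS access and parsing are omitted.
import math

def misfit_rmse(model: dict, insitu: dict) -> float:
    """Root-mean-square difference over timestamps present in both feeds."""
    common = sorted(set(model) & set(insitu))
    if not common:
        raise ValueError("no co-located samples between the two feeds")
    sq = [(model[t] - insitu[t]) ** 2 for t in common]
    return math.sqrt(sum(sq) / len(sq))
```

In practice the timestamps of model output and observations rarely coincide exactly, so a real portal would interpolate one series onto the other's time axis before computing the misfit.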
Programming patterns and development guidelines for Semantic Sensor Grids (SemSorGrid4Env)
The web of Linked Data holds great potential for the creation of semantic applications that can combine self-describing structured data from many sources, including sensor networks. Such applications build upon the success of an earlier generation of 'rapidly developed' applications that utilised RESTful APIs. This deliverable details experience, best practice, and design patterns for developing high-level web-based APIs in support of semantic web applications and mashups for sensor grids. Its main contributions are a proposal for combining Linked Data with RESTful application development, summarised through a set of design principles, and the application of these design principles to Semantic Sensor Grids through the development of a High-Level API for Observations. These are supported by implementations of the High-Level API for Observations in software, and example semantic mashups that utilise the API.
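The combination of Linked Data with RESTful design can be illustrated with one resource: an observation whose representation links to related resources (sensor, observed property) by dereferenceable URIs rather than embedding them. The vocabulary and URIs below are invented for illustration and are not the project's actual API.

```python
# Hedged sketch of a Linked Data + REST design principle: each
# observation is a web resource whose JSON-LD-style representation
# links out to related resources by URI. All URIs and property names
# here are hypothetical.
import json

BASE = "https://example.org/observations/"   # hypothetical API root

def observation_resource(obs_id: str, sensor_uri: str,
                         property_uri: str, value: float, unit: str) -> str:
    """Serialize one observation as a self-describing linked resource."""
    doc = {
        "@id": BASE + obs_id,                 # the resource's own URI
        "madeBySensor": {"@id": sensor_uri},  # a link, not an embedded record
        "observedProperty": {"@id": property_uri},
        "result": {"value": value, "unit": unit},
    }
    return json.dumps(doc)
```

A client that follows the `madeBySensor` link can discover the sensor's own description without the observation API having to duplicate it, which is the 'self-describing data' property the deliverable builds on.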
Hydrological Models as Web Services: An Implementation using OGC Standards
Presentation for HIC 2012, the 10th International Conference on Hydroinformatics, "Understanding Changing Climate and Environment and Finding Solutions", Hamburg, Germany, July 14-18, 2012.
