
    Implementing INSPIRE: the BRGM road trip (INSPIRE Conference)

    Implementing INSPIRE is not something you can address as a one-shot action. If your data evolve, your "INSPIRE maker" system should evolve too. Maintaining two different systems to expose the same data is not sustainable for most organizations, so solutions must be found to include those update processes. An efficient INSPIRE system architecture should therefore be robust enough to expose your data correctly, and flexible enough to allow for evolution. Implementing INSPIRE is thus much more than a mapping exercise to new vocabularies and features. This presentation shows the actions led by BRGM to provide geosciences data through INSPIRE. It gives feedback on past actions and technologies tested, and then draws the path BRGM currently follows to reach INSPIRE compliance. Finally, it emphasizes crucial innovations and actions that should be undertaken to make the INSPIRE road a quieter trip.

    What If?


    SensorThings API - overview and example of implementation in Theia/OZCAR

    This document presents:
    - the SensorThings API;
    - a mass movement to the SensorThings API;
    - Theia/OZCAR feedback.
    It was presented at the "Forum OGC France" event: https://github.com/opengeospatial/ForumOGCFrance/tree/main/J2I/J2I_2024/Presentation
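
    For readers unfamiliar with the standard, here is a minimal sketch of querying a SensorThings API v1.1 service. The base URL is a placeholder, not the Theia/OZCAR endpoint; the entity names (Datastreams, Observations) and query options ($top, $orderby, $expand) come from OGC SensorThings API Part 1: Sensing.

    # Minimal sketch: read the latest observation of a few datastreams from
    # an OGC SensorThings API v1.1 endpoint. The base URL is a placeholder.
    import requests

    BASE = "https://example.org/FROST-Server/v1.1"  # hypothetical endpoint

    # List Datastreams, expanding the Thing and ObservedProperty they belong to.
    resp = requests.get(
        f"{BASE}/Datastreams",
        params={
            "$top": 5,
            "$expand": "Thing($select=name),ObservedProperty($select=name)",
        },
        timeout=30,
    )
    resp.raise_for_status()

    for ds in resp.json()["value"]:
        # Fetch the most recent Observation of this Datastream.
        obs = requests.get(
            f"{BASE}/Datastreams({ds['@iot.id']})/Observations",
            params={"$top": 1, "$orderby": "phenomenonTime desc"},
            timeout=30,
        ).json()["value"]
        if obs:
            print(ds["Thing"]["name"], ds["ObservedProperty"]["name"],
                  obs[0]["phenomenonTime"], obs[0]["result"])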

    OGC Environmental Linked Features Interoperability Experiment Engineering Report

    Systems that maintain and disseminate information representing and/or related to spatial features often lack mechanisms to describe or discover how features relate to each other, to other kinds of features, and to a wide variety of related information that may be relevant. The Environmental Linked Features Interoperability Experiment (ELFIE) explored Open Geospatial Consortium (OGC) and World Wide Web Consortium (W3C) standards with the goal of establishing a best practice for exposing cross-domain links between environmental domain and sampling features. The Interoperability Experiment (IE) focused on encoding relationships between cross-domain features and linking available observations data to sampled domain features. An approach that leverages the OGC service baseline, W3C data on the web best practices, and JavaScript Object Notation for Linked Data (JSON-LD) contexts was developed and evaluated. Outcomes of the experiment demonstrate that broadly accepted web technologies for linked data can be applied using OGC services and domain data models to fill important gaps in existing environmental data systems' capabilities. While solutions were found to be capable and promising, OGC services and domain model implementations have limited utility for use in linked data applications in their current state, and the universe of persistent URIs that form the foundation of a linked data infrastructure is still small. In addition to improvement of the standards baseline and publication of linked data URIs, establishing conventions for URI dereferencing behavior and for default content given multiple options for a resource remains for future work.
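
    As a rough illustration of the kind of JSON-LD document the experiment evaluated, the sketch below links a sampling feature to the domain feature it samples and to related observations via an @context. All URIs, the choice of the SOSA vocabulary, and the term names beyond standard JSON-LD keywords are illustrative assumptions, not the ELFIE best practice itself.

    # Illustrative sketch (not the ELFIE deliverable): build a JSON-LD
    # document exposing links from a sampling feature to its sampled domain
    # feature and to observations. All URIs here are hypothetical.
    import json

    doc = {
        "@context": {
            "schema": "https://schema.org/",
            "sosa": "http://www.w3.org/ns/sosa/",
            "name": "schema:name",
            "sampledFeature": {"@id": "sosa:isSampleOf", "@type": "@id"},
            "observations": {"@id": "sosa:isFeatureOfInterestOf", "@type": "@id"},
        },
        "@id": "https://example.org/id/gauge/12345",  # hypothetical URI
        "@type": "sosa:Sample",
        "name": "Stream gauge 12345",
        # Link to the domain feature this sampling feature samples.
        "sampledFeature": "https://example.org/id/river/example-river",
        # Links to observations whose feature of interest is this gauge.
        "observations": [
            "https://example.org/id/obs/discharge-12345",
            "https://example.org/id/obs/stage-12345",
        ],
    }

    print(json.dumps(doc, indent=2))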

    Rapid response tools for operational management of seismic crisis on a border area: case-study of the Pyrenees

    The Pyrenees are a 400-km-long mountain range located in southwest Europe along the French–Spanish border, and constitute one of the most earthquake-prone regions of mainland France and Spain.

    At the "observational" level, the Pyrenean region is monitored by several seismological networks on both sides of the French–Spanish border, counting in total around 120 seismic stations of different types. Thanks to a progressive easing of the constraints associated with real-time seismology (generalization of low-cost, robust data-transfer technologies, continuously increasing data-storage capacities, etc.), a growing proportion of these stations are progressively evolving toward real-time data transmission. Moreover, a recent project called "SISPyr" (www.sispyr.eu), involving the main owners of Pyrenean seismic stations, has notably established a real-time pooling process for Pyrenean seismological data, resulting in improved coverage of the massif.

    At the "operational" level, each country has its own civil-protection organization as well as specific earthquake crisis plans. However, large earthquakes in the Pyrenees can impact two (or all three) countries, and systemic cross-border interactions are multiple (transport networks, energy lifelines, hospital access, cross-border populations, etc.).

    Rapid response overview
    Experience of past earthquakes, as well as "earthquake" civil-protection exercises, underlines the need for crisis managers to have at their disposal rapid-response tools able to assess the consequences caused by earthquakes, even for moderate events. The SISPyr partnership has developed tools to meet these operational needs, in order to automatically and quickly (within 15 min) produce maps of seismic ground motion. These "Shakemaps" integrate both real-time seismological data coming from observatories and citizen data collected over the internet (web questionnaires). Exploratory tracks are also being considered to enrich the feedback from the field through crowd-sourcing techniques, relying on distributed "citizen" sensors or on social networks. To go further in taking into account the operational requirements of seismic crisis management, work is being done to provide the authorities with a quick assessment of the human toll (potential victims, damage, needs for shelter) that may guide their actions, structured as reports dedicated to civil-protection teams.

    Need for geospatial ICT support

    Multi-actor context
    The Pyrenean region has several seismological networks on both sides of the Franco-Spanish border: CEA-LDG, OMP and BRGM on the French side (with stations belonging to the national seismic monitoring network, RéNaSS, and to the French permanent accelerometric network, RAP), and the Spanish and Catalan seismological surveys (IGN and ICGC respectively) on the Spanish side. A broadband station in Andorra, managed by the Andorran Studies Institute (IEA), should also be noted. In case of an earthquake, several of these institutes produce their own magnitude/location estimates, while the alert itself is assigned in France to CEA-LDG and in Spain to IGN. At the same time, in case of large earthquakes, international organizations such as JRC and CSEM (Europe), and even USGS (US), produce information bulletins, which are not really followed by the national/regional crisis-management community. Moreover, in France, internet citizen data (macroseismic intensities) are collected by yet another institution, BCSF; in Spain these citizen data are collected by both ICGC and IGN. The SISPyr Shakemap system is triggered by alerts coming from IGN (disregarding those coming from CEA-LDG), and uses IGN, ICGC, OMP and BRGM seismic data, as well as IGN, ICGC and BCSF citizen data.

    Input data interoperability
    Due to the multiplicity of data producers, the interoperability of input data is critical. Regarding the real-time collection of instrumental data, different protocols are used (NAQS, SeedLink and Scream!), corresponding to well-known or accepted de facto standards in the scientific community. In any case, the goal was to converge on the input file formats and standards required by Shakemap.

    Output data
    Shakemap produces intensity maps, as images or KMZ files. This output format is not really adapted to crisis managers, who work with their own GIS platforms. Moreover, intensity maps are still quite difficult to interpret in local and regional crisis-management centers for people who are not familiar with seismic risk. The next step is therefore to automatically produce "human-toll bulletins" rating the severity of the earthquake according to the estimated numbers of potential victims or unsheltered populations. These bulletins follow a traffic-light color code. Would the meaning of the red level be the same for USGS, the European Commission, and the Spanish and French civil protections? How coherent is it with other natural risks (cf. the discussions about natural-risk zone maps in INSPIRE)?

    Limits
    - Is web dissemination of bulletins and maps well adapted to the crisis-management community?
    - How can information about output uncertainties be communicated in an understandable and interoperable way? This issue also raises the question of the responsibility of those broadcasting these data with respect to crisis managers.
    - In case of a significant earthquake, international organizations produce information bulletins, which are not really followed by the national/regional crisis-management community.
    - Input data interoperability: de facto standards are used, but are they international or in-house?
    - Output data: how can reuse of the outputs be enabled?
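
    To make the traffic-light idea concrete, here is a minimal sketch of how a bulletin level could be derived from estimated human tolls. The thresholds and parameter names are invented for illustration only; they are not the values used by SISPyr.

    # Hypothetical sketch of a "human-toll bulletin" classifier following a
    # traffic-light color code. Thresholds are invented for illustration;
    # they are NOT the SISPyr values.
    def bulletin_level(est_victims: int, est_unsheltered: int) -> str:
        """Map estimated human tolls onto a traffic-light bulletin level."""
        if est_victims >= 100 or est_unsheltered >= 10_000:  # assumed cut-offs
            return "red"     # severe event: large-scale response expected
        if est_victims >= 1 or est_unsheltered >= 100:
            return "orange"  # moderate event: local response, close monitoring
        return "green"       # minor event: no significant consequences expected

    print(bulletin_level(est_victims=0, est_unsheltered=250))  # -> orange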

    Integration of European boreholes data and their dissemination through international interoperable standards

    The Geological Information and Modelling Thematic Core Service (TCS) of EPOS is designed as an efficient and sustainable access system for multi-scale geological datasets. The TCS develops, and benefits from, the synergy between the existing data infrastructures of the Geological Surveys of Europe (EuroGeoSurveys / EGDI) and the large amount of information produced by research organizations and the international drilling community. The integration of distributed infrastructure components gives access to a broad range of resources, including geological maps, borehole data, borehole-associated observations (borehole log data, groundwater level, groundwater quality…), archived information on physical material (samples, cores), geological models (3D, 4D), geohazards, geophysical data such as active seismic data, and other analyses of rocks, soils and minerals. In this presentation, we focus on the European Borehole Index and the work done since the beginning of the project: first, specifying an interoperable data exchange mechanism based on international standards (such as INSPIRE and OGC) implemented by all TCS data providers; then collecting this information from the data providers, quality-checking it, and disseminating it from the TCS Central Node, acting as a service provider to the EPOS community, using the same interoperable standards. We will elaborate on the problems encountered in managing large amounts of data and on the solutions we tested and applied. We will present how the Borehole Index was specified to guarantee its compliance with the INSPIRE European Directive, and how the OGC community was engaged, through its Geoscience Domain Working Group, to improve and promote technologies for describing and sharing geoscience data. In addition, we will present the expected workflows for integrating other existing and new data, such as 3D/4D models, and show how our work fits into the EPOS system to create an efficient and comprehensive multidisciplinary research platform for the Earth sciences in Europe and beyond.
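
    To make the collect / quality-check / disseminate workflow concrete, here is a minimal sketch of a quality check applied to harvested borehole index records before republication from a central node. The endpoint, record fields and validation rules are hypothetical illustrations, not the actual EPOS Borehole Index schema.

    # Hypothetical sketch of the collect / quality-check step for a borehole
    # index. Field names and rules are illustrative, not the EPOS schema.
    import requests

    PROVIDER_URL = "https://example.org/borehole-index.json"  # placeholder

    REQUIRED = ("id", "name", "location", "totalDepth")       # assumed fields

    def check(record: dict) -> list[str]:
        """Return a list of quality problems found in one index record."""
        problems = [f"missing field: {f}" for f in REQUIRED if f not in record]
        if "location" in record:
            lon, lat = record["location"]
            if not (-180 <= lon <= 180 and -90 <= lat <= 90):
                problems.append("location outside valid lon/lat range")
        if record.get("totalDepth", 0) < 0:
            problems.append("negative total depth")
        return problems

    records = requests.get(PROVIDER_URL, timeout=30).json()
    accepted = [r for r in records if not check(r)]
    print(f"{len(accepted)}/{len(records)} records pass the quality check")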

    Playing "OGC SensorThings API Part 1: Sensing" with several French research organizations and one research infrastructure

    One of the major goals of the European Long-Term Ecological Research (eLTER) network and the upcoming eLTER Research Infrastructure (eLTER RI) is to provide reliable and quality-controlled long-term data for scientific analysis as well as for the assessment of environmental policy impacts. For this purpose, eLTER has designed, implemented and operates a federated data infrastructure called the eLTER Information System. This e-infrastructure offers data stored in existing partner data systems, harmonised by a central discovery portal and federated data access components, providing a common information management infrastructure for making environmental data available from distributed resources provided by the contributing LTER national networks. Designing, building and optimising such a pan-European environmental data infrastructure is a lengthy and complex process based on a set of criteria defined by user needs, stakeholder requirements, and general service and technology best practices. To further improve and extend the eLTER Information System, user needs have recently been collected by (a) targeted interviews with selected stakeholders to identify the scope and background of the data and ICT requirements, (b) workshops mapping user requirements based on personas derived from the interviews, and (c) analysis work extracting so-called user stories. The requirements collections are used to derive functional (i.e. the behaviour of essential features of the system) and non-functional (i.e. the general characteristics of the system) requirements for the IT infrastructure and services. These collected requirements revolve around the development of workflows for the ingestion, curation and publication of data objects, including the creation, harvesting, discovery and visualisation of metadata, as well as providing means to support the analysis of these datasets and communicating study results. This presentation will provide an overview of the current stage of the data infrastructure and its major components, give an outlook on future developments, and discuss the technical and scientific challenges of building the eLTER Information System.
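
    As a rough sketch of what the harmonisation layer of such a federated infrastructure does, the following collects dataset metadata from several partner catalogues and normalises it into one discovery index. The endpoints, field names and mapping are invented for illustration; they do not describe the actual eLTER components.

    # Hypothetical sketch of a central discovery portal harvesting metadata
    # from distributed partner catalogues. Endpoints and field mappings are
    # invented for illustration; this is not the eLTER implementation.
    import requests

    PARTNER_CATALOGUES = [                       # placeholder endpoints
        "https://example.org/network-a/datasets.json",
        "https://example.org/network-b/datasets.json",
    ]

    def harvest(url: str) -> list[dict]:
        """Fetch one partner catalogue and map records to a common schema."""
        records = requests.get(url, timeout=30).json()
        return [
            {
                "title": r.get("title") or r.get("name", "untitled"),
                "source": url,                   # provenance of the record
                "keywords": r.get("keywords", []),
            }
            for r in records
        ]

    index = [rec for url in PARTNER_CATALOGUES for rec in harvest(url)]
    print(f"discovery index holds {len(index)} dataset records")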

    More Practical INSPIRE Practice
