22 research outputs found

    EU-Raw Materials Intelligence Capacity Platform (EU-RMICP) – Technical system specification

    The EU-Raw Materials Intelligence Capacity Platform (EU-RMICP) integrates metadata on data sources related to primary and secondary mineral resources and gives end users access to expertise on the methods and tools used in mineral intelligence. The system can provide relevant answers of the type 'how to proceed for …' to almost any question related to mineral resources, across the whole supply chain from prospecting to recycling, taking into account the environmental, political and social dimensions. EU-RMICP is based on an ontology of the mineral-resources domain (coupled with more generic cross-functional ontologies covering commodities, time and space), which represents the domain of the questions of the users (experts and non-experts). The user navigates the ontology through a Dynamic Decision Graph (DDG), which allows him/her to discover the solutions he/she is looking for without having to formulate any question. The system is coupled with an RDF triple store (a database storing the ontologies) and with factSheets, docSheets and flowSheets (specific formatted forms) related to methods and documentation, scenarios and metadata. JRC.B.6-Digital Economy
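    The triple-store idea behind such a system can be illustrated with a toy, pure-Python stand-in: facts live as (subject, predicate, object) triples and a user "question" becomes a pattern match against them. All method, stage and sheet names below are invented for illustration; a real deployment would use an RDF store queried with SPARQL.

```python
# Toy stand-in for an RDF triple store: a list of (subject, predicate,
# object) triples plus wildcard pattern matching. Names are invented.

triples = [
    ("GeochemicalSampling", "appliesToStage", "Prospecting"),
    ("GeochemicalSampling", "documentedBy", "factSheet-042"),
    ("MagneticSeparation", "appliesToStage", "Recycling"),
    ("MagneticSeparation", "documentedBy", "flowSheet-007"),
]

def match(pattern, store):
    """Return the triples matching a pattern; None acts as a wildcard."""
    s, p, o = pattern
    return [t for t in store
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# "How to proceed for prospecting?" -> which methods apply to that stage?
methods = [s for s, _, _ in match((None, "appliesToStage", "Prospecting"), triples)]
print(methods)  # ['GeochemicalSampling']
```

    The same pattern-match shape (subject?, predicate?, object?) is what a SPARQL basic graph pattern expresses against a real triple store.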

    Modelling Inventory and Knowledge Management System of the European Commission (MIDAS)

    The Modelling Inventory and Knowledge Management System of the European Commission (MIDAS) is a Commission-wide knowledge management tool for modelling, enabling enhanced transparency and traceability of the models in use for EC policy making. It forms an integral part of the Competence Centre on Modelling (CC-MOD) of the Joint Research Centre of the European Commission (JRC). This document describes MIDAS by providing a bird's-eye view of its content, architecture and functionality, and by identifying the benefits of the system for the organisation and in the context of the Better Regulation Agenda. JRC.I.2-Foresight, Modelling, Behavioural Insights & Design for Policy

    Semantic Text Analysis tool: SeTA

    An ever-growing number and length of documents, a growing number and depth of topics covered by legislation, and ever-new phrases with slowly shifting meanings all contribute to making policy analysis increasingly complex. As a consequence, human policy analysts and policy developers face growing entanglement at both the content and semantic levels. To address several of these issues, the JRC has developed a central pilot tool, called AI-KEAPA, to support policy analysis and development in any domain. Recent developments in big data, machine learning and especially natural language processing make it possible to convert the unfathomable complexity of many hundreds of thousands of documents into a normalised, high-dimensional vector space that preserves the knowledge. Unstructured text in document corpora and big-data sources, until recently treated as a mere archive, is quickly becoming a core source of analytical information, with text-mining methods used to extract qualitative and quantitative data. Semantic analysis allows us to extract better information for policy analysis from metadata, titles and abstracts than from structured, human-entered descriptions. This digital assistant supports document search and extraction across many different sources, as well as discovery of phrase meaning, context and temporal development. It can recommend the most relevant documents, including their semantic and temporal interdependencies. Most importantly, it helps burst knowledge bubbles and speeds up learning in new domains. In this way we hope to mainstream artificial intelligence into policy support. The tool is now fit for purpose. It was thoroughly tested in real-life conditions for about two years, mainly in the area of legislative impact assessments for policy formulation, but also in other domains such as large data-infrastructure analysis, agri-environmental measures and natural disasters, some of which are detailed in this document.
    This approach reinforces the strategic JRC focus on the application of scientific analysis and development. The service adds to the JRC's competence and central position in semantic reasoning for policy analysis, active information recommendation, and inferred knowledge in policy design and development. JRC.I.3-Text and Data Mining
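    The document-to-vector idea can be sketched in a few lines. This is a minimal bag-of-words stand-in for the learned, high-dimensional embeddings a tool like SeTA would actually use; the documents and query below are invented examples.

```python
import math
from collections import Counter

# Documents become term-count vectors and "semantic" search is
# nearest-neighbour lookup by cosine similarity. Real systems replace
# the bag-of-words step with learned embeddings.

def vectorize(text):
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

docs = {
    "impact": "impact assessment of proposed legislation",
    "floods": "natural disasters and flood risk mapping",
}
query = vectorize("legislation impact assessment")
best = max(docs, key=lambda name: cosine(query, vectorize(docs[name])))
print(best)  # impact
```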

    Describing models in context – A step towards enhanced transparency of scientific processes underpinning policy making

    The transparency and reproducibility of the scientific evidence underpinning policy are crucial to building and retaining trust. This paper describes an application that takes a significant step towards enhanced transparency of scientific models used for policy making: the Modelling Inventory Database and Access Services (MIDAS), developed by the Joint Research Centre (JRC), describes models in use at the JRC in their scientific context by linking them to other models, to related data, to supported policies and to domain experts. To share the resulting knowledge effectively across different domains and with policy makers within the institution, MIDAS represents the resulting complex network of relations and entities through visual aids based on visual analytics and data narratives. This paper describes not just the application, as a contribution to the emerging dialogue on best practice for model documentation, but also the process, the main challenges we met, and the approach taken to overcome them.
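    The typed-link structure described above can be sketched as a small registry that links each model to related models, data, policies and experts, and lets that context be traversed. All model, dataset, policy and expert names below are invented placeholders, not actual MIDAS entries.

```python
# Toy registry of models with typed links, traversable for context.
# All entries are invented; MIDAS itself holds far richer metadata.

registry = {
    "model-A": {
        "linked_models": ["model-B"],
        "data": ["dataset-1"],
        "policies": ["policy-X"],
        "experts": ["expert-1"],
    },
    "model-B": {
        "linked_models": [],
        "data": ["dataset-2"],
        "policies": ["policy-Y"],
        "experts": ["expert-2"],
    },
}

def context(model, depth=1):
    """Collect entries reachable from `model` up to `depth` model links."""
    entry = registry[model]
    found = {model: entry}
    if depth > 0:
        for linked in entry["linked_models"]:
            found.update(context(linked, depth - 1))
    return found

ctx = context("model-A")
print(sorted(ctx))  # ['model-A', 'model-B']
```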

    JRC Data Policy

    The work on the JRC Data Policy followed the task identified in the JRC Management Plan 2014 to develop a dedicated data policy to complement the JRC Policy on Open Access to Scientific Publications and Supporting Guidance, and to promote open access to research data in the context of Horizon 2020. Important policy commitments and the relevant regulatory basis within the European Union and the European Commission include: the Commission Decision on the reuse of Commission documents, the Commission communication on better access to scientific information, the Commission communication on a reinforced European research area partnership for excellence and growth, the Commission recommendation on access to and preservation of scientific information, and the EU implementation of the G8 Open Data Charter. JRC.H.6-Digital Earth and Reference Data

    Assessing Climate Change Vulnerability in the Arctic Using Geographic Information Services in Spatial Data Infrastructures

    Spatial Data Infrastructures (SDIs) are emerging at regional, national and international levels as internet-based information infrastructures that allow the exchange and use of distributed geoinformation. Application areas that require managing information distributed across institutions and borders, such as environmental protection and hazard and risk prevention, are expected to benefit greatly from SDI developments. However, only a few attempts have been undertaken to use SDIs for climate change assessment. Based on a generic methodology for the quantification of vulnerability to climate change, this paper presents a concept, a software architecture and a prototypical implementation of an interdisciplinary SDI that provides scientists and stakeholders at various levels with tools to communicate, assess and enhance information about vulnerability to climate change. JRC.H.6-Spatial data infrastructures

    Web-based Assessment and Decision Support Technology

    This paper presents and discusses an approach to web-based assessment and decision support that uses multi-criteria evaluation methodology and is to be integrated into an interoperable service infrastructure. The research is embedded in the EU-funded BALANCE project, which assesses possible impacts of, and vulnerability to, climate change. The results will be incorporated into an Assessment and Decision Support System (ADSS) for the Arctic, which shall raise awareness among stakeholders in the BALANCE study area about possible climate change impacts on their environment and way of life, and support decisions concerning possible adaptation and mitigation strategies. The paper elucidates the concept of geoprocessing and service chaining with the help of a use case in the field of reindeer herding. JRC.H.7-Land management and natural hazards

    Seeing the forest through the trees: A review of integrated environmental modelling tools

    Today’s interconnected socio-economic and environmental challenges require the combination and reuse of existing integrated modelling solutions. This paper contributes to this research area by reviewing a wide range of currently available frameworks, systems and emerging technologies for integrated modelling in the environmental sciences. Based on a systematic review of the literature, we group related studies and papers into viewpoints and elaborate on their shared and diverging characteristics. Our analysis shows that component-based modelling frameworks and scientific workflow systems have traditionally been used to solve technical integration challenges, but that ultimately the appropriate framework or system depends strongly on the particular environmental phenomenon under investigation. The study also shows that, in general, individual integrated modelling solutions do not benefit from components and models provided by others. It is this island (or silo) situation that results in low levels of model reuse in multi-disciplinary settings, mainly because the field as such is highly complex and diverse. A single integrated modelling solution capable of dealing with any environmental scenario seems unaffordable because of the great variety of data formats, models, environmental phenomena, stakeholder networks, user perspectives and social aspects. Nevertheless, we conclude that combining modelling tools that address complementary viewpoints – such as service-based approaches combined with scientific workflow systems, or resource modelling on top of virtual research environments – could lead to sustainable information systems that advance model sharing, reuse and integration. Next steps for improving this form of multi-disciplinary interoperability are sketched. JRC.H.6-Digital Earth and Reference Data
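    The component-based integration style the review discusses can be sketched as a chain of model components sharing a uniform interface, so that one component's outputs feed the next. The component names and formulas below are invented placeholders, not models from the reviewed literature.

```python
# Each component takes a dict of inputs and returns a dict of outputs;
# a chain runs them in order over a shared state. Formulas are toy
# stand-ins chosen only to show the wiring.

def rainfall_model(inputs):
    return {"runoff_mm": inputs["rain_mm"] * inputs["runoff_coeff"]}

def flood_model(inputs):
    return {"flood_depth_m": inputs["runoff_mm"] / 1000.0 * inputs["catchment_factor"]}

def chain(components, state):
    """Run components in order, merging each output into shared state."""
    for component in components:
        state.update(component(state))
    return state

result = chain(
    [rainfall_model, flood_model],
    {"rain_mm": 50.0, "runoff_coeff": 0.4, "catchment_factor": 2.0},
)
print(result["flood_depth_m"])  # 0.04
```

    Scientific workflow systems generalise exactly this shape: nodes with declared inputs and outputs, wired into a directed graph rather than a fixed list.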

    Geoprocessing in Spatial Data Infrastructures – The Next Generation

    Today, spatial data infrastructures consist mainly of GI services that offer base functionalities such as searching, accessing and visualising geographic data. While this makes it easier to use geographic data, GI services cannot yet be combined to provide analyses based on specific user requirements. For this, additional GI services and mechanisms for orchestrating them are required. Standards in this area are currently under development, e.g. the OGC discussion paper on Web Processing Services (WPS). In this paper, we present the design and implementation of a geo-processing service based on the WPS specification. In the conclusion, we highlight the problems encountered as well as open research questions and possible next steps in the area of distributed geoprocessing. JRC.H.6-Spatial data infrastructures
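    Calling such a geoprocessing service typically reduces to issuing OGC key-value-pair (KVP) requests over HTTP. The sketch below builds WPS 1.0.0-style request URLs; the endpoint and the `buffer` process identifier are hypothetical examples, not references to an actual service.

```python
from urllib.parse import urlencode

# Build WPS-style KVP request URLs. The service/version/request
# parameters follow the OGC KVP convention; endpoint and process
# identifier are invented for illustration.

ENDPOINT = "https://example.org/wps"

def wps_url(request, **params):
    query = {"service": "WPS", "version": "1.0.0", "request": request, **params}
    return ENDPOINT + "?" + urlencode(query)

capabilities = wps_url("GetCapabilities")
execute = wps_url("Execute", identifier="buffer", datainputs="distance=500")
print(capabilities)
```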

    SOCRATES Manual

    SOCRATES (SOcial multi-CRiteria AssessmenT of European policieS) is a software tool explicitly designed for impact assessment problems. Three main components constitute the core of SOCRATES: multi-criteria analysis, equity analysis and sensitivity analysis. The impact matrix may include quantitative (including stochastic and/or fuzzy uncertainty) and qualitative (ordinal and/or linguistic) measurements of the performance of an alternative with respect to an evaluation criterion. The tool supplies a ranking of the alternatives according to the set of evaluation criteria using a non-compensatory mathematical aggregation rule. Equity analysis requires as input a set of social actors and the impact of the alternatives on these social actors. The objective of sensitivity analysis is to check whether the ranking provided is stable and to determine which of the input parameters most influence the model output. All of the information produced by local and global sensitivity analyses is synthesised into simple graphics.
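    One simple non-compensatory rule in the same spirit can be sketched as pairwise outranking: alternatives are compared criterion by criterion, so a weak score on one criterion cannot be averaged away by strong scores elsewhere. This is not the specific aggregation rule SOCRATES implements; the impact matrix below is invented for illustration.

```python
# Pairwise outranking as a toy non-compensatory aggregation rule:
# each alternative is scored by how many criteria it strictly wins
# against each rival; no averaging across criteria takes place.

impact_matrix = {  # alternative -> scores per criterion (higher is better)
    "A": [5, 4, 3],
    "B": [2, 2, 2],
    "C": [4, 5, 1],
}

def outranking_score(alt, others, matrix):
    """Count criteria on which `alt` strictly beats each other alternative."""
    wins = 0
    for other in others:
        wins += sum(a > b for a, b in zip(matrix[alt], matrix[other]))
    return wins

alts = list(impact_matrix)
ranking = sorted(
    alts,
    key=lambda a: outranking_score(a, [o for o in alts if o != a], impact_matrix),
    reverse=True,
)
print(ranking)  # ['A', 'C', 'B']
```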