
    Computational Ontologies and Information Systems II: Formal Specification

    This paper extends the study of ontologies begun in Part I (Volume 14, Article 8) in the context of Information Systems. The basic foundations of computational ontologies presented in Part I are carried forward to formal specifications. The paper reviews the formalisms, languages, and tools for specifying and implementing computational ontologies. Directions for future research are also provided.
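    One flavour of formal specification the survey covers is an ontology expressed as a set of logical axioms over which inferences can be drawn. The sketch below is illustrative only (the class names and the axiom set are invented, not taken from the paper): it encodes subclass axioms as pairs and computes the transitive closure of subsumption, a minimal example of the kind of reasoning a formal specification enables.

```python
def subclass_closure(axioms):
    """Transitive closure of subClassOf axioms.

    axioms: iterable of (subclass, superclass) pairs.
    Returns the set of all entailed (subclass, superclass) pairs.
    """
    closure = set(axioms)
    changed = True
    while changed:
        changed = False
        for (a, b) in list(closure):
            for (c, d) in list(closure):
                # Transitivity: a <= b and b <= d entail a <= d.
                if b == c and (a, d) not in closure:
                    closure.add((a, d))
                    changed = True
    return closure

# Hypothetical axioms: Dog is a Mammal, Mammal is an Animal.
axioms = [("Dog", "Mammal"), ("Mammal", "Animal")]
inferred = subclass_closure(axioms)
assert ("Dog", "Animal") in inferred  # entailed by transitivity
```

    Real ontology languages such as OWL provide far richer constructors (properties, restrictions, disjointness), but the pattern is the same: axioms in, entailments out.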

    BIM semantic-enrichment for built heritage representation

    In the built heritage context, BIM has shown difficulties in representing and managing the large and complex body of knowledge related to non-geometrical aspects of heritage. Within this scope, this paper focuses on a domain-specific semantic enrichment of the BIM methodology, aimed at fulfilling the semantic representation requirements of built heritage through Semantic Web technologies. To develop this semantic-enriched BIM approach, the research relies on the integration of a BIM environment with a knowledge base created through information ontologies. The result is a knowledge base system - and a prototypal platform - that enhances the semantic representation capabilities of BIM applied to architectural heritage processes. It addresses the issue of knowledge formalization in cultural heritage informative models, favouring a deeper comprehension and interpretation of all aspects of the building. Its open structure allows future research to customize, scale, and adapt the knowledge base to different typologies of artefacts and heritage activities.
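    The core idea of semantic enrichment can be sketched as attaching non-geometrical heritage facts to BIM element identifiers in an external knowledge base. The snippet below is a hypothetical sketch, not the paper's platform: the element GUID, predicate names, and values are all invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class HeritageFact:
    predicate: str
    value: str

@dataclass
class KnowledgeBase:
    # Maps a BIM element identifier (e.g. an IFC GUID) to its heritage facts.
    facts: dict = field(default_factory=dict)

    def enrich(self, guid, predicate, value):
        self.facts.setdefault(guid, []).append(HeritageFact(predicate, value))

    def query(self, guid):
        return self.facts.get(guid, [])

kb = KnowledgeBase()
kb.enrich("wall-001", "constructionPeriod", "15th century")
kb.enrich("wall-001", "materialTechnique", "opus incertum")
assert len(kb.query("wall-001")) == 2
```

    In a Semantic Web setting these facts would be RDF triples governed by an ontology; the dictionary stands in for that triple store here.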

    State-of-the-art on evolution and reactivity

    This report starts, in Chapter 1, by outlining aspects of querying and updating resources on the Web and on the Semantic Web, including the development of query and update languages to be carried out within the Rewerse project. From this outline, it becomes clear that several existing research areas and topics are of interest for this work in Rewerse. In the remainder of this report we present state-of-the-art surveys in a selection of such areas and topics. More precisely: in Chapter 2 we give an overview of logics for reasoning about state change and updates; Chapter 3 is devoted to briefly describing existing update languages for the Web, and also for updating logic programs; in Chapter 4 event-condition-action rules, both in the context of active database systems and in the context of semistructured data, are surveyed; in Chapter 5 we give an overview of some relevant rule-based agent frameworks.
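    The event-condition-action (ECA) rules surveyed in Chapter 4 follow a simple contract: when an event occurs, evaluate a condition against the current state, and if it holds, execute an action. A minimal sketch of that pattern (the event name, state fields, and action are invented for illustration):

```python
class ECARule:
    """On `event`, if `condition(state)` holds, run `action(state, log)`."""
    def __init__(self, event, condition, action):
        self.event, self.condition, self.action = event, condition, action

class RuleEngine:
    def __init__(self):
        self.rules = []
        self.log = []

    def register(self, rule):
        self.rules.append(rule)

    def signal(self, event, state):
        # Fire every matching rule whose condition holds for this state.
        for rule in self.rules:
            if rule.event == event and rule.condition(state):
                rule.action(state, self.log)

engine = RuleEngine()
engine.register(ECARule(
    event="resource_updated",
    condition=lambda s: s["version"] > 1,
    action=lambda s, log: log.append(f"re-index {s['uri']}"),
))
engine.signal("resource_updated", {"uri": "http://example.org/doc", "version": 2})
assert engine.log == ["re-index http://example.org/doc"]
```

    Active databases and reactive Web languages differ mainly in where events come from and what actions may do, not in this basic shape.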

    D5.2: Digital-Twin Enabled multi-physics simulation and model matching

    This deliverable presents a report on the developed actions and results concerning Digital-Twin-enabled multi-physics simulations and model matching. Enabling meaningful simulations within new human-infrastructure interfaces such as Digital Twins is paramount. Accessing the power of simulation opens manifold new ways for observation, understanding, analysis, and prediction of the numerous scenarios that an asset may face. As a result, managers gain countless ways of acquiring synthetic data and can ultimately make better, more informed decisions. The tool MatchFEM is conceived as a fundamental part of this endeavour. From a broad perspective, the tool is aimed at contextualizing information between multi-physics simulations and broader information constructs such as Digital Twins, in which 3D geometries, measurements, simulations, and asset management coexist. This report provides guidance for generating comprehensive and adequate initial conditions of assets, to be used throughout their life span on a Digital Twin basis. From a more specific focus, this deliverable presents a set of exemplary recommendations for the development of DT-enabled load tests of assets in the form of a white paper. The deliverable also belongs to a broader suite of documents in WP5 of the Ashvin project, in which measurements, models, and assessments are described thoroughly.
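    At its simplest, model matching means tuning a simulation parameter so that simulated responses agree with sensor measurements. The toy sketch below is not MatchFEM; the linear deflection model, the stiffness parameter `k`, and the candidate grid are all assumptions made for illustration of the matching idea.

```python
# Toy "model matching": find the stiffness k whose simulated deflections
# best fit the measured ones, by least squares over a candidate grid.

def simulate(load, k):
    return load / k  # toy linear model: deflection = load / stiffness

def match(loads, measured, candidates):
    def error(k):
        return sum((simulate(p, k) - m) ** 2 for p, m in zip(loads, measured))
    return min(candidates, key=error)

loads = [10.0, 20.0, 30.0]
measured = [0.5, 1.0, 1.5]          # consistent with k = 20
best_k = match(loads, measured, [10.0, 15.0, 20.0, 25.0])
assert best_k == 20.0
```

    A real finite-element updating workflow replaces the toy model with an FE solver and the grid search with a proper optimizer, but the objective - minimize the simulation-measurement mismatch - is the same.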

    A posteriori metadata from automated provenance tracking: Integration of AiiDA and TCOD

    In order to make the results of computational scientific research findable, accessible, interoperable, and re-usable, it is necessary to decorate them with standardised metadata. However, a number of technical and practical challenges make this difficult to achieve in practice. Here the implementation of a protocol is presented to tag crystal structures with their computed properties, without the need for human intervention to curate the data. This protocol leverages the capabilities of AiiDA, an open-source platform to manage and automate scientific computational workflows, and TCOD, an open-access database storing computed materials properties using a well-defined and exhaustive ontology. Based on these, the complete procedure to deposit computed data in the TCOD database is automated. All relevant metadata are extracted from the full provenance information that AiiDA tracks and stores automatically while managing the calculations. Such a protocol also enables reproducibility of scientific data in the field of computational materials science. As a proof of concept, the AiiDA-TCOD interface is used to deposit 170 theoretical structures together with their computed properties and their full provenance graphs, consisting of more than 4600 AiiDA nodes.
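    The key mechanism is that metadata need not be entered by hand: walking the provenance graph backwards from a result node recovers every setting that produced it. The sketch below is a hypothetical illustration, not the AiiDA API; node identifiers, node types, and attribute names are invented.

```python
# Each provenance node records its type, its input links, and any attributes.
# Walking backwards from a computed structure gathers all calculation
# settings automatically - the "a posteriori metadata" of the title.
nodes = {
    "structure_7": {"type": "StructureData", "inputs": ["calc_3"]},
    "calc_3": {"type": "Calculation", "inputs": ["params_1"],
               "attrs": {"code": "quantum-espresso", "cutoff_eV": 400}},
    "params_1": {"type": "ParameterData", "inputs": [], "attrs": {"xc": "PBE"}},
}

def collect_metadata(node_id, graph):
    meta = dict(graph[node_id].get("attrs", {}))
    for parent in graph[node_id]["inputs"]:
        meta.update(collect_metadata(parent, graph))
    return meta

meta = collect_metadata("structure_7", nodes)
assert meta["xc"] == "PBE" and meta["code"] == "quantum-espresso"
```

    In the real protocol, the collected metadata are then mapped onto the TCOD ontology before deposition.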

    Improving knowledge about the risks of inappropriate uses of geospatial data by introducing a collaborative approach in the design of geospatial databases

    Nowadays, the increased availability of geospatial information is a reality that many organizations, and even the general public, are trying to turn into financial benefit; the reusability of datasets is now a viable alternative that can help organizations achieve cost savings. The quality of these datasets may vary and be debatable depending on the usage context. The issue of geospatial data misuse becomes even more important because of the disparity between the expertises of geospatial data end-users. Managing the risks of geospatial data misuse has been the subject of several studies over the past fifteen years. In this context, several approaches have been proposed to address these risks: some are preventive, while others are palliative and manage the risk after its consequences have occurred; however, these approaches are often based on ad hoc, non-systemic initiatives. Thus, during the design process of a geospatial database, risk analysis is not always carried out in accordance with the principles of requirements engineering or with the recommendations of ISO standards. In this thesis, we hypothesize that it is possible to define a new preventive approach for identifying and analysing the risks associated with inappropriate uses of geospatial data. We believe that the expertise and knowledge held by experts (i.e. geo-IT experts) and by professional users of geospatial data within their institutional roles (i.e. application-domain experts) are key elements in assessing the risks of misuse of this data; hence the importance of enriching that knowledge. We therefore review the design process of geospatial databases and propose a collaborative, user-centric approach to requirements analysis. Under this approach, expert and professional users are involved in a collaborative process that favours the a priori identification of inappropriate use cases. Then, reviewing research in risk analysis, we propose a systematic integration of the risk-analysis process into the design of geospatial databases via the Delphi technique. Finally, still within a collaborative approach, an ontological risk repository is proposed to enrich knowledge about the risks of data misuse and to disseminate this knowledge to designers, developers, and end-users. The approach is implemented as a web platform to demonstrate its feasibility and put the concepts to work in a concrete prototype.
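    The Delphi technique mentioned above collects expert judgements in successive rounds, feeding results back until the panel converges. A minimal sketch of that consensus loop (the rating scale, the round data, and the convergence threshold are illustrative assumptions, not the thesis's protocol):

```python
from statistics import mean, stdev

def delphi_consensus(rounds, threshold=0.5):
    """rounds: list of rating lists, one per Delphi round.

    Returns (round reaching consensus, consensus score), where consensus
    means the spread of ratings (sample std. dev.) falls below `threshold`.
    """
    for i, ratings in enumerate(rounds, start=1):
        if stdev(ratings) <= threshold:
            return i, mean(ratings)
    return None, mean(rounds[-1])  # no consensus: report last round's mean

# Four experts rate one misuse risk on a 1-10 scale, over three rounds.
rounds = [
    [2, 5, 9, 4],   # round 1: wide disagreement
    [4, 5, 6, 4],   # round 2: narrowing after feedback
    [5, 5, 5, 4],   # round 3: near-consensus
]
round_no, score = delphi_consensus(rounds)
assert round_no == 3 and score == 4.75
```

    In the thesis's setting, each converged score would be stored in the ontological risk repository alongside the use case it qualifies.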

    An Editorial Workflow Approach For Collaborative Ontology Development

    The widespread use of ontologies in recent years has raised new challenges for their development and maintenance. Ontology development has transformed from a process normally performed by one ontology engineer into a process performed collaboratively by a team of ontology engineers, who may be geographically distributed and play different roles. For example, editors may propose changes, while authoritative users approve or reject them following a well-defined process. This process, however, has only been partially addressed by existing ontology development methods, methodologies, and tool support. Furthermore, in a distributed environment where ontology editors may be working on local copies of the same ontology, strategies should be in place to ensure that changes in one copy are reflected in all of them. In this paper, we propose a workflow-based model for the collaborative development of ontologies in distributed environments and describe the components required to support it. We illustrate our model with a test case in the fishery domain from the United Nations Food and Agriculture Organisation (FAO).
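    The editorial workflow described above - editors propose changes, authoritative users approve or reject them - can be sketched as a small state machine. This is an illustrative sketch only; the role names, states, and example change are assumptions, not the paper's implementation.

```python
class ChangeProposal:
    """An ontology change moving through an editorial workflow."""
    def __init__(self, description, author):
        self.description, self.author = description, author
        self.state = "proposed"

    def review(self, reviewer_role, approve):
        # Only authoritative users may move a proposal out of "proposed".
        if reviewer_role != "authoritative":
            raise PermissionError("only authoritative users can review")
        self.state = "approved" if approve else "rejected"

# An editor proposes a change; an authoritative user approves it.
change = ChangeProposal("add class FishingVessel", author="editor_1")
change.review("authoritative", approve=True)
assert change.state == "approved"
```

    Synchronising approved changes across the distributed local copies is the second half of the problem, which the paper's components address on top of this kind of state tracking.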