2,871 research outputs found

    A Linked Data Approach to Sharing Workflows and Workflow Results

    A bioinformatics analysis pipeline is often highly elaborate, due to the inherent complexity of biological systems and the variety and size of datasets. A digital equivalent of the ‘Materials and Methods’ section in wet laboratory publications would be highly beneficial to bioinformatics, for evaluating evidence and examining data across related experiments, while introducing the potential to find associated resources and integrate them as data and services. We present initial steps towards preserving bioinformatics ‘materials and methods’ by exploiting the workflow paradigm for capturing the design of a data analysis pipeline, and RDF to link the workflow, its component services, run-time provenance, and a personalized biological interpretation of the results. An example shows the reproduction of the unique graph of an analysis procedure, its results, provenance, and personal interpretation for a text mining experiment. It links data from Taverna, myExperiment.org, BioCatalogue.org, and ConceptWiki.org. The approach is relatively ‘light-weight’ and unobtrusive to bioinformatics users.
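The linked-data idea in this abstract can be pictured as RDF-style (subject, predicate, object) triples connecting a workflow run to its component service, its result, and a personal interpretation of that result. A minimal sketch follows; all URIs and property names are hypothetical placeholders, not the actual identifiers used by Taverna, myExperiment.org, BioCatalogue.org, or ConceptWiki.org.

```python
# Sketch of linking a workflow, its service, result, and interpretation as
# RDF-style (subject, predicate, object) triples. All URIs are illustrative
# placeholders, not real Taverna/myExperiment/BioCatalogue/ConceptWiki IDs.
EX = "http://example.org/"  # hypothetical namespace

triples = {
    (EX + "workflow/run-1",   EX + "usesService",       EX + "service/term-extractor"),
    (EX + "workflow/run-1",   EX + "produced",          EX + "result/gene-list"),
    (EX + "result/gene-list", EX + "hasInterpretation", EX + "note/curator-1"),
}

def objects(triples, subject, predicate):
    """All objects linked from `subject` via `predicate`."""
    return {o for s, p, o in triples if s == subject and p == predicate}

# Follow the links from the workflow run to its result, then to the
# personal interpretation attached to that result.
result, = objects(triples, EX + "workflow/run-1", EX + "produced")
note, = objects(triples, EX + "result/gene-list", EX + "hasInterpretation")
```

Because every node is a URI, the same graph can be merged with triples published by other services, which is what makes the approach ‘light-weight’ for sharing.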

    Semantic interoperability: ontological unpacking of a viral conceptual model

    Background. Genomics and virology are unquestionably important, but complex, domains being investigated by a large number of scientists. The need to facilitate and support work within these domains requires sharing of databases, although this is often difficult because of the different ways in which data is represented across the databases. To foster semantic interoperability, models are needed that provide a deep understanding and interpretation of the concepts in a domain, so that the data can be consistently interpreted among researchers. Results. In this research, we propose the use of conceptual models to support semantic interoperability among databases and assess their ontological clarity to support their effective use. This modeling effort is illustrated by its application to the Viral Conceptual Model (VCM), which captures and represents the sequencing of viruses, inspired by the need to understand the genomic aspects of the virus responsible for COVID-19. To achieve semantic clarity on the VCM, we leverage the “ontological unpacking” method, a process of ontological analysis that reveals the ontological foundation of the information represented in a conceptual model. This is accomplished by applying the stereotypes of the OntoUML ontology-driven conceptual modeling language. As a result, we propose a new OntoVCM, an ontologically grounded model based on the initial VCM, but with guaranteed interoperability among the data sources that employ it. Conclusions. We propose and illustrate how the unpacking of the Viral Conceptual Model resolves several issues related to semantic interoperability, the importance of which is recognized by the “I” in FAIR principles. The research addresses conceptual uncertainty within the domain of SARS-CoV-2 data and knowledge. The method employed provides the basis for further analyses of complex models currently used in life science applications that lack ontological grounding, a gap that hinders the interoperability needed for scientists to progress their research.
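The “ontological unpacking” step can be pictured as assigning OntoUML stereotypes (e.g. kind, role, event) to the entities of a conceptual model. The tiny sketch below is illustrative only: the entity names and stereotype assignments are assumptions for exposition, not the actual OntoVCM.

```python
# Illustrative only: annotating conceptual-model entities with OntoUML
# stereotypes, in the spirit of "ontological unpacking". Entity names and
# stereotype choices are hypothetical, not taken from the real OntoVCM.
from dataclasses import dataclass

@dataclass(frozen=True)
class Entity:
    name: str
    stereotype: str  # OntoUML stereotype, e.g. "kind", "role", "event"

unpacked = [
    Entity("Virus", "kind"),           # rigid, identity-providing type
    Entity("HostSample", "kind"),
    Entity("SequencedVirus", "role"),  # a Virus in the context of sequencing
    Entity("SequencingRun", "event"),  # the experiment that yields a sequence
]

# Roles are anti-rigid: an instance can enter and leave them, which is the
# kind of distinction the unpacking makes explicit for interoperability.
roles = [e.name for e in unpacked if e.stereotype == "role"]
```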

    European standardization efforts from FAIR toward explainable-AI-ready data documentation in materials modelling

    Security-critical AI applications require standardized and interoperable data and metadata documentation that makes the source data explainable-AI ready (XAIR). Within the domain of materials modelling and characterization, European initiatives have proposed a series of metadata standards and procedural recommendations that were accepted as CEN workshop agreements (CWAs): CWA 17284 MODA, CWA 17815 CHADA, and CWA 17960 ModGra. We discuss how these standards have been ontologized, and identify gaps regarding epistemic grounding metadata, i.e., annotations of data and claims by something that substantiates whether, why, and to what extent they are indeed knowledge and can be relied upon.

    Toward an Ontology for Third Generation Systems Thinking

    Systems thinking is a way of making sense of the world in terms of multilevel, nested, interacting systems, their environment, and the boundaries between the systems and the environment. In this paper we trace the evolution of systems thinking and discuss what is needed for an ontology of its current generation.

    Extending and encoding existing biological terminologies and datasets for use in the reasoned semantic web

