    From Affective Science to Psychiatric Disorder: Ontology as Semantic Bridge

    Advances in emotion and affective science have yet to translate routinely into psychiatric research and practice. This is unfortunate, since emotion and affect are fundamental components of many psychiatric conditions. Rectifying this lack of interdisciplinary integration is thus a potential avenue for improving psychiatric diagnosis and treatment. In this contribution, we propose and discuss an ontological framework for explicitly capturing the complex interrelations between affective entities and psychiatric disorders, in order to facilitate mapping and integration between affective science and psychiatric diagnostics. We build on and enhance the categorisation of emotion, affect and mood within the previously developed Emotion Ontology, and that of psychiatric disorders in the Mental Disease Ontology. This effort further draws on developments in formal ontology regarding the distinction between normal and abnormal in order to formalize the interconnections. The resulting operational semantic framework is relevant for applications including the clarification of psychiatric diagnostic categories, clinical information systems, and the integration and translation of research results across disciplines.
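
    In the spirit of the bridging framework described above, a minimal rdflib sketch is given below; the namespace, class names, and the characterisedByAbnormal property are illustrative placeholders invented for this sketch, not actual terms from the Emotion Ontology or the Mental Disease Ontology.

        # A minimal sketch (not the authors' actual axioms): linking an affective
        # entity to a psychiatric disorder via a hypothetical bridging relation.
        from rdflib import Graph, Literal, Namespace
        from rdflib.namespace import OWL, RDF, RDFS

        EX = Namespace("http://example.org/affect-bridge#")  # placeholder namespace

        g = Graph()
        g.bind("ex", EX)

        # Placeholder classes standing in for Emotion Ontology / Mental Disease Ontology terms.
        for cls, label in [(EX.Sadness, "sadness (affective entity)"),
                           (EX.MajorDepressiveDisorder, "major depressive disorder")]:
            g.add((cls, RDF.type, OWL.Class))
            g.add((cls, RDFS.label, Literal(label)))

        # Hypothetical relation: the disorder is characterised by an abnormal
        # (e.g. abnormally prolonged or intense) affective process.
        g.add((EX.characterisedByAbnormal, RDF.type, OWL.ObjectProperty))
        g.add((EX.MajorDepressiveDisorder, EX.characterisedByAbnormal, EX.Sadness))

        print(g.serialize(format="turtle"))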

    Semantically intelligent semi-automated ontology integration

    An ontology is a means of categorizing and storing information. Web ontologies help in retrieving precise, relevant information over the web. However, heterogeneity may arise when multiple ontologies of the same domain are used. Ontology integration addresses this heterogeneity problem and, more broadly, the problem of interoperability in knowledge-based systems: it provides a mechanism for finding the semantic associations between a pair of reference ontologies based on their concepts. Many researchers have worked on ontology integration; however, several related issues remain unaddressed. This dissertation investigates the ontology integration problem and proposes an enhanced, layer-based framework as a solution. In the concept matching process, the concepts of the reference ontologies are compared on the basis of their semantics as well as their syntax. The semantic relationships of a concept with other concepts across the ontologies, and user confirmation (requested only for problematic cases), are also taken into account in this process. The proposed framework is implemented and validated by comparing the proposed concept matching technique with existing techniques on a set of test case scenarios. The results of the experiments demonstrate the efficacy of the proposed framework.
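
    The concept matching step described above could look roughly as follows. This is a minimal Python sketch assuming a hand-made synonym table and arbitrary thresholds; it is not the dissertation's actual matching algorithm.

        # Sketch: combine syntactic and (crude) semantic similarity, and defer
        # borderline cases to the user, as the framework proposes.
        from difflib import SequenceMatcher

        SYNONYMS = {("car", "automobile"), ("author", "writer")}  # illustrative only

        def syntactic_similarity(a: str, b: str) -> float:
            """String-level similarity in [0, 1]."""
            return SequenceMatcher(None, a.lower(), b.lower()).ratio()

        def semantic_similarity(a: str, b: str) -> float:
            """Stand-in for a semantic measure: 1.0 for known synonym pairs."""
            pair = tuple(sorted((a.lower(), b.lower())))
            return 1.0 if pair in {tuple(sorted(p)) for p in SYNONYMS} else 0.0

        def match_concepts(a: str, b: str, accept: float = 0.8, review: float = 0.5):
            """Combine both scores; flag problematic cases for user confirmation."""
            score = max(syntactic_similarity(a, b), semantic_similarity(a, b))
            if score >= accept:
                return "match", score
            if score >= review:
                return "ask-user", score   # only problematic cases reach the user
            return "no-match", score

        print(match_concepts("Car", "Automobile"))    # ('match', 1.0)
        print(match_concepts("Vehicle", "Vehicles"))  # matched on syntax alone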

    LoLa: a modular ontology of logics, languages and translations

    The Distributed Ontology Language (DOL), currently being standardised within the OntoIOp (Ontology Integration and Interoperability) activity of ISO/TC 37/SC 3, aims at providing a unified framework for (i) ontologies formalised in heterogeneous logics, (ii) modular ontologies, (iii) links between ontologies, and (iv) annotation of ontologies. This paper focuses on the LoLa ontology, which formally describes DOL's vocabulary for logics, ontology languages (and their serialisations), as well as logic translations. Interestingly, to adequately formalise the logical relationships between these notions, LoLa itself needs to be axiomatised heterogeneously, a task for which we choose DOL. Namely, we use RDF for ABox assertions; OWL for the basic axiomatisation of the various modules concerning logics, languages, and translations; FOL for capturing certain closure rules that are not expressible in OWL (for the sake of tool availability, it is still helpful not to map everything to FOL); and circumscription for minimising the extension of concepts describing default translations.
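
    As a rough illustration of the RDF (ABox) part, the sketch below asserts two logics and a translation between them with rdflib; the lola: namespace and term names are placeholders rather than LoLa's published IRIs, and the FOL closure rules and circumscription modules are only hinted at in comments.

        # Sketch of LoLa-style ABox assertions in RDF (placeholder IRIs).
        from rdflib import Graph, Namespace
        from rdflib.namespace import RDF

        LOLA = Namespace("http://example.org/lola#")  # not the published namespace

        g = Graph()
        g.bind("lola", LOLA)

        # Individuals: two logics and a default translation between them.
        g.add((LOLA.OWL, RDF.type, LOLA.Logic))
        g.add((LOLA.FOL, RDF.type, LOLA.Logic))
        g.add((LOLA.OWL2FOL, RDF.type, LOLA.Translation))
        g.add((LOLA.OWL2FOL, LOLA.source, LOLA.OWL))
        g.add((LOLA.OWL2FOL, LOLA.target, LOLA.FOL))

        # Closure rules (e.g. composition of translations) and the minimisation of
        # default translations belong to the FOL and circumscription modules,
        # which plain RDF/OWL cannot express.
        print(g.serialize(format="turtle"))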

    The Distributed Ontology Language (DOL): Use Cases, Syntax, and Extensibility

    The Distributed Ontology Language (DOL) is currently being standardized within the OntoIOp (Ontology Integration and Interoperability) activity of ISO/TC 37/SC 3. It aims at providing a unified framework for (1) ontologies formalized in heterogeneous logics, (2) modular ontologies, (3) links between ontologies, and (4) annotation of ontologies. This paper presents the current state of DOL's standardization. It focuses on use cases where distributed ontologies enable interoperability and reusability. We demonstrate relevant features of the DOL syntax and semantics and explain how these integrate into existing knowledge engineering environments. Comment: Terminology and Knowledge Engineering Conference (TKE), 2012-06-20 to 2012-06-21, Madrid, Spain.
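
    One of the interoperability use cases, linking two ontologies so that their vocabularies can be used together, can be imitated in a few lines of rdflib. The toy ontologies and the single owl:equivalentClass alignment below are illustrative only; DOL expresses such links in its own syntax rather than in Python.

        # Sketch: merge two toy ontologies and align them with an equivalence link.
        from rdflib import Graph, Namespace
        from rdflib.namespace import OWL, RDF

        A = Namespace("http://example.org/ontoA#")  # placeholder ontologies
        B = Namespace("http://example.org/ontoB#")

        onto_a, onto_b = Graph(), Graph()
        onto_a.add((A.Person, RDF.type, OWL.Class))
        onto_b.add((B.Human, RDF.type, OWL.Class))

        # Combine both ontologies and add an alignment axiom linking them.
        combined = Graph()
        for source in (onto_a, onto_b):
            for triple in source:
                combined.add(triple)
        combined.add((A.Person, OWL.equivalentClass, B.Human))

        for s, _, o in combined.triples((None, OWL.equivalentClass, None)):
            print(s, "is aligned with", o)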

    A pattern-based approach to a cell tracking ontology

    Time-lapse microscopy has thoroughly transformed our understanding of biological motion and developmental dynamics, from single cells to entire organisms. The increasing amount of cell tracking data demands tools that make the extracted data searchable and interoperable across experiments and data types. To address that problem, the current paper reports on progress in building the Cell Tracking Ontology (CTO): an ontology framework for describing, querying and integrating data from complementary experimental techniques in the domain of cell tracking experiments. CTO is based on a basic knowledge structure, the cellular genealogy, which serves as a backbone model for integrating specific biological ontologies into tracking data. As a first step we integrate the Phenotype and Trait Ontology (PATO), one of the most relevant ontologies for annotating cell tracking experiments. The CTO requires both the integration of data at various levels of generality and the proper structuring of the collected information. Therefore, to provide a sound foundation for the ontology, we have built on the rich body of work on top-level ontologies and established three generic ontology design patterns addressing three modeling challenges in representing cellular genealogies: representing entities that exist in time, that undergo changes over time, and that are organized into more complex structures such as situations.
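
    A minimal sketch of a cellular genealogy as a backbone data structure is shown below, using plain Python classes; the class and field names are hypothetical, and the CTO's actual design patterns (entities existing in time, change over time, situations) are considerably richer than this.

        # Sketch: a tracked cell that exists over a frame interval, divides into
        # daughters, and carries quality annotations (e.g. PATO terms).
        from dataclasses import dataclass, field
        from typing import List, Optional

        @dataclass
        class TrackedCell:
            cell_id: str
            start_frame: int                      # an entity existing in time
            end_frame: Optional[int] = None
            qualities: List[str] = field(default_factory=list)   # PATO labels/IDs would go here
            daughters: List["TrackedCell"] = field(default_factory=list)

            def divide(self, frame: int, left_id: str, right_id: str) -> None:
                """Record a division event: the mother ends, two daughters begin."""
                self.end_frame = frame
                self.daughters = [TrackedCell(left_id, frame), TrackedCell(right_id, frame)]

            def lineage(self):
                """Yield this cell and all of its descendants (the genealogy)."""
                yield self
                for d in self.daughters:
                    yield from d.lineage()

        root = TrackedCell("cell-1", start_frame=0, qualities=["elongated"])
        root.divide(frame=42, left_id="cell-1.1", right_id="cell-1.2")
        print([c.cell_id for c in root.lineage()])   # ['cell-1', 'cell-1.1', 'cell-1.2']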

    TGF-beta signaling proteins and the Protein Ontology

    The Protein Ontology (PRO) is designed as a formal and principled Open Biomedical Ontologies (OBO) Foundry ontology for proteins. The components of PRO extend from a classification of proteins on the basis of evolutionary relationships at the homeomorphic level to the representation of the multiple protein forms of a gene, including those resulting from alternative splicing, cleavage and/or posttranslational modifications. Focusing specifically on the TGF-beta signaling proteins, we describe the building, curation, usage and dissemination of PRO. PRO provides a framework for the formal representation of protein classes and protein forms in the OBO Foundry. It is designed to enable data retrieval, integration and machine reasoning at the molecular level of proteins, thereby facilitating cross-species comparisons, pathway analysis, disease modeling and the generation of new hypotheses.
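
    The layering from a gene-level protein class down to a spliced isoform and a modified form can be mimicked as a small subclass hierarchy; the sketch below uses rdflib with placeholder IRIs and labels rather than real PR: accessions or PRO's actual axioms.

        # Sketch of PRO-style layering: gene-level class > isoform > modified form.
        from rdflib import Graph, Literal, Namespace
        from rdflib.namespace import OWL, RDF, RDFS

        PRX = Namespace("http://example.org/pro-sketch#")  # placeholder namespace

        g = Graph()
        g.bind("prx", PRX)

        levels = [
            (PRX.TGFBeta1, "TGF-beta 1 (gene-level protein class)", None),
            (PRX.TGFBeta1_Isoform1, "TGF-beta 1 isoform 1 (splice form)", PRX.TGFBeta1),
            (PRX.TGFBeta1_Isoform1_Phos,
             "phosphorylated TGF-beta 1 isoform 1 (modified form)", PRX.TGFBeta1_Isoform1),
        ]
        for cls, label, parent in levels:
            g.add((cls, RDF.type, OWL.Class))
            g.add((cls, RDFS.label, Literal(label)))
            if parent is not None:
                g.add((cls, RDFS.subClassOf, parent))

        print(g.serialize(format="turtle"))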

    Context modeling and constraints binding in web service business processes

    Context awareness is a principle used in pervasive services applications to enhance their flexibility and adaptability to changing conditions and dynamic environments. Ontologies provide a suitable framework for context modeling and reasoning. We develop a context model for executable business processes, captured as an ontology for the web services domain. A web service description is attached to a service context profile, which is bound to the context ontology. Context instances can be generated dynamically at service runtime and are bound to context constraint services. Constraint services facilitate both the setting up of constraint properties and constraint checkers, which determine the dynamic validity of context instances. Data collectors focus on capturing context instances. The runtime integration of both constraint services and data collectors permits the business process to achieve dynamic business goals.
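
    A minimal Python sketch of the runtime constraint binding idea follows, assuming a context instance is a plain dictionary and a constraint checker is a boolean predicate; the class and function names are hypothetical, not the paper's actual service interfaces.

        # Sketch: constraint services hold checkers that decide the dynamic
        # validity of context instances captured by a data collector at runtime.
        from typing import Callable, Dict, List

        ContextInstance = Dict[str, object]
        ConstraintChecker = Callable[[ContextInstance], bool]

        class ConstraintService:
            """Holds constraint properties and checkers for one service context profile."""
            def __init__(self) -> None:
                self.checkers: List[ConstraintChecker] = []

            def add_checker(self, checker: ConstraintChecker) -> None:
                self.checkers.append(checker)

            def is_valid(self, ctx: ContextInstance) -> bool:
                """Determine the dynamic validity of a context instance."""
                return all(check(ctx) for check in self.checkers)

        def collect_context() -> ContextInstance:
            """Data collector: capture a context instance at service runtime."""
            return {"latency_ms": 120, "location": "EU"}   # illustrative values

        service = ConstraintService()
        service.add_checker(lambda ctx: ctx.get("latency_ms", 10**9) < 200)
        service.add_checker(lambda ctx: ctx.get("location") == "EU")

        print("context instance valid:", service.is_valid(collect_context()))   # True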