
    Formal ontology for biomedical knowledge systems integration

    The central hypothesis of the collaboration between Language and Computing (L&C) and the Institute for Formal Ontology and Medical Information Science (IFOMIS) is that the methodology and conceptual rigor of a philosophically inspired formal ontology will greatly benefit software application ontologies. To this end LinKBase®, L&C’s ontology, which is designed to integrate and reason across various external databases simultaneously, has been subjected to the conceptual demands of IFOMIS’s Basic Formal Ontology (BFO). With this, we aim to move beyond the level of controlled vocabularies to yield an ontology with the ability to support reasoning applications.

    Ontology-assisted database integration to support natural language processing and biomedical data-mining

    Successful biomedical data mining and information extraction require a complete picture of biological phenomena such as genes, biological processes, and diseases, as these exist on different levels of granularity. To realize this goal, several freely available heterogeneous databases as well as proprietary structured datasets have to be integrated into a single global customizable scheme. We present a tool that integrates different biological data sources by mapping them to a proprietary biomedical ontology developed for the purpose of making computers understand medical natural language.

    Using ontology in query answering systems: Scenarios, requirements and challenges

    Equipped with the ultimate query answering system, computers would finally be in a position to address all our information needs in a natural way. In this paper, we describe how Language and Computing nv (L&C), a developer of ontology-based natural language understanding systems for the healthcare domain, is working towards the ultimate Question Answering (QA) System for healthcare workers. L&C’s company strategy in this area is to design, in a step-by-step fashion, the essential components of such a system, each component designed to solve one part of the total problem while at the same time reflecting well-defined needs on the part of our customers. We compare our strategy with the research roadmap proposed by the Question Answering Committee of the National Institute of Standards and Technology (NIST), paying special attention to the role of ontology.

    Ontological theory for ontological engineering: Biomedical systems information integration

    Software application ontologies have the potential to become the keystone in state-of-the-art information management techniques. It is expected that these ontologies will support the sort of reasoning power required to navigate large and complex terminologies correctly and efficiently. Yet, there is one problem in particular that continues to stand in our way. As these terminological structures increase in size and complexity, and the drive to integrate them inevitably swells, it is clear that the level of consistency required for such navigation will become correspondingly difficult to maintain. While descriptive semantic representations are certainly a necessary component of any adequate ontology-based system, so long as ontology engineers rely solely on semantic information, without a sound ontological theory informing their modeling decisions, this goal will surely remain out of reach. In this paper we describe how Language and Computing nv (L&C), along with the Institute for Formal Ontology and Medical Information Science (IFOMIS), is working towards developing and implementing just such a theory, combining the open software architecture of L&C’s LinkSuite™ with the philosophical rigor of IFOMIS’s Basic Formal Ontology. In this way we aim to move beyond the more or less simple controlled vocabularies that have dominated the industry to date.

    Change Management: The Core Task of Ontology Versioning and Evolution

    Change management, a key issue in ontology versioning and evolution, is still not fully addressed, which to some extent forms a barrier against the smooth process of ontology evolution. The key issue in supporting evolving ontologies is to distinguish and recognize the changes made during ontology evolution. Most current work on ontology versioning does not keep a record of the changes in the ontology, thus preventing the user from tracing those changes backward and forward, or at least understanding the rationale behind them. We propose an approach to capture evidence of ontology changes, keep track of them, and manage them in an engineering fashion.

    A Workflow for the Networked Ontologies Lifecycle. A Case Study in FAO of the UN

    This document shows a preliminary framework for editing networked ontologies in the context of the NeOn project. The goal is to manage, in a collaborative way, multiple networked ontologies for large-scale semantic applications. This paper shows the main concepts of the editorial workflow and several lifecycle use cases. The ontologies produced with this framework will be used by the Food and Agriculture Organization of the United Nations (FAO) in many different large applications such as the Fisheries Stock Depletion Assessment System [4]. Therefore a major goal for FAO is to have a strong and reliable ontology management system for editing the networked ontologies that applications will use as a basis. This framework for editing networked ontologies is being developed in the context of the NeOn Project. What we present here is a brief summary of the activities carried out in this project regarding user requirements and the subsequent use-case analysis.

    Trust and Privacy Solutions Based on Holistic Service Requirements

    The products and services designed for Smart Cities provide the necessary tools to improve the management of modern cities in a more efficient way. These tools need to gather citizens’ information about their activity, preferences, habits, etc., opening up the possibility of tracking them. Thus, privacy and security policies must be developed in order to satisfy and manage the legislative heterogeneity surrounding the services provided and comply with the laws of the country where they are provided. This paper presents one possible solution for managing this heterogeneity, bearing in mind that these types of networks, such as Wireless Sensor Networks, have important resource limitations. A knowledge and ontology management system is proposed to facilitate the collaboration between the business, legal and technological areas. This will ease the implementation of adequate, specific security and privacy policies for a given service. All these security and privacy policies are based on the information provided by the deployed platforms and by expert-system processing.

    Using cross-lingual information to cope with underspecification in formal ontologies

    Description logics and other formal devices are frequently used as means for preventing or detecting mistakes in ontologies. Some of these devices are also capable of inferring the existence of inter-concept relationships that have not been explicitly entered into an ontology. A prerequisite, however, is that this information can be derived from those formal definitions of concepts and relationships which are included within the ontology. In this paper, we present a novel algorithm that is able to suggest relationships among existing concepts in a formal ontology that are not derivable from such formal definitions. The algorithm exploits cross-lingual information that is implicitly present in the collection of terms used in various languages to denote the concepts and relationships at issue. By using a specific experimental design, we are able to quantify the impact of cross-lingual information in coping with underspecification in formal ontologies.

    Context-Aware Information Retrieval for Enhanced Situation Awareness

    In the coalition forces, users are increasingly challenged with the issues of information overload and correlation of information from heterogeneous sources. Users might need different pieces of information, ranging from information about a single building to the resolution strategy of a global conflict. Sometimes, the time, location and past history of information access can also shape the information needs of users. Information systems need to help users pull together data from disparate sources according to their expressed needs (as represented by system queries), as well as less specific criteria. Information consumers have varying roles, tasks/missions, goals and agendas, knowledge and background, and personal preferences. These factors can be used to shape both the execution of user queries and the form in which retrieved information is packaged. However, full automation of this daunting information aggregation and customization task is not possible with existing approaches. In this paper we present an infrastructure for context-aware information retrieval to enhance situation awareness. The infrastructure provides each user with a customized, mission-oriented system that gives access to the right information from heterogeneous sources in the context of a particular task, plan and/or mission. The approach rests on five intertwined fundamental concepts, namely Workflow, Context, Ontology, Profile and Information Aggregation. The exploitation of this knowledge, using appropriate domain ontologies, will make it feasible to provide contextual assistance in various ways to the work performed according to a user’s task-relevant information requirements. This paper formalizes these concepts and their interrelationships.

    Using philosophy to improve the coherence and interoperability of applications ontologies: A field report on the collaboration of IFOMIS and L&C

    The collaboration of Language and Computing nv (L&C) and the Institute for Formal Ontology and Medical Information Science (IFOMIS) is guided by the hypothesis that quality constraints on ontologies for software application purposes closely parallel the constraints salient to the design of sound philosophical theories. The extent of this parallel has been poorly appreciated in the informatics community, and it turns out that importing the benefits of philosophical insight and methodology into application domains yields a variety of improvements. L&C’s LinKBase® is one of the world’s largest medical domain ontologies. Its current primary use pertains to natural language processing applications, but it also supports intelligent navigation through a range of structured medical and bioinformatics information resources, such as SNOMED-CT, Swiss-Prot, and the Gene Ontology (GO). In this report we discuss how and why philosophical methods improve both the internal coherence of LinKBase®, and its capacity to serve as a translation hub, improving the interoperability of the ontologies through which it navigates.