    Quantitative assessment of concept maps for conceptualizing domain ontologies: a case of Quran

    The use of graphical knowledge representation formalisms, with an agreed representational vocabulary for conceptualizing the universe of discourse, is a promising approach in ontology engineering and knowledge management. Concept maps were initially used in education and learning, and later became popular in other areas owing to their flexible and intuitive nature; they have also proven useful for improving communication in corporate environments. In the field of ontologies, concept maps have been explored as a means to facilitate different aspects of ontology development. An essential reason behind this motivation is the structural resemblance of concept maps to the hierarchical structure of ontologies. This research aims to quantitatively evaluate four hypotheses related to the effectiveness of using concept maps for ontology conceptualization. The domain of the Quran was selected for this study, which was conducted in collaboration with experts from the Centre of Quranic Research, Universiti Malaya, Kuala Lumpur, Malaysia. The results demonstrated that concept mapping was easy to learn and implement for the majority of the participants. Most of them experienced an improvement in domain knowledge regarding the vocabulary used to describe the organizational structure of the Quran, namely Juz, Surah, Ayat, tafsir, Malay translation, English translation, and the relationships among these entities. Concept maps therefore instilled an element of learning in the conceptualization process and provided a platform for participants to immediately resolve conflicting opinions and ambiguities in the terms used.
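    As an illustration only (not taken from the paper), the sketch below shows how the structural vocabulary mentioned above (Juz, Surah, Ayat, tafsir, translations) might be encoded as a small ontology fragment once concept mapping is complete. The namespace URI and property names are assumptions made for the example, and rdflib is used purely as one possible tooling choice.

```python
# Minimal sketch: turning the concept-map vocabulary into an RDFS fragment.
# Namespace and property names are illustrative assumptions, not the paper's ontology.
from rdflib import Graph, Namespace, RDF, RDFS

QUR = Namespace("http://example.org/quran#")  # hypothetical namespace
g = Graph()
g.bind("qur", QUR)

# Core classes derived from the concept map
for cls in ("Juz", "Surah", "Ayat", "Tafsir", "Translation"):
    g.add((QUR[cls], RDF.type, RDFS.Class))

# Structural relationships among the entities
g.add((QUR.hasSurah, RDF.type, RDF.Property))
g.add((QUR.hasSurah, RDFS.domain, QUR.Juz))
g.add((QUR.hasSurah, RDFS.range, QUR.Surah))

g.add((QUR.hasAyat, RDF.type, RDF.Property))
g.add((QUR.hasAyat, RDFS.domain, QUR.Surah))
g.add((QUR.hasAyat, RDFS.range, QUR.Ayat))

# Tafsir and translations attach to individual Ayat
g.add((QUR.hasTafsir, RDFS.domain, QUR.Ayat))
g.add((QUR.hasTafsir, RDFS.range, QUR.Tafsir))
g.add((QUR.hasTranslation, RDFS.domain, QUR.Ayat))
g.add((QUR.hasTranslation, RDFS.range, QUR.Translation))

print(g.serialize(format="turtle"))
```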

    Providing for Perspectives: The Role of Discourse Analysis in Ontology Concept Formulation and Development

    Concept formulation and ontology development are difficult to achieve in complex social settings. Previously, we proposed and illustrated a method for developing an ontology based on grounded theory, whereby the ontology is linked to the social processes involved. Further, distinct actors in the social setting assume perspectives that are often fundamentally different from each other. We have previously argued that perspectivism is a cogent theoretical explanation for the different emergent ontologies. However, a rigorous method for analysing the text in order to identify these perspectives has been lacking. In this paper we propose identifying perspectives by using discourse analysis to bridge between term identification and the clarification of perspectives. We have found that discourse analysis provides the structure and rigour required to establish the presence of perspectives, and that actors use metaphors and the genre of historical stories to bridge between, or link with, other perspectives. It is likely that identifying perspectives, and the role of language in linking them, will produce an ontological modularity that is true to the social setting.

    The relevance of perspectivism to the task of modularisation in ontology development

    An ontology development methodology seeks to provide developers with established principles, processes, practices, methods and activities for developing ontologies (Gasevic et al., 2009). Diverse methodologies have been published for the development of ontologies, and they have evolved from the varied experiences of the researchers, practitioners and development teams who surveyed the benefits and shortcomings of the available methodologies in order to determine their applicability to particular contexts. An evaluation of existing ontology development methodologies has identified that the concept formulation process is not well defined, nor based on rigorous processes (Castro et al., 2006; Winters & Tolk, 2009). For the social realism of the actors in a social setting to be validly captured, the perspectives of each actor need to be acknowledged and incorporated into the concept formulation process and framework. This paper demonstrates how consideration of perspectivism leads to a meaningful modularisation of the resultant ontology.

    Ontologies for Bioinformatics

    The past twenty years have witnessed an explosion of biological data in diverse database formats governed by heterogeneous infrastructures. Not only do semantics (attribute terms) differ in meaning across databases, but their organization also varies widely. Ontologies are a concept imported from computing science to describe the different conceptual frameworks that guide the collection, organization and publication of biological data. An ontology is similar to a paradigm but has very strict implications for formatting and meaning in a computational context. The use of ontologies is a means of communicating and resolving semantic and organizational differences between biological databases in order to enhance their integration. The purpose of interoperability (sharing across divergent storage and semantic protocols) is to allow scientists from around the world to share data and communicate with each other. This paper describes the rapid accumulation of biological data, its various organizational structures, and the role that ontologies play in interoperability.
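    As a hedged illustration of the interoperability idea described above (not code from the paper), the sketch below shows how two databases that store the same attribute under different local terms could be reconciled by mapping each term to a shared ontology identifier. All database names, term names and ontology IDs are hypothetical.

```python
# Minimal sketch: ontology-mediated integration of two heterogeneous records.
# Local vocabularies of two hypothetical databases, each mapped to shared
# (hypothetical) ontology term identifiers.
db_a_terms = {"gene_symbol": "ONT:0001", "organism": "ONT:0002"}
db_b_terms = {"GeneName": "ONT:0001", "species": "ONT:0002"}

def integrate(record_a: dict, record_b: dict) -> dict:
    """Merge two records by re-keying each field to its shared ontology ID."""
    merged: dict = {}
    for local_term, value in record_a.items():
        merged.setdefault(db_a_terms[local_term], set()).add(value)
    for local_term, value in record_b.items():
        merged.setdefault(db_b_terms[local_term], set()).add(value)
    return merged

# Two records that mean the same thing but use different attribute terms
print(integrate({"gene_symbol": "TP53", "organism": "Homo sapiens"},
                {"GeneName": "TP53", "species": "Homo sapiens"}))
# -> {'ONT:0001': {'TP53'}, 'ONT:0002': {'Homo sapiens'}}
```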

    Informing epidemic (research) responses in a timely fashion by knowledge management - a Zika virus use case

    The response of pathophysiological research to emerging epidemics often occurs after the epidemic has passed and, as a consequence, has little to no impact on improving patient outcomes or on developing high-quality evidence to inform clinical management strategies during the epidemic. Rapid and informed guidance of epidemic (research) responses to severe infectious disease outbreaks requires quick compilation and integration of existing pathophysiological knowledge. As a case study we chose the Zika virus (ZIKV) outbreak that started in 2015 to develop a proof-of-concept knowledge repository. To extract data from available sources and build a computationally tractable and comprehensive molecular interaction map, we applied generic knowledge management software for literature mining, expert knowledge curation, data integration, reporting and visualization. A multi-disciplinary team of experts, including clinicians, virologists, bioinformaticians and knowledge management specialists, followed a pre-defined workflow for rapid integration and evaluation of available evidence. While conventional approaches usually require months to comb through the existing literature, the initial ZIKV KnowledgeBase (ZIKA KB) was completed within a few weeks. Recently we updated the ZIKA KB with additional curated data from the large amount of literature published since 2016 and made it publicly available through a web interface, together with a step-by-step guide to ensure reproducibility of the described use case. In addition, a detailed online user manual is provided to enable the ZIKV research community to generate hypotheses, share knowledge, identify knowledge gaps, and interactively explore and interpret data. A workflow for rapid response during outbreaks was generated, validated and refined, and is also made available. The process described here can be used for the timely structuring of pathophysiological knowledge for future threats. The resulting structured biological knowledge is a helpful tool for computational data analysis and the generation of predictive models, and it opens new avenues for infectious disease research. The ZIKV Knowledgebase is available at www.zikaknowledgebase.eu.
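    The following sketch is not the ZIKA KB implementation; it only illustrates, under assumed placeholder names, how curated literature statements can be held as a computationally tractable molecular interaction map and then queried, which is the kind of structure the abstract describes.

```python
# Minimal sketch: curated statements stored as a directed interaction graph.
# Node names, interaction types and PMIDs are hypothetical placeholders,
# not curated facts from the ZIKA KB.
import networkx as nx

kb = nx.DiGraph()

# Each curated statement becomes a typed, provenance-tagged edge
statements = [
    ("ViralProteinX", "HostFactorY", {"interaction": "binds", "pmid": "PMID:0000000"}),
    ("HostFactorY", "PathwayZ", {"interaction": "activates", "pmid": "PMID:0000001"}),
]
for source, target, attrs in statements:
    kb.add_edge(source, target, **attrs)

# The graph can then be queried, e.g. to trace how a viral protein may reach
# a downstream pathway through intermediate host factors
print(nx.shortest_path(kb, "ViralProteinX", "PathwayZ"))
# -> ['ViralProteinX', 'HostFactorY', 'PathwayZ']
```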

    Developing Ontologies within Decentralized Settings

    This chapter addresses two research questions: “How should a well-engineered methodology facilitate the development of ontologies within communities of practice?” and “What methodology should be used?” If ontologies are to be developed by communities, then the ontology development life cycle should be better understood within this context. This chapter presents the Melting Point (MP), a proposed new methodology for developing ontologies within decentralised settings. It describes how MP was developed by taking best practices from other methodologies, provides details on recommended steps and processes, and compares MP with alternatives. The methodology presented here is the product of direct first-hand experience and observation of biological communities of practice in which some of the authors have been involved. The Melting Point is a methodology engineered for decentralised communities of practice in which the designers of the technology and its users may be the same group. As such, MP provides a potential foundation for the establishment of standard practices for ontology engineering.

    The use of concept maps during knowledge elicitation in ontology development processes - the nutrigenomics use case

    Background: Incorporation of ontologies into annotations has enabled 'semantic integration' of complex data, making explicit the knowledge within a certain field. One of the major bottlenecks in developing bio-ontologies is the lack of a unified methodology. Different methodologies have been proposed for different scenarios, but there is no agreed-upon standard methodology for building ontologies. The involvement of geographically distributed domain experts, the need for domain experts to lead the design process, the application of the ontologies, and the life cycles of bio-ontologies are amongst the features not considered by previously proposed methodologies.

    Doctor of Philosophy

    Air medical transport (AMT) is a complex process that requires coordination of aircraft and highly skilled professionals to transport critically ill patients to definitive care. To achieve optimal performance, medical transport services employ quality and safety management systems (QSMS) to report errors and evaluate performance. Unfortunately, there are no standards for classifying miscommunication in these systems. A thoughtfully developed ontology, based upon theoretical models, provides the foundation within a QSMS for reporting communication errors and standardizing analysis. This research used a mixed-methods, pre-post design, with four distinct studies to analyze communication at the Life Flight AMT service. Study 1 was a qualitative study of communication and miscommunication. Study 2 (pre) was a quantitative study measuring communication errors in reports to the QSMS. Study 3 developed a new communication ontology for the QSMS to improve reporting and analysis of communication errors. Study 4 (post) implemented the new ontology and evaluated its performance for analyzing communication errors in the QSMS. Study 1 showed that communication in this AMT service is a complex process that may require more than 28 communication interactions between 10 or more people and utilize as many as 6 different communication technologies. Omissions of information were the most frequent communication errors described. Study 2 revealed that Life Flight's ontology in their QSMS was inadequate for measuring communication errors. Two hundred seventy-eight event reports were reviewed from the QSMS, with 58 (21%) having evidence of a communication error during transport. Of those 58 reports, only 18 (31%) could be retrieved by a simple query. A new, theory-based communication ontology was developed in Study 3. Study 4 showed that the new communication ontology more than doubled the ability to retrieve reports with communication errors by simple query of the QSMS (71%). Furthermore, analysis showed that 50% of communication errors occurred at the initial phase of transport. The most frequent errors were information not being forwarded to key persons (37%). This research provided the foundation for describing and measuring communication errors in an AMT service. Further research is needed to identify strategies that will improve information distribution between persons involved with patient transport.
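    To illustrate why ontology-based coding can improve simple retrieval from a QSMS, the sketch below contrasts free-text narratives with reports coded against explicit communication-error categories. The category names and example reports are hypothetical and are not the ontology developed in Study 3.

```python
# Minimal sketch: reports coded against a shared set of communication-error
# categories can be pulled with a single-field query, whereas uncoded
# narratives require ad hoc keyword matching. All values are hypothetical.
COMMUNICATION_ERROR_TYPES = {
    "omission",          # information not forwarded to key persons
    "wrong_recipient",
    "delayed_handoff",
}

reports = [
    {"id": 1, "narrative": "Receiving team not told of ventilator settings.",
     "codes": {"omission"}},
    {"id": 2, "narrative": "Weather delay; no communication issue reported.",
     "codes": set()},
]

def query_by_code(reports: list, code: str) -> list:
    """Single-field query made possible by ontology-based coding."""
    return [r["id"] for r in reports if code in r["codes"]]

print(query_by_code(reports, "omission"))  # -> [1]
```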

    A novel and validated agile Ontology Engineering methodology for the development of ontology-based applications

    The goal of this Thesis is to investigate the state of Ontology Engineering, underlining the main issues that still characterize the discipline. Among these issues, the problem of reconciling macro-level methodologies with authoring techniques is pivotal in supporting novice ontology engineers. The latest approaches to ontology engineering methodology leverage the agile paradigm to support collaborative ontology development and deliver efficient ontologies. However, the support these methodologies actually provide, and the efficiency of the ontologies they deliver, have not yet been investigated. This work therefore proposes a novel framework for the analysis of agile methodologies, with the objective of identifying the strong points and the limitations of each. Building on the findings of this analysis, the Thesis introduces a novel agile methodology, AgiSCOnt, aimed at tackling some of the key issues characterizing Ontology Engineering and the weaknesses identified in existing agile approaches. The new methodology is then put to the test as it is adopted for the development of two new domain ontologies in the field of health: the first is dedicated to patients struggling with dysphagia, while the second addresses patients affected by chronic obstructive pulmonary disease.