2,871 research outputs found
Toward the automation of business process ontology generation
Semantic Business Process Management (SBPM) utilises semantic technologies (e.g., ontologies) to model and query process representations. At times such models must be reconstructed from existing textual documentation. In this scenario the automated generation of ontological models would be preferable; however, current methods and technology are not yet capable of automatically generating accurate semantic process models from textual descriptions. This research attempts to automate the process as far as possible by proposing a method that drives the transformation through the joint use of a foundational ontology and lexico-semantic analysis. The method is presented, demonstrated and evaluated. The original dataset represents 150 business activities related to the procurement processes of a case study company. As the evaluation shows, the proposed method can map the linguistic patterns of the process descriptions to semantic patterns of the foundational ontology with a high level of accuracy. However, further research is required to reduce the level of human intervention, to extend the method to recognise further patterns of the foundational ontology, and to develop a tool that assists the business process modeller in the semi-automated generation of process models.
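The kind of mapping the abstract above describes, from a lexico-semantic pattern in a process description to a semantic pattern of an ontology, can be pictured with a small sketch. The rule, pattern names, and labels below are invented for illustration and are not the paper's actual method:

```python
import re

# Hypothetical lexico-semantic rule: a business activity phrased as
# "<verb> <object>", e.g. "approve purchase order".
PATTERN = re.compile(r"^(?P<verb>\w+)\s+(?P<object>[\w ]+)$")

def to_ontology_pattern(activity: str) -> dict:
    """Map a textual activity description to a (simplified, invented)
    foundational-ontology pattern: an Activity with one participant Object."""
    m = PATTERN.match(activity.strip().lower())
    if not m:
        raise ValueError(f"unrecognised pattern: {activity!r}")
    return {
        "type": "Activity",
        "label": m["verb"],
        "participant": {"type": "Object", "label": m["object"]},
    }

result = to_ontology_pattern("approve purchase order")
assert result["label"] == "approve"
assert result["participant"]["label"] == "purchase order"
```

A real method would need many such rules plus human review of ambiguous phrasings, which is the residual manual effort the abstract refers to.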
Exploiting a perdurantist foundational ontology and graph database for semantic data integration
This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University London. The view of reality that is inherent to perdurantist philosophical ontologies, often termed four-dimensional (4D) ontologies, has not been widely adopted within the mainstream of information system design practice. However, as the closed world of enterprise systems is opened to Internet-scale Semantic Web and Open Data information sources, there is a need to better understand the semantics of both internal and external data and how they can be integrated. Philosophical foundational ontologies can help establish this understanding, and there is therefore an emerging need to research how they can be applied to the problem of semantic data integration. A prime objective of this research was thus to develop a framework through which to apply a 4D foundational ontology and a graph database to the problem of semantic data integration, and to assess the effectiveness of the approach. The research employed design science, a methodology applicable to research within information systems as it encompasses methods through which the research can be undertaken and the resultant artefacts evaluated. This methodology has a number of discrete stages: problem awareness; a core design-build-evaluate iterative cycle through which the research is conducted; and a conclusion stage. The design science research was conducted through the development of a number of artefacts, the prime being the 4D-Semantic Extract Load (4D-SETL) framework. The effectiveness of the framework was assessed by applying it to semantically interpret and integrate a number of large-scale datasets and to instantiate a prototype graph database warehouse to persist the resultant ontology.
A series of technical experiments confirmed that directly reflecting the model patterns of the 4D ontology within a prototype data warehouse proved an effective means of both structuring and semantically integrating complex datasets, and that the artefacts produced by 4D-SETL could function at scale. Through an illustrative scenario, the effectiveness of the approach is described in relation to the ability of the framework to address a number of weaknesses in current approaches. Furthermore, the major advantages of 4D-SETL are elaborated, which include the ability of the framework to combine foundational, domain and instance-level ontological models in a single coherent system that dispenses with much of the translation normally undertaken between conceptual, logical and physical data models. Additionally, adopting a perdurantist realist foundational ontology provided a clear means of establishing and maintaining the identity of physical objects as their constituent temporal and spatial parts unfold over the course of time.
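The perdurantist idea referred to above, an object keeping a single identity while its temporal parts unfold, can be sketched minimally. The class and instance names below are invented for illustration and are not part of the 4D-SETL framework:

```python
from dataclasses import dataclass, field

@dataclass
class SpatioTemporalExtent:
    """A 4D individual: either a whole life or one of its temporal parts."""
    name: str
    begins: int  # illustrative timestamps (years)
    ends: int
    temporal_parts: list = field(default_factory=list)

    def add_temporal_part(self, part: "SpatioTemporalExtent") -> None:
        # A temporal part must fall within the whole's spatio-temporal extent.
        assert self.begins <= part.begins and part.ends <= self.ends
        self.temporal_parts.append(part)

# One pump, identified once; its successive states are temporal parts of it.
pump = SpatioTemporalExtent("pump-42", begins=2010, ends=2030)
pump.add_temporal_part(SpatioTemporalExtent("pump-42-in-service", 2010, 2021))
pump.add_temporal_part(SpatioTemporalExtent("pump-42-refurbished", 2021, 2030))

assert [p.name for p in pump.temporal_parts] == [
    "pump-42-in-service", "pump-42-refurbished"]
```

The point of the pattern is that data about different life stages attaches to parts, while identity queries resolve against the single whole, which is what makes integration of records from different periods tractable.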
A Linked Data Approach to Sharing Workflows and Workflow Results
A bioinformatics analysis pipeline is often highly elaborate, due to the inherent complexity of biological systems and the variety and size of datasets. A digital equivalent of the “Materials and Methods” section in wet laboratory publications would be highly beneficial to bioinformatics, for evaluating evidence and examining data across related experiments, while introducing the potential to find associated resources and integrate them as data and services. We present initial steps towards preserving bioinformatics “materials and methods” by exploiting the workflow paradigm for capturing the design of a data analysis pipeline, and RDF to link the workflow, its component services, run-time provenance, and a personalized biological interpretation of the results. An example shows the reproduction of the unique graph of an analysis procedure, its results, provenance, and personal interpretation of a text mining experiment. It links data from Taverna, myExperiment.org, BioCatalogue.org, and ConceptWiki.org. The approach is relatively “light-weight” and unobtrusive to bioinformatics users.
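The linking idea above can be illustrated with plain (subject, predicate, object) triples. The identifiers and predicates below are invented for illustration and are not the actual vocabularies used by myExperiment or BioCatalogue:

```python
# RDF-style triples linking a workflow, its run, a service it used,
# and a personal interpretation of the results (all identifiers invented).
triples = {
    ("ex:workflow1", "rdf:type", "ex:Workflow"),
    ("ex:run1", "ex:executionOf", "ex:workflow1"),
    ("ex:run1", "ex:usedService", "ex:textMiningService"),
    ("ex:interpretation1", "ex:interprets", "ex:run1"),
}

def objects(subject: str, predicate: str) -> set:
    """Simple triple-pattern query: all objects matching (s, p, ?o)."""
    return {o for (s, p, o) in triples if s == subject and p == predicate}

# Follow provenance links from the interpretation back to the workflow:
run = objects("ex:interpretation1", "ex:interprets").pop()
workflow = objects(run, "ex:executionOf").pop()
assert workflow == "ex:workflow1"
```

In practice a library such as rdflib and standard vocabularies would replace the hand-rolled set and invented predicates, but the traversal, interpretation to run to workflow, is the same.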
Semantic interoperability: ontological unpacking of a viral conceptual model
Background. Genomics and virology are unquestionably important but complex domains, investigated by a large number of scientists. The need to facilitate and support work within these domains requires the sharing of databases, although this is often difficult because of the different ways in which data is represented across the databases. To foster semantic interoperability, models are needed that provide a deep understanding and interpretation of the concepts in a domain, so that the data can be consistently interpreted among researchers.
Results. In this research, we propose the use of conceptual models to support semantic interoperability among databases and assess their ontological clarity to support their effective use. This modeling effort is illustrated by its application to the Viral Conceptual Model (VCM) that captures and represents the sequencing of viruses, inspired by the need to understand the genomic aspects of the virus responsible for COVID-19. For achieving semantic clarity on the VCM, we leverage the “ontological unpacking” method, a process of ontological analysis that reveals the ontological foundation of the information that is represented in a conceptual model. This is accomplished by applying the stereotypes of the OntoUML ontology-driven conceptual modeling language. As a result, we propose a new OntoVCM, an ontologically grounded model, based on the initial VCM, but with guaranteed interoperability among the data sources that employ it.
Conclusions. We propose and illustrate how the unpacking of the Viral Conceptual Model resolves several issues related to semantic interoperability, the importance of which is recognized by the “I” in the FAIR principles. The research addresses conceptual uncertainty within the domain of SARS-CoV-2 data and knowledge. The method employed provides the basis for further analyses of complex models currently used in life science applications, but lacking ontological grounding, subsequently hindering the interoperability needed for scientists to progress their research.
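The core move in ontological unpacking, annotating each conceptual-model class with an OntoUML stereotype that makes its ontological nature explicit, can be sketched as follows. The class-to-stereotype assignments below are invented examples and are not taken from the actual OntoVCM:

```python
# Invented stereotype assignments for illustration only (not the OntoVCM).
# In OntoUML, "kind" and "subkind" are rigid (apply necessarily to their
# instances), while "role" is anti-rigid and relationally defined.
ontouml_stereotypes = {
    "Virus": "kind",
    "Variant": "subkind",
    "SequencedSample": "role",
}

RIGID = {"kind", "subkind"}

def is_rigid(cls: str) -> bool:
    """Whether the class's stereotype applies necessarily to its instances."""
    return ontouml_stereotypes[cls] in RIGID

assert is_rigid("Virus")
assert not is_rigid("SequencedSample")
```

Making such distinctions explicit is what lets two databases agree on whether a term names a thing, a phase of a thing, or a relational role, which is where interoperability failures typically hide.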
European standardization efforts from FAIR toward explainable-AI-ready data documentation in materials modelling
Security-critical AI applications require standardized and interoperable data and metadata documentation that makes the source data explainable-AI ready (XAIR). Within the domain of materials modelling and characterization, European initiatives have proposed a series of metadata standards and procedural recommendations that were accepted as CEN workshop agreements (CWAs): CWA 17284 MODA, CWA 17815 CHADA, and CWA 17960 ModGra. It is discussed how these standards have been ontologized, and gaps are identified as regards the epistemic grounding metadata, i.e., an annotation of data and claims by something that substantiates whether, why, and to what extent they are indeed knowledge and can be relied upon.
A Survey of Top-Level Ontologies - to inform the ontological choices for a Foundation Data Model
The Centre for Digital Built Britain has been tasked through the Digital Framework Task Group to develop an Information Management Framework (IMF) to support the development of a National Digital Twin (NDT) as set out in âThe Pathway to an Information Management Frameworkâ (Hetherington, 2020). A key component of the IMF is a Foundation Data Model (FDM),
built upon a top-level ontology (TLO), as a basis for ensuring consistent data across the NDT. This document captures the results collected from a broad survey of top-level ontologies, conducted by the IMF technical team. It focuses on the core ontological choices made in their foundations and
the pragmatic engineering consequences these have on how the ontologies can be applied and further scaled. This document will provide the basis for discussions on a suitable TLO for the FDM. It is also expected that these top-level ontologies will provide a resource whose components can be harvested and adapted for inclusion in the FDM.
The manufacturing domain ontology for simplifying interoperability of systems for contact lens manufacture
This thesis was submitted for the award of Doctor of Philosophy and was awarded by Brunel University London. The advances in manufacturing systems and automation in recent years have been vast. The mechatronic age is apparent in any highly automated manufacturing facility. There is, however, an emerging need to improve the domain model to keep pace. Much has been written regarding “big data”, the interoperability of systems, and the detrimental effects of dirty data.
This research sets out to review the current state of manufacturing domain data and the options for managing data while an organisation grows through natural growth and acquisitions. The research adopted a design science methodology to guide an iterative process, adding more data and creating a higher level of understanding through each cycle. This research uses a 4D foundational ontology to model the products and processes of a test case manufacturing organisation, creating a minimum viable ontology.
The accomplishments of this thesis are the design and building of a foundation-grounded ontological domain model for contact lenses and the contact lens manufacturing process. In the final cycles of the evaluation, the use of a philosophical foundational ontology within a legacy data conversion process is empirically demonstrated, enhancing the interoperability of systems by identifying and cleaning erroneous data. This research demonstrates data representation and modelling using space-time maps to explain both simple and complex domain data interactions within products and manufacturing processes.
Toward an Ontology for Third Generation Systems Thinking
Systems thinking is a way of making sense about the world in terms of
multilevel, nested, interacting systems, their environment, and the boundaries
between the systems and the environment. In this paper we discuss the evolution of systems thinking and consider what is needed for an ontology of the current generation of systems thinking.
- …