36,617 research outputs found

    Toward the automation of business process ontology generation

    Semantic Business Process Management (SBPM) utilises semantic technologies (e.g., ontologies) to model and query process representations. At times such models must be reconstructed from existing textual documentation. In this scenario the automated generation of ontological models would be preferable; however, current methods and technologies are not yet capable of automatically generating accurate semantic process models from textual descriptions. This research attempts to automate the process as far as possible by proposing a method that drives the transformation through the joint use of a foundational ontology and lexico-semantic analysis. The method is presented, demonstrated, and evaluated. The original dataset represents 150 business activities related to the procurement processes of a case-study company. As the evaluation shows, the proposed method can map the linguistic patterns of the process descriptions to semantic patterns of the foundational ontology with a high level of accuracy. However, further research is required to reduce the level of human intervention, to expand the method to recognise further patterns of the foundational ontology, and to develop a tool to assist the business process modeller in the semi-automated generation of process models.
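The abstract above describes mapping linguistic patterns of activity descriptions to semantic patterns of a foundational ontology. As a minimal sketch of that transformation step, the rule table and category names below are invented for illustration and are not the paper's actual method:

```python
# Hypothetical sketch: mapping lexico-semantic patterns of business-activity
# descriptions to foundational-ontology categories. The pattern rules and the
# category labels are illustrative assumptions, not the paper's method.
import re

# Assumed mapping from a leading verb to a (hypothetical) ontology pattern.
PATTERN_RULES = [
    (re.compile(r"^(create|generate|issue)\b", re.I), "Creation-Event"),
    (re.compile(r"^(send|forward|submit)\b", re.I), "Transfer-Event"),
    (re.compile(r"^(check|verify|approve)\b", re.I), "Assessment-Event"),
]

def classify_activity(description: str) -> str:
    """Return the ontology pattern matched by the activity's leading verb."""
    for pattern, category in PATTERN_RULES:
        if pattern.search(description):
            return category
    return "Unmapped"  # left for human intervention, as the abstract notes

activities = ["Create purchase order", "Send invoice to supplier", "Review contract"]
mapping = {a: classify_activity(a) for a in activities}
```

Unmatched descriptions fall through to a human reviewer, mirroring the residual manual intervention the abstract says remains necessary.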

    Improving automation standards via semantic modelling: Application to ISA88

    Standardization is essential for automation. Extensibility, scalability, and reusability are important features of automation software that rely on efficient modelling of the addressed systems. The work presented here stems from the ongoing development of a methodology for semi-automatic ontology construction from technical documents. The main aim of this work is to systematically check the consistency of technical documents and to support improvements in their consistency. The formalization of conceptual models and the subsequent writing of technical standards are analyzed together, and guidelines are proposed for application to future technical standards. Three paradigms are discussed for the development of domain ontologies from technical documents, starting from the current state of the art, continuing with the intermediate method presented and used in this paper, and ending with the paradigm suggested for the future. The ISA88 Standard is taken as a representative case study. Linguistic techniques from the semi-automatic ontology construction methodology are applied to the ISA88 Standard, and different modelling and standardization aspects that are worth sharing with the automation community are addressed. This study discusses different paradigms for developing and sharing conceptual models for the subsequent development of automation software, and presents the systematic consistency-checking method.
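One kind of inconsistency such a methodology might surface in a standard's text is circular definition, where two glossary terms are defined in terms of each other. The toy glossary and check below are invented for illustration and are not drawn from ISA88 or the paper's method:

```python
# Illustrative sketch (not the paper's actual method): flag pairs of glossary
# terms that are each defined using the other, a simple consistency defect a
# technical-document check might report. The definitions are invented.
definitions = {
    "batch": "material produced by a single execution of a recipe",
    "recipe": "the set of information that defines production of a batch",
    "unit": "equipment that carries out a procedure on a batch",
}

def circular_pairs(defs: dict) -> list:
    """Return term pairs (a, b) whose definitions mention each other."""
    pairs = []
    terms = list(defs)
    for a in terms:
        for b in terms:
            if a < b and b in defs[a].split() and a in defs[b].split():
                pairs.append((a, b))
    return pairs

issues = circular_pairs(definitions)
```

A real implementation would need tokenization, lemmatization, and multi-word term matching; the point here is only the shape of a document-level consistency rule.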

    A Logic-based Approach for Recognizing Textual Entailment Supported by Ontological Background Knowledge

    We present the architecture and the evaluation of a new system for recognizing textual entailment (RTE). In RTE we want to automatically identify the type of logical relation between two input texts; in particular, we are interested in proving the existence of an entailment between them. We conceive our system as a modular environment allowing for high-coverage syntactic and semantic text analysis combined with logical inference. For the syntactic and semantic analysis we combine a deep semantic analysis with a shallow one supported by statistical models in order to increase the quality and accuracy of the results. For RTE we use first-order logical inference employing model-theoretic techniques and automated reasoning tools. The inference is supported by problem-relevant background knowledge extracted automatically and on demand from external sources such as WordNet, YAGO, and OpenCyc, or from other, more experimental sources, e.g., manually defined presupposition resolutions or axiomatized general and common-sense knowledge. The results show that fine-grained and consistent knowledge coming from diverse sources is a necessary condition determining the correctness and traceability of results. (25 pages, 10 figures)
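The inference step described above checks whether a hypothesis follows from a text plus background knowledge. A heavily simplified sketch, assuming the texts have already been reduced to atomic facts and the background knowledge to Horn rules (the facts and rules below are invented, and the described system uses full first-order provers, not this toy forward chaining):

```python
# Toy sketch of the entailment check only, not the paper's system: saturate
# the text's facts under Horn-style background rules and test whether the
# hypothesis becomes derivable. All facts and rules are illustrative.

def forward_chain(facts, rules):
    """Close a fact set under Horn rules given as (body_tuple, head) pairs."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if set(body) <= facts and head not in facts:
                facts.add(head)
                changed = True
    return facts

# Facts extracted from the text "A cat sleeps on the mat".
text_facts = {"cat(c)", "sleeps(c)"}
# Background knowledge, e.g. a WordNet-style hypernym rendered as a rule.
rules = [
    (("cat(c)",), "animal(c)"),
    (("animal(c)", "sleeps(c)"), "rests(c)"),
]

hypothesis = "rests(c)"  # hypothesis text: "An animal rests"
entailed = hypothesis in forward_chain(text_facts, rules)
```

The example shows why the abstract stresses background knowledge: without the hypernym rule linking "cat" to "animal", the entailment is underivable.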

    The Space Object Ontology

    Achieving space domain awareness requires the identification, characterization, and tracking of space objects. Storing and leveraging associated space object data for purposes such as hostile threat assessment, object identification, and collision prediction and avoidance present further challenges. Space objects are characterized according to a variety of parameters including their identifiers, design specifications, components, subsystems, capabilities, vulnerabilities, origins, missions, orbital elements, patterns of life, processes, operational statuses, and associated persons, organizations, or nations. The Space Object Ontology provides a consensus-based realist framework for formulating such characterizations in a computable fashion. Space object data are aligned with classes and relations in the Space Object Ontology and stored in a dynamically updated Resource Description Framework triple store, which can be queried to support space domain awareness and the needs of spacecraft operators. This paper presents the core of the Space Object Ontology, discusses its advantages over other approaches to space object classification, and demonstrates its ability to combine diverse sets of data from multiple sources within an expandable framework. Finally, we show how the ontology provides benefits for enhancing and maintaining long-term space domain awareness.
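The abstract above describes aligning space object data with ontology classes in an RDF triple store and querying it. A minimal in-memory sketch of triple-pattern querying, with predicates and identifiers invented for illustration (they are not drawn from the actual Space Object Ontology):

```python
# Minimal sketch of pattern matching over subject-predicate-object triples,
# as a SPARQL engine would do against a real RDF store. All identifiers and
# predicates below are invented, not the Space Object Ontology's terms.
triples = {
    ("sat1", "rdf:type", "so:Satellite"),
    ("sat1", "so:operatedBy", "org:AgencyA"),
    ("deb7", "rdf:type", "so:Debris"),
    ("deb7", "so:originatesFrom", "sat1"),
}

def query(store, s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard."""
    return {(ts, tp, to) for ts, tp, to in store
            if s in (None, ts) and p in (None, tp) and o in (None, to)}

# All debris objects, then the satellite each piece originates from.
debris = {s for s, _, _ in query(triples, p="rdf:type", o="so:Debris")}
origins = {s: o for s, _, o in query(triples, p="so:originatesFrom")
           if s in debris}
```

Chaining two patterns this way (find debris, then follow an origin relation) is the kind of query the abstract suggests for tasks such as collision prediction and object identification.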

    Who Cares about Axiomatization? Representation, Invariance, and Formal Ontologies

    The philosophy of science of Patrick Suppes is centered on two important notions that are part of the title of his recent book (Suppes 2002): Representation and Invariance. Representation is important because when we embrace a theory we implicitly choose a way to represent the phenomenon we are studying. Invariance is important because, since invariants are the only things that are constant in a theory, in a way they give the “objective” meaning of that theory. Every scientific theory gives a representation of a class of structures and studies the invariant properties holding in that class of structures. In Suppes’ view, the best way to define this class of structures is via axiomatization. This is because a class of structures is given by a definition, and this same definition establishes which are the properties that a single structure must possess in order to belong to the class. These properties correspond to the axioms of a logical theory. In Suppes’ view, the best way to characterize a scientific structure is by giving a representation theorem for its models and singling out the invariants in the structure. Thus, we can say that the philosophy of science of Patrick Suppes consists in the application of the axiomatic method to scientific disciplines. What I want to argue in this paper is that this application of the axiomatic method is also at the basis of a new approach that is being increasingly applied to the study of computer science and information systems, namely the approach of formal ontologies. The main task of an ontology is that of making explicit the conceptual structure underlying a certain domain. By “making explicit the conceptual structure” we mean singling out the most basic entities populating the domain and writing axioms expressing the main properties of these primitives and the relations holding among them. 
So, in both cases, axiomatization is the main tool used to characterize the object of inquiry, whether that object is a scientific theory (in Suppes’ approach) or an information system (in formal ontologies). In the following section I will present Patrick Suppes’ view of the philosophy of science and the axiomatic method; in section 3 I will survey the theoretical issues underlying the work being done in formal ontologies; and in section 4 I will draw a comparison of the two approaches and explore the similarities and differences between them.

    Construct redundancy in process modelling grammars: Improving the explanatory power of ontological analysis

    Conceptual modelling supports developers and users of information systems in areas of documentation, analysis, and system redesign. The ongoing interest in the modelling of business processes has led to a variety of different grammars, raising the question of the quality of these grammars for modelling. An established way of evaluating the quality of a modelling grammar is by means of an ontological analysis, which can determine the extent to which grammars contain construct deficit, overload, excess, or redundancy. While several studies have shown the relevance of most of these criteria, predictions about construct redundancy have yielded inconsistent results in the past, with some studies suggesting that redundancy may even be beneficial for modelling in practice. In this paper we seek to contribute to clarifying the concept of construct redundancy by introducing a revision to the ontological analysis method. Based on the concept of inheritance, we propose an approach that distinguishes between specialized and distinct construct redundancy. We demonstrate the potential explanatory power of the revised method by reviewing and clarifying previous results found in the literature.
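The distinction the abstract draws can be made concrete: two grammar constructs mapping to the same ontological construct exhibit specialized redundancy when one inherits from the other, and distinct redundancy otherwise. The mappings and the subtype relation in this sketch are invented for illustration, not taken from the paper's analysis:

```python
# Hedged sketch of the paper's distinction: classify redundant pairs of
# grammar constructs as "specialized" (related by inheritance) or "distinct"
# (unrelated). The construct names and mappings below are invented.

# Grammar construct -> ontological construct it maps to (hypothetical).
mapping = {"Task": "Transformation", "Subprocess": "Transformation",
           "Event": "State-Change", "Message": "State-Change"}

# Hypothetical inheritance relation among grammar constructs (child, parent).
subtypes = {("Subprocess", "Task")}

def classify_redundancy(mapping, subtypes):
    """Label each pair of constructs sharing an ontological mapping."""
    result = {}
    constructs = list(mapping)
    for i, a in enumerate(constructs):
        for b in constructs[i + 1:]:
            if mapping[a] == mapping[b]:
                related = (a, b) in subtypes or (b, a) in subtypes
                result[(a, b)] = "specialized" if related else "distinct"
    return result

kinds = classify_redundancy(mapping, subtypes)
```

Under the paper's argument, only the "distinct" pairs would count against a grammar; the "specialized" pairs reflect deliberate refinement rather than a modelling defect.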

    Changing narratives: colonised peoples, criminology and social work

    Abstract: There is growing recognition in criminology and social work of the importance of Indigenous knowledges and methodologies. Yet to date there have been limited attempts (particularly in criminology and criminal justice social work) to consider the theoretical and practice implications of Indigenous understandings of, and approaches to, these disciplines. Both disciplines have also been slow to recognise the importance of understanding the way in which colonial effects are perpetuated through knowledge control, particularly in the operation of criminal justice systems. Our paper thus begins by examining the historical and institutional factors that have contributed to the continuing subjugation of Indigenous knowledges and methodologies. A discussion of the connections between the hegemony of Western science, the construction of race, and the colonial project follows. While Western and Indigenous approaches are conceptualised broadly herein, the dangers of over-simplifying these categories are also acknowledged. The paper proceeds by examining the distinctive character of each approach through a consideration of their ontological, epistemological, axiological, and methodological differences. Whilst acknowledging the considerable challenges which arise in any attempt to develop connections between these differing worldviews, a pathway forward for understanding, both theoretically and methodologically, the relationship between Western and Indigenous approaches is proposed.

    An improved ontological representation of dendritic cells as a paradigm for all cell types

    The Cell Ontology (CL) is designed to provide a standardized representation of cell types for data annotation. Currently, the CL employs multiple is_a relations, defining cell types in terms of histological, functional, and lineage properties, and the majority of definitions are written with sufficient generality to hold across multiple species. This approach limits the CL’s utility for cross-species data integration. To address this problem, we developed a method for the ontological representation of cells and applied this method to develop a dendritic cell ontology (DC-CL). DC-CL subtypes are delineated on the basis of surface protein expression, systematically including both species-general and species-specific types and optimizing DC-CL for the analysis of flow cytometry data. This approach brings benefits in the form of increased accuracy, support for reasoning, and interoperability with other ontology resources.

    Toward a Realistic Science of Environments

    Barry Smith, Ecological Psychology, 2009, 21 (2), April–June, 121–130. Abstract: The perceptual psychologist J. J. Gibson embraces a radically externalistic view of mind and action. For Gibson, we have not a Cartesian mind or soul, with its interior theater of contents and the consequent problem of explaining how this mind or soul and its psychological environment can succeed in grasping physical objects external to itself. Rather, we have a perceiving, acting organism whose perceptions and actions are always already tuned to the parts and moments, the things and surfaces, of its external environment. We describe how on this basis Gibson sought to develop a realist science of environments which will be ‘consistent with physics, mechanics, optics, acoustics, and chemistry’.
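The dendritic cell ontology entry above delineates subtypes by surface protein expression, as one does when gating flow cytometry data. A toy sketch of that idea, with marker panels that are invented for illustration and are not DC-CL's actual definitions:

```python
# Illustrative sketch (marker panels invented, not DC-CL's definitions):
# classify a cell by which surface proteins it does and does not express.

# Hypothetical subtype definitions: (required markers, forbidden markers).
SUBTYPES = {
    "plasmacytoid DC": ({"CD123", "CD303"}, {"CD11c"}),
    "myeloid DC": ({"CD11c", "CD1c"}, {"CD123"}),
}

def classify(markers: set) -> str:
    """Return the first subtype whose marker constraints the cell satisfies."""
    for name, (required, forbidden) in SUBTYPES.items():
        if required <= markers and not (forbidden & markers):
            return name
    return "unclassified"

cell = {"CD123", "CD303", "HLA-DR"}
subtype = classify(cell)
```

Expressing subtype definitions as explicit marker constraints is what gives an ontology like DC-CL the reasoning and interoperability benefits the abstract claims: the same constraint sets can drive both automated classification and consistency checks.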