22 research outputs found

    Some Issues on Ontology Integration

    The word integration has been used with different meanings in the ontology field. This article aims to clarify the meaning of the word “integration” and to present some of the relevant work done in integration. We identify three meanings of ontology “integration”: building a new ontology by reusing (assembling, extending, specializing, or adapting) other ontologies already available; building an ontology by merging several ontologies into a single one that unifies all of them; and building an application using one or more ontologies. We discuss the different meanings of “integration”, identify the main characteristics of the three processes, and propose three words to distinguish among those meanings: integration, merge, and use.
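
    A minimal sketch of the three senses, assuming the rdflib Python library (an illustration, not the article's own material); the file names and URIs below are hypothetical placeholders.

        # Illustrative only: "merge", "integration", and "use" as three distinct
        # operations on ontologies. File names and URIs are hypothetical.
        from rdflib import Graph, URIRef
        from rdflib.namespace import OWL, RDF

        # Merge: unify several ontologies into one graph containing all their triples.
        merged = Graph()
        for path in ("vehicles.owl", "engines.owl"):       # hypothetical files
            merged.parse(path, format="xml")

        # Integration: build a new ontology that reuses an existing one (here via
        # owl:imports) and then extends or specializes it with its own terms.
        integrated = Graph()
        onto = URIRef("http://example.org/transport")      # hypothetical URI
        integrated.add((onto, RDF.type, OWL.Ontology))
        integrated.add((onto, OWL.imports, URIRef("http://example.org/vehicles")))

        # Use: an application queries an ontology without changing it.
        for cls in merged.subjects(RDF.type, OWL.Class):
            print(cls)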

    Ontological Reengineering for Reuse

    This paper presents the concept of Ontological Reengineering as the process of retrieving and transforming a conceptual model of an existing, implemented ontology into a new, more correct and more complete conceptual model, which is then reimplemented. Three activities have been identified in this process: reverse engineering, restructuring and forward engineering. The aim of Reverse Engineering is to output a possible conceptual model on the basis of the code in which the ontology is implemented. The goal of Restructuring is to reorganize this initial conceptual model into a new conceptual model, which is built bearing in mind the use of the restructured ontology by the ontology/application that reuses it. Finally, the objective of Forward Engineering is to output a new implementation of the ontology. The paper also discusses how the ontological reengineering process has been applied to the Standard-Units ontology [18], which is included in the Chemical-Elements ontology [12]. These two ontologies will be included in the Monatomic-Ions and Environmental-Pollutants ontologies.
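
    The three activities can be pictured as a pipeline. The sketch below is a schematic stand-in with a toy data model and toy rules, not the paper's method: a conceptual model is recovered from code, reorganized, and re-emitted.

        # Schematic sketch of reverse engineering -> restructuring -> forward
        # engineering over a toy ontology representation (invented for illustration).
        from dataclasses import dataclass, field

        @dataclass
        class ConceptualModel:
            concepts: dict = field(default_factory=dict)   # name -> attribute list

        def reverse_engineer(source_code: str) -> ConceptualModel:
            """Recover a candidate conceptual model from the implemented ontology.
            Toy parser: each line 'Concept: attr1, attr2' becomes one concept."""
            model = ConceptualModel()
            for line in source_code.splitlines():
                name, _, attrs = line.partition(":")
                model.concepts[name.strip()] = [a.strip() for a in attrs.split(",") if a.strip()]
            return model

        def restructure(model: ConceptualModel) -> ConceptualModel:
            """Reorganize the model with the reusing ontology/application in mind;
            dropping empty concepts stands in for real restructuring decisions."""
            return ConceptualModel({n: a for n, a in model.concepts.items() if a})

        def forward_engineer(model: ConceptualModel) -> str:
            """Re-emit an implementation from the restructured model."""
            return "\n".join(f"{n}: {', '.join(a)}" for n, a in model.concepts.items())

        new_code = forward_engineer(restructure(reverse_engineer("Unit: name, factor\nDeprecated:")))
        print(new_code)   # -> "Unit: name, factor"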

    Algebraic Algorithm Design and Local Search

    Formal, mathematically based techniques promise to play an expanding role in the development and maintenance of the software on which our technological society depends. Algebraic techniques have been applied successfully to algorithm synthesis by the use of algorithm theories and design tactics, an approach pioneered in the Kestrel Interactive Development System (KIDS). An algorithm theory formally characterizes the essential components of a family of algorithms. A design tactic is a specialized procedure for recognizing in a problem specification the structures identified in an algorithm theory and then synthesizing a program. Design tactics are hard to write, however, and much of the knowledge they use is encoded procedurally in idiosyncratic ways. Algebraic methods promise a way to represent algorithm design knowledge declaratively and uniformly. We describe a general method for performing algorithm design that is more purely algebraic than that of KIDS. This method is then applied to local search. Local search is a large and diverse class of algorithms applicable to a wide range of problems; it is both intrinsically important and representative of algorithm design as a whole. A general theory of local search is formalized to describe the basic properties common to all local search algorithms, and applied to several variants of hill climbing and simulated annealing. The general theory is then specialized to describe some more advanced local search techniques, namely tabu search and the Kernighan-Lin heuristic.
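
    The shape of such a general theory can be suggested in code: one generic loop parameterized by a neighborhood structure and an acceptance rule, of which hill climbing and simulated annealing are instantiations. This Python sketch is only an operational caricature of the paper's algebraic treatment; the names and the toy problem are invented.

        # Generic local search: the loop is fixed, the "theory" parameters vary.
        import math, random

        def local_search(initial, neighbors, cost, accept, steps=1000):
            current = best = initial
            for t in range(steps):
                candidate = random.choice(neighbors(current))
                if accept(cost(current), cost(candidate), t):
                    current = candidate
                if cost(current) < cost(best):
                    best = current
            return best

        # Hill climbing: accept only strict improvements.
        hill_climb = lambda c_cur, c_new, t: c_new < c_cur

        # Simulated annealing: also accept worsening moves with a
        # temperature-dependent probability that decays over time.
        def annealing(c_cur, c_new, t, T0=10.0, alpha=0.995):
            T = T0 * (alpha ** t)
            return c_new < c_cur or random.random() < math.exp((c_cur - c_new) / max(T, 1e-9))

        # Toy problem: minimize x^2 over the integers with +/-1 moves.
        print(local_search(50, lambda x: [x - 1, x + 1], lambda x: x * x, annealing))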

    Formal Transformations from Graphically-Based Object-Oriented Representations to Theory-Based Specifications

    Formal software specification has long been touted as a way to increase the quality and reliability of software; however, it remains an intricate, manually intensive activity. An alternative to using formal specifications is to use graphically-based, semi-formal specifications such as those used in many object-oriented specification methodologies. While semi-formal specifications are generally easier to develop and understand, they lack the rigor and precision of formal specification techniques. The basic premise of this investigation is that formal software specifications can be constructed using correctness-preserving transformations from graphically-based object-oriented representations. In this investigation, object-oriented specifications defined using Rumbaugh's Object Modeling Technique (OMT) were translated into algebraic specifications. To ensure the correct translation of graphically-based OMT specifications into their algebraic counterparts, a formal semantics for interpreting OMT specifications was derived and an algebraic model of object-orientation was developed. This model defines how object-oriented concepts are represented algebraically using an object-oriented algebraic specification language, O-SLANG. O-SLANG combines basic algebraic specification constructs with category theory operations to capture internal object class structure as well as relationships between classes. Next, formal transformations from OMT specifications to O-SLANG specifications were defined, and the feasibility of automating these transformations was demonstrated by the development of a proof-of-concept system.
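
    To give the flavor of such a transformation, the sketch below maps an OMT-style class description to an algebraic signature with observer and operation symbols. The output syntax only gestures at O-SLANG; the actual language and transformation rules are those defined in the thesis.

        # Toy OMT-class -> algebraic-signature translation (illustrative syntax only).
        def omt_class_to_spec(name, attributes, operations):
            """attributes: list of (attr, sort); operations: list of (op, arg sorts, result sort)."""
            lines = [f"spec {name} is", f"  sort {name}"]
            for attr, sort in attributes:
                lines.append(f"  op {attr} : {name} -> {sort}")        # attribute observers
            for op, args, result in operations:
                arg_sorts = ", ".join([name] + args)
                lines.append(f"  op {op} : {arg_sorts} -> {result}")   # class operations
            lines.append("end-spec")
            return "\n".join(lines)

        print(omt_class_to_spec(
            "Account",
            attributes=[("balance", "Nat")],
            operations=[("deposit", ["Nat"], "Account")],
        ))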

    Towards a formally designed and verified embedded operating system: case study using the B method

    The dramatic growth in practical applications for iris biometrics has been accompanied by relevant developments in the underlying algorithms and techniques. Along with the research focused on near-infrared images captured with subject cooperation, efforts are being made to minimize the trade-off between the quality of the captured data and the recognition accuracy in less constrained environments, where images are obtained at the visible wavelength, at increased distances, under simplified acquisition protocols and adverse lighting conditions. At a first stage, interpolation effects on the normalization process are addressed, noting their outcomes in the overall recognition error rates. Secondly, a couple of post-processing steps are added to Daugman's approach, attempting to increase its performance in the particular unconstrained environments this thesis assumes. Analyses in both the frequency and spatial domains, and finally pattern recognition methods, are applied in these efforts. This thesis embodies the study of how subject recognition can be achieved, without the subject's cooperation, making use of iris data captured at-a-distance, on-the-move, and at visible-wavelength conditions. Widely used methods designed for constrained scenarios are analyzed.
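
    The normalization step whose interpolation effects are examined is commonly Daugman's "rubber sheet" model, which unwraps the iris annulus into a fixed-size polar rectangle. A sketch with bilinear interpolation, assuming circular pupil and iris boundaries (real segmentation yields less regular contours):

        # Daugman-style rubber-sheet normalization with bilinear interpolation.
        # Assumes a grayscale image and that the annulus lies at least one pixel
        # inside the image borders.
        import numpy as np

        def normalize_iris(img, cx, cy, r_pupil, r_iris, radial=64, angular=256):
            """Unwrap the iris annulus into a radial x angular rectangle."""
            out = np.zeros((radial, angular), dtype=float)
            thetas = np.linspace(0, 2 * np.pi, angular, endpoint=False)
            for i, rfrac in enumerate(np.linspace(0, 1, radial)):
                r = r_pupil + rfrac * (r_iris - r_pupil)   # pupil -> iris boundary
                xs = cx + r * np.cos(thetas)
                ys = cy + r * np.sin(thetas)
                x0, y0 = np.floor(xs).astype(int), np.floor(ys).astype(int)
                dx, dy = xs - x0, ys - y0
                # Bilinear interpolation between the four neighboring pixels;
                # this is the step whose effect on error rates is studied.
                out[i] = (img[y0, x0] * (1 - dx) * (1 - dy)
                          + img[y0, x0 + 1] * dx * (1 - dy)
                          + img[y0 + 1, x0] * (1 - dx) * dy
                          + img[y0 + 1, x0 + 1] * dx * dy)
            return out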

    Multiagent Systems Engineering: A Methodology for Analysis and Design of Multiagent Systems

    This thesis defines a methodology for the creation of multiagent systems, the Multiagent Systems Engineering (MaSE) methodology. The methodology is a key issue in the development of any complex system, and there is currently no standard or widely used methodology in the realm of multiagent systems. MaSE covers the entire software lifecycle, starting from an initial prose specification and creating a set of formal design documents in a graphical style based on a formal syntax. The final product of MaSE is a diagram describing the deployment of a system of intelligent agents that communicate through structured conversations. MaSE was created with the intention of being supported by an automated design tool. The tool built to support MaSE, agentTool, is a multiagent system development tool for designing and synthesizing complex multiagent systems.
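
    The structured conversations MaSE designs are, in essence, finite state machines over message types. A toy sketch follows, with hypothetical states and messages rather than any diagram from the thesis:

        # A structured conversation as a finite state machine (invented example).
        CONVERSATION = {                  # state -> {message: next state}
            "start":   {"request": "waiting"},
            "waiting": {"agree": "working", "refuse": "done"},
            "working": {"inform": "done", "failure": "done"},
        }

        def step(state, message):
            """Advance the conversation; off-protocol messages are rejected."""
            try:
                return CONVERSATION[state][message]
            except KeyError:
                raise ValueError(f"message {message!r} not allowed in state {state!r}")

        s = "start"
        for msg in ("request", "agree", "inform"):
            s = step(s, msg)
        print(s)   # -> "done"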

    Towards a formally verified microkernel using the VCC verifier

    In this thesis we present a design-by-contract, modular approach to the formal verification of an industrial real-time microkernel that was not designed with formal verification in mind. The microkernel module targeted is the interrupt manager of the xLuna Real-Time Operating System (RTOS) for embedded systems, built by Critical Software S.A. The annotations were verified automatically using the Microsoft Research Verified C Compiler (VCC) tool to reason about concurrency and safety properties of the xLuna kernel. The specifications are based on Hoare-style pre- and post-conditions inlined with the real code. xLuna is a microkernel based on the RTEMS Real-Time Operating System. xLuna extends RTEMS to run a GNU/Linux Operating System, providing a runtime multitasking environment for real-time (RTEMS) and non-real-time (Linux) applications. xLuna runs in a preemptable and concurrent environment; therefore, we use VCC to reason about concurrent executions and some functional and safety properties of the xLuna microkernel. VCC is an automated verifier for concurrent C programs being developed by Microsoft Research, Redmond, USA, and the European Microsoft Innovation Center (EMIC), Aachen, Germany. VCC is being built and used for operating system verification, which makes it suitable for our verification work. Specifications were added to the xLuna code following a modular approach to the verification of a specific microkernel module, namely the Interrupt Request (IRQ) module. The VCC annotations added cover approximately 80% of the IRQ manager's C code (the remaining 20% corresponds to auxiliary functions outside the scope of our verification work). All the annotations were automatically verified and proven correct.
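
    VCC itself annotates C code with clauses such as _(requires ...) and _(ensures ...). As a loose, language-agnostic analogy, the Python sketch below attaches Hoare-style pre- and post-conditions to a hypothetical IRQ-manager operation; it illustrates the contract idea only, not VCC's verification machinery.

        # Runtime-checked contracts as a stand-in for statically verified ones.
        import functools

        def contract(requires, ensures):
            def wrap(f):
                @functools.wraps(f)
                def checked(*args):
                    assert requires(*args), "precondition violated"
                    result = f(*args)
                    assert ensures(result, *args), "postcondition violated"
                    return result
                return checked
            return wrap

        @contract(requires=lambda irq: 0 <= irq < 16,          # valid IRQ line
                  ensures=lambda r, irq: r in (True, False))   # total, returns a flag
        def mask_irq(irq):
            """Hypothetical stand-in for an IRQ-manager operation."""
            return True

        mask_irq(3)      # passes both checks
        # mask_irq(42)   # would fail the precondition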