
    Quantum picturalism for topological cluster-state computing

    Topological quantum computing is a way of allowing precise quantum computations to run on noisy and imperfect hardware. One implementation uses surface codes created by forming defects in a highly-entangled cluster state. Such a method of computing is a leading candidate for large-scale quantum computing. However, there has been a lack of sufficiently powerful high-level languages to describe computing in this form without resorting to single-qubit operations, which quickly become prohibitively complex as the system size increases. In this paper we apply the category-theoretic work of Abramsky and Coecke to the topological cluster-state model of quantum computing to give a high-level graphical language that enables direct translation between quantum processes and physical patterns of measurement in a computer - a "compiler language". We give the equivalence between the graphical and topological information flows, and show the applicable rewrite algebra for this computing model. We show that this gives us a native graphical language for the design and analysis of topological quantum algorithms, and finish by discussing the possibilities for automating this process on a large scale.
    Comment: 18 pages, 21 figures. Published in New J. Phys. special issue on topological quantum computing.
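    The rewrite algebra mentioned above is of the kind used in quantum picturalism (the Abramsky-Coecke line of work), whose best-known rule is spider fusion: adjacent nodes of the same colour merge and their phases add. The following is a minimal illustrative sketch of one such rewrite on a toy graph data structure; the node/edge representation and function names are assumptions for illustration, not the paper's encoding of topological cluster states.

```python
import math

# A toy diagram: nodes are spiders with a colour ('Z' or 'X') and a phase;
# edges are undirected wires.  This is an illustrative data structure only.
nodes = {
    1: {"colour": "Z", "phase": math.pi / 4},
    2: {"colour": "Z", "phase": math.pi / 2},
    3: {"colour": "X", "phase": 0.0},
}
edges = {(1, 2), (2, 3)}

def fuse_same_colour_spiders(nodes, edges):
    """Apply one spider-fusion rewrite: two adjacent spiders of the same
    colour merge into one spider whose phase is the sum (mod 2*pi)."""
    for (a, b) in list(edges):
        if nodes[a]["colour"] == nodes[b]["colour"]:
            nodes[a]["phase"] = (nodes[a]["phase"] + nodes[b]["phase"]) % (2 * math.pi)
            # Re-route b's remaining wires to a, then delete b.
            new_edges = set()
            for (u, v) in edges:
                if (u, v) in ((a, b), (b, a)):
                    continue
                u = a if u == b else u
                v = a if v == b else v
                if u != v:
                    new_edges.add((u, v))
            del nodes[b]
            return nodes, new_edges
    return nodes, edges

nodes, edges = fuse_same_colour_spiders(nodes, edges)
print(nodes, edges)  # spiders 1 and 2 have fused; the wire to 3 is re-routed
```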

    Graphical Reasoning in Compact Closed Categories for Quantum Computation

    Compact closed categories provide a foundational formalism for a variety of important domains, including quantum computation. These categories have a natural visualisation as a form of graphs. We present a formalism for equational reasoning about such graphs and develop this into a generic proof system with a fixed logical kernel for equational reasoning about compact closed categories. Automating this reasoning process is motivated by the slow and error-prone nature of manual graph manipulation. A salient feature of our system is that it provides a formal and declarative account of derived results that can include `ellipses'-style notation. We illustrate the framework by instantiating it for a graphical language of quantum computation and show how this can be used to perform symbolic computation.
    Comment: 21 pages, 9 figures. This is the journal version of the paper published at AIS
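    As a loose illustration of the kind of equational rewriting such a proof system automates, the sketch below matches a rule against a term and substitutes the result; it works on nested-tuple terms rather than the graphs of the paper, and all rule and term names are hypothetical.

```python
# A minimal sketch of rule-based equational rewriting (illustrative only).
def match(pattern, term, bindings=None):
    """Return variable bindings if `pattern` matches `term`, else None.
    Variables are strings starting with '?'."""
    bindings = dict(bindings or {})
    if isinstance(pattern, str) and pattern.startswith("?"):
        if pattern in bindings and bindings[pattern] != term:
            return None
        bindings[pattern] = term
        return bindings
    if isinstance(pattern, tuple) and isinstance(term, tuple) and len(pattern) == len(term):
        for p, t in zip(pattern, term):
            bindings = match(p, t, bindings)
            if bindings is None:
                return None
        return bindings
    return bindings if pattern == term else None

def substitute(template, bindings):
    """Instantiate a rule's right-hand side with the matched bindings."""
    if isinstance(template, str) and template.startswith("?"):
        return bindings[template]
    if isinstance(template, tuple):
        return tuple(substitute(t, bindings) for t in template)
    return template

def rewrite_once(term, rules):
    """Apply the first rule whose left-hand side matches at the root."""
    for lhs, rhs in rules:
        b = match(lhs, term)
        if b is not None:
            return substitute(rhs, b)
    return term

# Hypothetical rule: composing with the identity simplifies away.
rules = [(("compose", "?f", "id"), "?f")]
print(rewrite_once(("compose", "tensor_gate", "id"), rules))  # -> 'tensor_gate'
```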

    Knowledge-based support in Non-Destructive Testing for health monitoring of aircraft structures

    Maintenance manuals include general methods and procedures for industrial maintenance and contain information about the principles behind maintenance methods. In particular, Non-Destructive Testing (NDT) methods are important for the detection of aeronautical defects and can be used on various kinds of material and in different environments. Conventional non-destructive evaluation inspections are performed during periodic maintenance checks. Usually, the list of tools used in a maintenance program simply appears in the introduction of the manuals, without any detail regarding their characteristics beyond a short description of the manufacturer and the tasks in which they are employed. Improving how maintenance tools are identified is needed to manage the set of equipment and establish a system of equivalence: a consistent maintenance conceptualization is required, flexible enough to fit all current equipment, but also any likely to be added or used in the future. Our contribution concerns the formal specification of a system of functional equivalences that facilitates maintenance activities by providing the means to determine whether one tool can be substituted for another, based on the key parameters among their identified characteristics. The reasoning mechanisms of conceptual graphs constitute the baseline for measuring the fit or mismatch between an equipment model and a maintenance activity model. Graph operations are used to process answers to a query, and this graph-based approach to search is in line with the logical view of information retrieval. The methodology described supports the formalization and capitalization of the knowledge of experienced NDT practitioners. As a result, it enables the selection of an NDT technique and outlines its capabilities along with acceptable alternatives.
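    As a rough illustration of the substitution check described above (not the paper's conceptual-graph projection itself), the sketch below models an NDT activity by the key parameters it requires and a tool by the characteristics it offers; the attribute names and values are hypothetical.

```python
# Illustrative substitution check: a tool "fits" an activity if its
# characteristics satisfy every key requirement of that activity.
activity = {
    "method": "ultrasonic",
    "min_frequency_mhz": 5.0,
    "surface": "curved",
}

tools = {
    "probe_A": {"method": "ultrasonic", "max_frequency_mhz": 10.0,
                "surfaces": {"flat", "curved"}},
    "probe_B": {"method": "eddy_current", "max_frequency_mhz": 6.0,
                "surfaces": {"flat"}},
}

def fits(activity, tool):
    """Return True if the tool's key parameters cover the activity's needs."""
    return (
        tool["method"] == activity["method"]
        and tool["max_frequency_mhz"] >= activity["min_frequency_mhz"]
        and activity["surface"] in tool["surfaces"]
    )

# Functionally equivalent (substitutable) tools for this activity.
equivalents = [name for name, tool in tools.items() if fits(activity, tool)]
print(equivalents)  # -> ['probe_A']
```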

    GLC actors, artificial chemical connectomes, topological issues and knots

    Based on graphic lambda calculus, we propose a program for a new model of asynchronous distributed computing, inspired by Hewitt's Actor Model, as well as several investigation paths concerning how one may graft together lambda calculus and knot diagrammatics.

    Development of a client interface for a methodology independent object-oriented CASE tool : a thesis presented in partial fulfilment of the requirements for the degree of Master of Science in Computer Science at Massey University

    The overall aim of the research presented in this thesis is the development of a prototype CASE Tool user interface that supports the use of arbitrary methodology notations for the construction of small-scale diagrams. This research is part of the larger CASE Tool project, MOOT (Massey's Object Oriented Tool). MOOT is a meta-system with a client-server architecture that provides a framework within which the semantics and syntax of methodologies can be described. The CASE Tool user interface is implemented in Java so it is as portable as possible and has a consistent look and feel. It has been designed as a client to the rest of the MOOT system (which acts as a server). A communications protocol has been designed to support the interaction between the CASE Tool client and a MOOT server. The user interface design of MOOT must support all possible graphical notations; no assumptions can be made about the types of notation a software engineer may use. MOOT therefore provides a specification language called NDL for the definition of a methodology's syntax. Hence, the MOOT CASE Tool client described in this thesis is a shell that is parameterised by NDL specifications. The flexibility provided by such a high level of abstraction presents significant challenges in terms of designing effective human-computer interaction mechanisms for the MOOT user interface. Functional and non-functional requirements of the client user interface have been identified and applied during the construction of the prototype. A notation specification that defines the syntax for Coad and Yourdon OOA/OOD has been written in NDL and used as a test case. The thesis includes the iterative evaluation and extension of NDL resulting from the prototype development. The prototype has shown that the current approach to NDL is efficacious, and that the syntax and semantics of a methodology description can successfully be separated. The developed prototype has shown that it is possible to build a simple, non-intrusive, and efficient, yet flexible, usable, and helpful interface for meta-CASE tools. The development of the CASE Tool client, through its generic, methodology-independent design, has provided a pilot with which future ideas may be explored.
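    NDL itself is not reproduced here, but the idea of a shell parameterised by a notation specification can be sketched as follows; the specification structure, the symbol and connector names, and the DiagramShell class are hypothetical stand-ins, not MOOT's actual design.

```python
# A hypothetical, simplified analogue of parameterising a CASE-tool shell by
# a declarative notation specification: the shell knows nothing about
# Coad/Yourdon, it only interprets the spec it is given.
notation_spec = {
    "name": "Coad-Yourdon OOA (illustrative subset)",
    "symbols": {
        "Class&Object": {"shape": "rounded_rectangle",
                         "compartments": ["name", "attributes", "services"]},
    },
    "connectors": {
        "Gen-Spec": {"from": "Class&Object", "to": "Class&Object",
                     "line": "solid", "head": "semicircle"},
    },
}

class DiagramShell:
    """A generic editor core: it can only create elements the spec allows."""
    def __init__(self, spec):
        self.spec = spec
        self.elements = []

    def add_symbol(self, kind, name):
        if kind not in self.spec["symbols"]:
            raise ValueError(f"{kind!r} is not defined by {self.spec['name']}")
        self.elements.append((kind, name))

shell = DiagramShell(notation_spec)
shell.add_symbol("Class&Object", "Aircraft")
print(shell.elements)
```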

    Drawing OWL 2 ontologies with Eddy the editor

    In this paper we introduce Eddy, a new open-source tool for the graphical editing of OWL 2 ontologies. Eddy is specifically designed for creating ontologies in Graphol, a completely visual ontology language that is equivalent to OWL 2. Thus, in Eddy ontologies are easily drawn as diagrams, rather than written as sets of formulas, as commonly happens in popular ontology design and engineering environments. This makes Eddy particularly suited for use by people who are more familiar with diagrammatic languages for conceptual modeling than with typical ontology formalisms, as is often required in non-academic and industrial contexts. Eddy provides intuitive functionalities for specifying Graphol diagrams, guarantees their syntactic correctness, and allows for exporting them in standard OWL 2 syntax. A user evaluation study we conducted shows that Eddy is perceived as an easy and intuitive tool for ontology specification.
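    As a simplified illustration of the diagram-to-OWL 2 direction (not Eddy's implementation), the sketch below serialises a few hypothetical Graphol-style concept-inclusion edges as OWL 2 functional-syntax SubClassOf axioms.

```python
# Hypothetical inclusion edges between concept nodes in a diagram.
inclusion_edges = [
    ("Employee", "Person"),
    ("Manager", "Employee"),
]

def to_owl_functional(edges, prefix=":"):
    """Render concept-inclusion edges as OWL 2 functional-syntax axioms."""
    return "\n".join(f"SubClassOf( {prefix}{sub} {prefix}{sup} )" for sub, sup in edges)

print(to_owl_functional(inclusion_edges))
# SubClassOf( :Employee :Person )
# SubClassOf( :Manager :Employee )
```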

    DPP-PMRF: Rethinking Optimization for a Probabilistic Graphical Model Using Data-Parallel Primitives

    We present a new parallel algorithm for probabilistic graphical model optimization. The algorithm relies on data-parallel primitives (DPPs), which provide portable performance across hardware architectures. We evaluate results on CPUs and GPUs for an image segmentation problem. Compared to a serial baseline, we observe runtime speedups of up to 13X (CPU) and 44X (GPU). We also compare our performance to a reference, OpenMP-based algorithm, and find speedups of up to 7X (CPU).
    Comment: LDAV 2018, October 2018
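    As an illustrative sketch only: below, NumPy's vectorised array operations stand in for data-parallel primitives, and one sweep of a toy two-label ICM-style update plays the role of the graphical-model optimisation step for image segmentation. This is not the DPP-PMRF algorithm; the energy terms and parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.random((64, 64))            # grey values in [0, 1]
labels = (image > 0.5).astype(np.int8)  # initial two-label segmentation
LAMBDA = 0.5                            # smoothness weight (arbitrary)

def icm_sweep(image, labels, lam=LAMBDA):
    """One data-parallel sweep: every pixel picks the label with the lower
    local energy (unary data term plus disagreement with its 4 neighbours)."""
    unary = np.stack([image, 1.0 - image])                 # cost of label 0 / 1
    padded = np.pad(labels, 1, mode="edge")
    neighbours = np.stack([padded[:-2, 1:-1], padded[2:, 1:-1],
                           padded[1:-1, :-2], padded[1:-1, 2:]])
    disagree = np.stack([(neighbours != 0).sum(axis=0),    # pairwise cost if 0
                         (neighbours != 1).sum(axis=0)])   # pairwise cost if 1
    energy = unary + lam * disagree
    return energy.argmin(axis=0).astype(np.int8)           # elementwise "map"/"reduce"

labels = icm_sweep(image, labels)
print(labels.sum(), "pixels labelled 1")
```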

    An analysis of the use of graphics for information retrieval

    Several research groups have addressed the problem of retrieving vector graphics. This work has, however, either focused on domain-dependent areas or been based on very simple graphics languages. Here we take a fresh look at the issue of graphics retrieval in general, and in particular at the tasks which retrieval systems must support. The paper presents a series of case studies which explored the needs of professionals, in the hope that these needs can help direct future graphics IR research. Suggested modelling techniques for some of the graphics collections are also presented.

    The VEX-93 environment as a hybrid tool for developing knowledge systems with different problem solving techniques

    The paper describes VEX-93, a hybrid environment for developing knowledge-based and problem-solver systems. It integrates methods and techniques from artificial intelligence, image and signal processing and data analysis, which can be mixed. Two hierarchical levels of reasoning contain an intelligent toolbox with one upper strategic inference engine and four lower ones implementing specific reasoning models: truth-functional (rule-based), probabilistic (causal networks), fuzzy (rule-based) and case-based (frames). Image/signal processing and analysis capabilities are provided in the form of programming languages with more than one hundred primitive functions. User-made programs can be embedded within knowledge bases, allowing the combination of perception and reasoning. The data-analyzer toolbox contains a collection of numerical classification, pattern recognition and ordination methods, with neural network tools and a database query language at the inference engines' disposal. VEX-93 is an open system able to communicate with external computer programs relevant to a particular application. Metaknowledge can be used to elaborate conclusions, and man-machine interaction includes, besides windows and graphical interfaces, acceptance of voice commands and production of speech output. The system was conceived for real-world applications in general domains; as an example, a concrete medical diagnostic support system, currently being completed as a Cuban-Spanish project, is mentioned. The present version of VEX-93 is a large system composed of about one and a half million lines of C code and runs on microcomputers under Windows 3.1.
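    As a structural sketch of the two-level arrangement described above, the code below shows an upper strategic engine routing a problem to one of four specialised engines; the engines are trivial stubs and the routing criteria are hypothetical, whereas VEX-93's actual reasoning models (rules, causal networks, fuzzy rules, frames) are far richer.

```python
# Stub engines standing in for the four lower-level reasoning models.
def rule_based(problem):    return f"rule-based answer for {problem['query']}"
def probabilistic(problem): return f"causal-network answer for {problem['query']}"
def fuzzy(problem):         return f"fuzzy answer for {problem['query']}"
def case_based(problem):    return f"nearest-case answer for {problem['query']}"

ENGINES = {
    "truth_functional": rule_based,
    "probabilistic": probabilistic,
    "fuzzy": fuzzy,
    "case_based": case_based,
}

def strategic_engine(problem):
    """Upper-level engine: pick a lower-level engine from coarse features
    of the problem (the criteria here are purely illustrative)."""
    if problem.get("uncertainty") == "probabilistic":
        choice = "probabilistic"
    elif problem.get("uncertainty") == "vague":
        choice = "fuzzy"
    elif problem.get("similar_cases"):
        choice = "case_based"
    else:
        choice = "truth_functional"
    return ENGINES[choice](problem)

print(strategic_engine({"query": "diagnosis", "uncertainty": "vague"}))
```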