
    (Un)decidable Problems about Reachability of Quantum Systems

    We study the reachability problem of a quantum system modelled by a quantum automaton. The reachable sets are chosen to be boolean combinations of (closed) subspaces of the state space of the quantum system. Four different reachability properties are considered: eventually reachable, globally reachable, ultimately forever reachable, and infinitely often reachable. The main result of this paper is that all of the four reachability properties are undecidable in general; however, the last three become decidable if the reachable sets are boolean combinations without negation.
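
    The bounded version of the first property can be sketched concretely. In this toy (all names invented, not from the paper), membership of a pure state in a closed subspace is tested with the subspace projector, and the "eventually reachable" check is truncated to a fixed number of steps, since the unbounded problem is undecidable:

```python
import numpy as np

def in_subspace(state, basis, tol=1e-9):
    """Test whether a pure state lies in the closed subspace spanned
    by the orthonormal columns of `basis`, via the projector B B†."""
    proj = basis @ basis.conj().T @ state
    return np.linalg.norm(proj - state) < tol

def bounded_eventually_reachable(state, unitary, basis, steps=100):
    """Bounded check of 'eventually reachable' for a one-letter
    quantum automaton: does U^k |psi> enter the subspace within
    `steps` steps? (The unbounded problem is undecidable in general,
    per the result above.)"""
    for _ in range(steps):
        if in_subspace(state, basis):
            return True
        state = unitary @ state
    return False

# Toy: a rotation by pi/4 reaches span{|0>} after two steps
theta = np.pi / 4
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
psi = np.array([0.0, 1.0])        # start at |1>
B = np.array([[1.0], [0.0]])      # subspace span{|0>}
print(bounded_eventually_reachable(psi, U, B))  # True
```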

    Program Equivalence Checking for Automatic Recognition of Quantum-Compatible Code

    The techniques used to program quantum computers are somewhat crude. As quantum computing progresses and becomes mainstream, a more efficient method of programming these devices would be beneficial. We propose a method that applies today’s programming techniques to quantum computing, with program equivalence checking used to discern between code suited for execution on a conventional computer and a quantum computer. This process involves determining a quantum algorithm’s implementation using a programming language. This so-called benchmark implementation can be checked against code written by a programmer, with semantic equivalence between the two implying the programmer’s code should be executed on a quantum computer instead of a conventional computer. Using a novel compiler optimization verification tool named CORK, we test for semantic equivalence between a portion of Shor’s algorithm (representing the benchmark implementation) and various modified versions of this code (representing the arbitrary code written by a programmer). Some of the modified versions are intended to be semantically equivalent to the benchmark while others are semantically inequivalent. Our testing shows that CORK is able to correctly determine semantic equivalence or semantic inequivalence in a majority of cases.
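
    A tool like CORK decides equivalence symbolically; as a hedged illustration of the underlying idea only, the sketch below approximates semantic equivalence by differential testing of two modular-exponentiation routines (a step used in Shor's algorithm). All function names here are invented:

```python
import random

def equivalent_on_samples(f, g, gen_input, trials=1000, seed=0):
    """Differential-testing stand-in for semantic equivalence:
    agreement on all sampled inputs only *suggests* equivalence,
    while any mismatch proves inequivalence. (CORK instead decides
    equivalence symbolically.)"""
    rng = random.Random(seed)
    for _ in range(trials):
        x = gen_input(rng)
        if f(x) != g(x):
            return False, x              # concrete counterexample
    return True, None

def modexp_benchmark(args):              # a step from Shor's order finding
    a, x, n = args
    return pow(a, x, n)

def modexp_variant(args):                # square-and-multiply rewrite
    a, x, n = args
    result, base = 1, a % n
    while x > 0:
        if x & 1:
            result = result * base % n
        base = base * base % n
        x >>= 1
    return result

gen = lambda rng: (rng.randrange(2, 50), rng.randrange(0, 50), rng.randrange(2, 50))
print(equivalent_on_samples(modexp_benchmark, modexp_variant, gen))  # (True, None)
```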

    Leveraging Datapath Propagation in IC3 for Hardware Model Checking

    IC3 is a well-known bit-level framework for safety verification. By incorporating datapath abstraction, a notable enhancement in the efficiency of hardware verification can be achieved. However, datapath abstraction entails a coarse level of abstraction where all datapath operations are approximated as uninterpreted functions. This level of abstraction, albeit useful, can lead to an increased computational burden during the verification process, as it necessitates extensive exploration of redundant abstract state space. In this paper, we introduce a novel approach called datapath propagation. Our method leverages concrete constant values to iteratively compute the outcomes of relevant datapath operations and their associated uninterpreted functions. Meanwhile, we generate potentially useful datapath propagation lemmas in abstract state space and tighten the datapath abstraction. With this technique, the abstract state space can be reduced, and the verification efficiency is significantly improved. We implemented the proposed approach and conducted extensive experiments. The results show promising improvements over state-of-the-art verifiers.
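
    The propagation idea can be illustrated with a small sketch (the data layout and names are invented, not the paper's implementation): whenever every argument of a datapath operation is a known constant, evaluate it concretely and record a lemma equating the corresponding uninterpreted-function application with its result:

```python
def propagate_constants(ops, consts):
    """Iterate to a fixpoint: evaluate any datapath op whose inputs
    are all known constants, and emit a lemma (fn, args, result)
    that can tighten the uninterpreted-function abstraction.
    `ops` is a list of (output_name, function, argument_names)."""
    env = dict(consts)
    lemmas = []
    changed = True
    while changed:
        changed = False
        for out, fn, args in ops:
            if out not in env and all(a in env for a in args):
                vals = tuple(env[a] for a in args)
                env[out] = fn(*vals)
                lemmas.append((fn.__name__, vals, env[out]))
                changed = True
    return env, lemmas

def add(x, y): return x + y
def mul(x, y): return x * y

# Toy datapath: t1 = a + b, t2 = t1 * c, with a, b, c concrete
ops = [("t2", mul, ("t1", "c")), ("t1", add, ("a", "b"))]
env, lemmas = propagate_constants(ops, {"a": 3, "b": 4, "c": 2})
print(env["t2"], lemmas)
```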

    Transforming imperative programs


    Compositional software verification based on game semantics

    One of the major challenges in computer science is to put programming on a firmer mathematical basis, in order to improve the correctness of computer programs. Automatic program verification is acknowledged to be a very hard problem, but current work is reaching the point where at least the foundational aspects of the problem can be addressed and it is becoming a part of industrial software development. This thesis presents a semantic framework for verifying safety properties of open sequential programs. The presentation is focused on an Algol-like programming language that embodies many of the core ingredients of imperative and functional languages and incorporates data abstraction in its syntax. Game semantics is used to obtain a compositional, incremental way of generating accurate models of programs. Model-checking is made possible by giving certain kinds of concrete automata-theoretic representations of the model. A data-abstraction refinement procedure is developed for model-checking safety properties of programs with infinite integer types. The procedure starts by model-checking the most abstract version of the program. If no counterexample is found, or a genuine one is found, the procedure terminates. Otherwise, it uses a spurious counterexample to refine the abstraction for the next iteration. Abstraction refinement, assume-guarantee reasoning and the L* algorithm for learning regular languages are combined to yield a procedure for compositional verification. Construction of a global model is avoided using assume-guarantee reasoning and the L* algorithm, by learning assumptions for arbitrary subprograms. An implementation based on the FDR model checker for the CSP process algebra demonstrates practicality of the methods.
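
    The refinement procedure described above has a standard skeleton. The sketch below pairs that skeleton with a deliberately tiny toy (a counter abstracted to its low bits); all callbacks are invented for illustration and stand in for the thesis's game-semantic model construction:

```python
def cegar(model_check, is_genuine, refine, abstraction, max_iters=100):
    """Abstraction-refinement loop: model-check the abstract program,
    stop on success or a genuine counterexample, otherwise refine
    using the spurious trace."""
    for _ in range(max_iters):
        cex = model_check(abstraction)
        if cex is None:
            return "safe"
        if is_genuine(cex):
            return ("unsafe", cex)
        abstraction = refine(abstraction, cex)
    return "unknown"

# Toy instance: a counter starts at 0 and steps by +2 mod 16; verify
# it never reaches 5. The abstraction tracks only the low `bits` bits.
def model_check(bits):
    mask = (1 << bits) - 1
    seen, frontier = set(), [0 & mask]
    while frontier:
        s = frontier.pop()
        if s == (5 & mask):
            return s                     # abstract counterexample
        nxt = (s + 2) & mask
        if nxt not in seen:
            seen.add(nxt)
            frontier.append(nxt)
    return None

def is_genuine(cex):
    s, seen = 0, set()                   # replay concretely
    while s not in seen:
        if s == 5:
            return True
        seen.add(s)
        s = (s + 2) % 16
    return False

result = cegar(model_check, is_genuine, lambda bits, cex: bits + 1, 0)
print(result)   # bits=0 yields a spurious trace; bits=1 proves safety
```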

    Nonuniformly and Randomly Sampled Systems

    Problems with missing data, sampling irregularities and randomly sampled systems are the topics covered by this dissertation. The spectral analysis of a series of periodically repeated sampling patterns is developed. Methods for parameter estimation of autoregressive moving average models from partial observations, together with an algorithm for filling in the missing data, are derived and demonstrated by simulation programs. Interpolation of missing data using bandlimiting assumptions and discrete Fourier transform techniques is developed. Representation and analysis of randomly sampled linear systems with independent and identically distributed sampling intervals are studied. The mean and the mean-square behavior of a multiple-input multiple-output randomly sampled system are found. A definition of and results concerning the power spectral density gain are also given. A complete FORTRAN simulation package is developed and implemented in a microcomputer environment demonstrating the new algorithms.
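
    Bandlimited interpolation of missing samples can be sketched with an alternating-projection scheme in the spirit of Papoulis-Gerchberg (an assumption on our part; the dissertation's exact algorithm may differ):

```python
import numpy as np

def fill_missing_bandlimited(x, known_mask, bandwidth, iters=500):
    """Alternating projections: bandlimit the current estimate in the
    DFT domain, then re-impose the known samples. `bandwidth` counts
    retained bins on each side of DC."""
    n = len(x)
    est = np.where(known_mask, x, 0.0)
    for _ in range(iters):
        spec = np.fft.fft(est)
        keep = np.zeros(n, dtype=bool)
        keep[:bandwidth + 1] = True      # DC and positive bins
        keep[n - bandwidth:] = True      # conjugate negative bins
        spec[~keep] = 0.0
        est = np.fft.ifft(spec).real
        est[known_mask] = x[known_mask]  # restore observed data
    return est

# Bandlimited test signal with two samples missing
n = 64
t = np.arange(n)
sig = np.sin(2 * np.pi * 3 * t / n) + 0.5 * np.cos(2 * np.pi * 5 * t / n)
mask = np.ones(n, dtype=bool)
mask[[10, 40]] = False
rec = fill_missing_bandlimited(np.where(mask, sig, 0.0), mask, bandwidth=6)
print(np.max(np.abs(rec - sig)))  # small reconstruction error
```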

    Two-dimensional models of type theory

    We describe a non-extensional variant of Martin-Löf type theory which we call two-dimensional type theory, and equip it with a sound and complete semantics valued in 2-categories. (Comment: 46 pages; v2: final journal version.)

    A study and evaluation of image analysis techniques applied to remotely sensed data

    An analysis of phenomena causing nonlinearities in the transformation from Landsat multispectral scanner coordinates to ground coordinates is presented. Experimental results comparing rms errors at ground control points indicated a slight improvement when a nonlinear (8-parameter) transformation was used instead of an affine (6-parameter) transformation. Using a preliminary ground truth map of a test site in Alabama covering the Mobile Bay area and six Landsat images of the same scene, several classification methods were assessed. A methodology was developed for automatic change detection using classification/cluster maps. A coding scheme was employed for generation of change depiction maps indicating specific types of changes. Inter- and intraseasonal data of the Mobile Bay test area were compared to illustrate the method. A beginning was made in the study of data compression by applying a Karhunen-Loeve transform technique to a small section of the test data set. The second part of the report provides a formal documentation of the several programs developed for the analysis and assessments presented.
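
    The 6- versus 8-parameter comparison can be reproduced in miniature: fit both transformations to control points by least squares and compare RMS residuals. The synthetic distortion below is invented for illustration and is not the Landsat data:

```python
import numpy as np

def fit_transform(src, dst, bilinear=False):
    """Least-squares fit of an affine (6-parameter) or bilinear
    (8-parameter) mapping from scanner coordinates `src` to ground
    coordinates `dst`. Returns the coefficients and the RMS residual
    at the control points."""
    x, y = src[:, 0], src[:, 1]
    cols = [np.ones_like(x), x, y]
    if bilinear:
        cols.append(x * y)               # the two extra parameters
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, dst, rcond=None)
    rms = np.sqrt(np.mean((A @ coef - dst) ** 2))
    return coef, rms

# Synthetic control points with a mild nonlinear (xy) distortion
rng = np.random.default_rng(1)
src = rng.uniform(0, 100, size=(30, 2))
x, y = src[:, 0], src[:, 1]
dst = np.column_stack([2 + 1.01 * x + 0.02 * y + 1e-3 * x * y,
                       5 - 0.01 * x + 0.98 * y - 5e-4 * x * y])
_, rms_affine = fit_transform(src, dst)
_, rms_bilin = fit_transform(src, dst, bilinear=True)
print(rms_affine > rms_bilin)   # the 8-parameter fit is tighter
```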

    Towards feasible, machine-assisted verification of object-oriented programs

    This thesis provides an account of a development of tools towards making verification of object-oriented programs more feasible. We note that proofs in program verification logics are typically long, yet, mathematically, not very deep; these observations suggest the thesis that computers can significantly ease the burden of program verification. We give evidence supporting this by applying computers to (1) automatically check and (2) automatically infer large parts of proofs. Taking the logic (AL) of Abadi and Leino as our starting point, we initially show how the logic can be embedded into a higher-order logic theorem prover, by way of introducing axioms, using a mix of both higher-order abstract syntax (HOAS) and a direct embedding of the assertion logic. The tenacity and exactness of the theorem prover ensures that no proof obligation is inadvertently lost during construction of a proof; we inherit any automatic facilities such as tactics, which take us part way towards goal (2); and moreover, we achieve goal (1), since we inherit machine proofs which can be checked automatically.

    Equivalence-preserving first-order unfold/fold transformation systems

    Two unfold/fold transformation systems for first-order programs, one basic and the other extended, are presented. The systems comprise an unfolding rule, a folding rule and a replacement rule. They are intended to work with a first-order theory Δ specifying the meaning of primitives, on top of which new relations are built by programs. They preserve the provability relationship Δ ∪ Γ ⊢ G between a call-consistent program Γ and a goal formula G such that Γ is strict with respect to G. They also preserve the logical consequence relationship in three-valued logic.
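
    The unfolding rule can be sketched for the propositional case (full first-order unfolding additionally needs unification; this toy is ours, not the paper's formulation):

```python
def unfold(clause, defs):
    """Unfolding for propositional definite clauses: replace one body
    atom by the bodies of its defining clauses, yielding one new
    clause per definition. A clause is a (head, body) pair; `defs`
    maps each defined atom to its list of clauses."""
    head, body = clause
    for i, atom in enumerate(body):
        if atom in defs:
            return [(head, body[:i] + b + body[i + 1:])
                    for (_, b) in defs[atom]]
    return [clause]                      # nothing to unfold

# p :- q, r.   with definitions   q :- s.   q :- t.
defs = {"q": [("q", ["s"]), ("q", ["t"])]}
print(unfold(("p", ["q", "r"]), defs))
# [('p', ['s', 'r']), ('p', ['t', 'r'])]
```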