
    Compositional closure for Bayes Risk in probabilistic noninterference

    We give a sequential model for noninterference security including probability (but not demonic choice), thus supporting reasoning about the likelihood that high-security values might be revealed by observations of low-security activity. Our novel methodological contribution is the definition of a refinement order and its use to compare security measures between specifications and (their supposed) implementations. This contrasts with the more common practice of evaluating the security of individual programs in isolation. The appropriateness of our model and order is supported by showing that our refinement order is the greatest compositional relation (the compositional closure) with respect to our semantics and an "elementary" order based on Bayes Risk, a security measure already in widespread use. We also relate refinement to other measures such as Shannon Entropy. By applying the approach to a non-trivial example, the anonymous-majority Three-Judges protocol, we demonstrate that correctness arguments can be simplified by the sort of layered developments, through levels of increasing detail, that are allowed and encouraged by compositional semantics.
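
    As an illustration of the Bayes Risk measure referred to above, the short Python sketch below computes it for a small hypothetical prior and probabilistic channel (the numbers and names are invented for illustration, not taken from the paper): Bayes Risk is the probability that an optimal attacker, after observing the low-security output, still guesses the high-security value wrongly.

```python
# Minimal sketch with hypothetical numbers: Bayes Risk of a probabilistic channel.
# prior[h]      : a-priori probability of high-security value h
# channel[h][l] : probability of observing low output l given high value h
prior = {"h0": 0.5, "h1": 0.5}
channel = {
    "h0": {"l0": 0.9, "l1": 0.1},
    "h1": {"l0": 0.4, "l1": 0.6},
}

observations = {l for row in channel.values() for l in row}

# Bayes vulnerability: for each observation the attacker guesses the most likely
# high value; summing the joint probabilities of those best guesses gives the
# chance the guess is right.  Bayes Risk is its complement.
vulnerability = sum(
    max(prior[h] * channel[h][l] for h in prior) for l in observations
)
bayes_risk = 1 - vulnerability
print(f"Bayes Risk = {bayes_risk:.3f}")  # 0.250 for these example numbers
```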

    Abstraction and mining of traces to explain concurrency bugs


    WordSup: Exploiting Word Annotations for Character based Text Detection

    Text in images is usually organized as a hierarchy of visual elements, i.e. characters, words, text lines and text blocks. Among these elements, the character is the most basic one across writing systems such as Western scripts, Chinese, Japanese, and mathematical expressions. It is therefore natural and convenient to construct a common text detection engine based on character detectors. However, training character detectors requires a vast number of location-annotated characters, which are expensive to obtain. In practice, existing real-world text datasets are mostly annotated at the word or line level. To remedy this dilemma, we propose a weakly supervised framework that can utilize word annotations, either tight quadrangles or looser bounding boxes, for character-detector training. When applied to scene text detection, this lets us train a robust character detector by exploiting word annotations in rich, large-scale real scene text datasets, e.g. ICDAR15 and COCO-text. The character detector plays a key role in the pipeline of our text detection engine, which achieves state-of-the-art performance on several challenging scene text detection benchmarks. We also demonstrate the flexibility of our pipeline in various scenarios, including deformed text detection and math expression recognition. Comment: 2017 International Conference on Computer Vision.
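
    The abstract above describes deriving character-level training signal from word-level annotations. The sketch below shows one deliberately simplified way such pseudo-labels could be produced, keeping a detector's candidate character boxes whose centers fall inside an annotated word box; the box format, threshold and selection rule are illustrative assumptions, not WordSup's actual criterion.

```python
# Illustrative sketch only: derive character-level pseudo-labels from word-level
# annotations by keeping confident character candidates whose centers fall inside
# some annotated word box.  Boxes are (x_min, y_min, x_max, y_max).

def center(box):
    x0, y0, x1, y1 = box
    return (x0 + x1) / 2, (y0 + y1) / 2

def inside(point, box):
    x, y = point
    x0, y0, x1, y1 = box
    return x0 <= x <= x1 and y0 <= y <= y1

def pseudo_char_labels(candidate_chars, word_boxes, min_score=0.5):
    """Keep confident character candidates lying within some word annotation."""
    return [box for box, score in candidate_chars
            if score >= min_score and any(inside(center(box), w) for w in word_boxes)]

# Hypothetical usage: one word annotation, three candidate character boxes.
words = [(10, 10, 90, 40)]
candidates = [((12, 12, 30, 38), 0.9),    # kept
              ((35, 12, 55, 38), 0.8),    # kept
              ((120, 12, 140, 38), 0.9)]  # center outside the word box, dropped
print(pseudo_char_labels(candidates, words))
```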

    Rigorous concurrency analysis of multithreaded programs

    Technical report. This paper explores the practicality of conducting program analysis for multithreaded software using constraint solving. By precisely defining the underlying memory consistency rules in addition to the intra-thread program semantics, our approach offers a unique advantage for program verification: it provides accurate and exhaustive coverage of all thread interleavings for any given memory model. We demonstrate how this can be achieved by formalizing sequential consistency for a source language that supports control branches and a monitor-style mutual exclusion mechanism. We then discuss how to formulate programmer expectations as constraints and propose three concrete applications of this approach: execution validation, race detection, and atomicity analysis. Finally, we describe the implementation of a formal analysis tool using constraint logic programming, with promising initial results for reasoning about small but non-trivial concurrent programs.
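
    To make the constraint-solving idea concrete, here is a minimal sketch (not the paper's tool, which uses constraint logic programming) encoding the classic two-thread store-buffering litmus test under sequential consistency with the z3 Python bindings: each memory operation gets an integer timestamp, program order is enforced within each thread, and the solver is asked whether the outcome r1 == 0 and r2 == 0 is reachable.

```python
# Minimal sketch, assuming the z3 Python bindings are available (pip install z3-solver).
# Thread 1: x = 1; r1 = y      Thread 2: y = 1; r2 = x      (x, y initially 0)
from z3 import Int, Solver, Distinct

wx, ry = Int("wx"), Int("ry")    # timestamps of thread 1's write and read
wy, rx = Int("wy"), Int("rx")    # timestamps of thread 2's write and read

s = Solver()
s.add(Distinct(wx, ry, wy, rx))  # sequential consistency: one total order
s.add(wx < ry, wy < rx)          # program order within each thread

# A read returns the initial value 0 only if it precedes the sole write
# to that location in the total order.
s.add(ry < wy)                   # r1 == 0
s.add(rx < wx)                   # r2 == 0

print(s.check())                 # unsat: no SC interleaving yields r1 == r2 == 0
```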

    A Semantic Framework for the Security Analysis of Ethereum smart contracts

    Smart contracts are programs running on cryptocurrency (e.g., Ethereum) blockchains, whose popularity stems from the possibility of performing financial transactions, such as payments and auctions, in a distributed environment without the need for any trusted third party. Given their financial nature, bugs or vulnerabilities in these programs may lead to catastrophic consequences, as witnessed by recent attacks. Unfortunately, programming smart contracts is a delicate task that requires strong expertise: Ethereum smart contracts are written in Solidity, a dedicated language resembling JavaScript, and shipped over the blockchain in the EVM bytecode format. In order to rigorously verify the security of smart contracts, it is of paramount importance to formalize their semantics as well as the security properties of interest, in particular at the level of the bytecode being executed. In this paper, we present the first complete small-step semantics of EVM bytecode, which we formalize in the F* proof assistant, obtaining executable code that we successfully validate against the official Ethereum test suite. Furthermore, we formally define for the first time a number of central security properties for smart contracts, such as call integrity, atomicity, and independence from miner-controlled parameters. This formalization relies on a combination of hyper- and safety properties. In the course of this work, we identified various mistakes and imprecisions in existing semantics and verification tools for Ethereum smart contracts, thereby demonstrating once more the importance of rigorous semantic foundations for the design of security verification techniques. Comment: EAPLS Best Paper Award at ETAPS.
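
    For readers unfamiliar with bytecode-level small-step semantics, the toy interpreter below steps a machine state (program counter, operand stack) over three EVM opcodes (PUSH1, ADD, STOP) in Python. It only illustrates the style of definition; the paper's formalization covers the full EVM instruction set, including gas and calls, and is carried out in F*.

```python
# Toy sketch of a small-step semantics for a tiny EVM-like fragment.
# State: (program counter, operand stack).  Opcodes: 0x60 PUSH1, 0x01 ADD, 0x00 STOP.

def step(code, pc, stack):
    """Perform one execution step; return the next state, or None when halted."""
    op = code[pc]
    if op == 0x60:                      # PUSH1: push the following byte
        return pc + 2, stack + [code[pc + 1]]
    if op == 0x01:                      # ADD: pop two words, push their sum mod 2**256
        a, b = stack[-1], stack[-2]
        return pc + 1, stack[:-2] + [(a + b) % 2**256]
    if op == 0x00:                      # STOP: halt
        return None
    raise ValueError(f"unsupported opcode {op:#x}")

def run(code):
    state = (0, [])
    while state is not None:
        pc, stack = state
        state = step(code, pc, stack)
    return stack

# PUSH1 2; PUSH1 3; ADD; STOP  ->  final stack [5]
print(run(bytes([0x60, 0x02, 0x60, 0x03, 0x01, 0x00])))
```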

    A Generic Approach to Flow-Sensitive Polymorphic Effects

    Effect systems are lightweight extensions to type systems that can verify a wide range of important properties with modest developer burden. But our general understanding of effect systems is limited primarily to systems where the order of effects is irrelevant. Understanding such systems in terms of a lattice of effects clarifies the essential issues and provides guidance when designing new effect systems. By contrast, sequential effect systems, where the order of effects is important, lack a clear algebraic characterization. We derive such a characterization from the shape of prior concrete sequential effect systems. We present an abstract polymorphic effect system with singleton effects parameterized by an effect quantale, an algebraic structure with well-defined properties that can model a range of existing order-sensitive effect systems. We define effect quantales, derive useful properties, and show how they cleanly model a variety of known sequential effect systems. We show that effect quantales provide a free, general notion of iterating a sequential effect, and that for the systems we consider the derived iteration agrees with the manually designed iteration operators in prior work. Identifying and applying the right algebraic structure led us to subtle insights and provides guidance on non-obvious points in the design of order-sensitive effect systems. Effect quantales have clear relationships to recent category-theoretic work on order-sensitive effect systems, but are explained without recourse to category theory. In addition, our derived iteration construct should generalize to these semantic structures, addressing limitations of that work.
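
    As a small illustration of the algebraic structure named in the abstract, the sketch below spells out one concrete effect quantale in Python: effects are finite sets of event traces, join is set union, sequencing is pointwise concatenation, and the unit is the set containing the empty trace. The instance and the event names are our own illustrative choice, not code from the paper; the assertions check the distributivity and unit laws an effect quantale requires.

```python
# One concrete effect quantale (illustrative): effects are finite sets of event
# traces; join is union, sequencing is pointwise concatenation, unit is {empty trace}.
from itertools import product

def join(e1, e2):
    return e1 | e2

def seq(e1, e2):
    return frozenset(t1 + t2 for t1, t2 in product(e1, e2))

UNIT = frozenset({()})                 # the set containing only the empty trace

acquire = frozenset({("acq",)})
release = frozenset({("rel",)})
log     = frozenset({("log",)})

# Order matters: acquire-then-release differs from release-then-acquire.
assert seq(acquire, release) != seq(release, acquire)

# Laws an effect quantale requires: sequencing distributes over join on both
# sides, and UNIT is a two-sided identity for sequencing.
a, b, c = acquire, release, log
assert seq(a, join(b, c)) == join(seq(a, b), seq(a, c))
assert seq(join(a, b), c) == join(seq(a, c), seq(b, c))
assert seq(UNIT, a) == a and seq(a, UNIT) == a
print("all effect-quantale laws hold on this example")
```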

    Steps Towards a Method for the Formal Modeling of Dynamic Objects

    Fragments of a method to formally specify object-oriented models of a universe of discourse are presented. The task of finding such models is divided into three subtasks: object classification, event specification, and the specification of the life cycle of an object. Each of these subtasks is further subdivided, and for each subtask heuristics are given that can aid the analyst in deciding how to represent a particular aspect of the real world. The main sources of inspiration are Jackson System Development, the algebraic specification of data types and object types, and the algebraic specification of processes.
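
    To give one concrete flavour of the life-cycle subtask, the sketch below encodes a hypothetical object life cycle (an account that must be opened, may then accept deposits and withdrawals, and is finally closed) as a small state machine in Python and checks event sequences against it. The object, states and events are invented for illustration; the paper itself works with JSD-style and algebraic specifications rather than code.

```python
# Hypothetical example: the life cycle "open; (deposit | withdraw)*; close"
# expressed as a finite state machine over events.
LIFE_CYCLE = {
    ("new",    "open"):     "active",
    ("active", "deposit"):  "active",
    ("active", "withdraw"): "active",
    ("active", "close"):    "closed",
}

def conforms(events, start="new", final="closed"):
    """Check that a sequence of events is a legal life cycle for the object."""
    state = start
    for event in events:
        state = LIFE_CYCLE.get((state, event))
        if state is None:
            return False
    return state == final

print(conforms(["open", "deposit", "withdraw", "close"]))  # True
print(conforms(["deposit", "close"]))                      # False: not opened first
```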