
    A program logic for union bounds

    We propose a probabilistic Hoare logic, aHL, based on the union bound, a tool from basic probability theory. While the union bound is simple, it is an extremely common tool for analyzing randomized algorithms. In formal verification terms, the union bound allows flexible and compositional reasoning over the possible ways an algorithm may go wrong. It also enables a clean separation between reasoning about probabilities and reasoning about events, which are expressed as standard first-order formulas in our logic. Notably, assertions in our logic are non-probabilistic, even though we can conclude probabilistic facts from the judgments. Our logic can also prove accuracy properties for interactive programs, where the program must produce intermediate outputs as soon as pieces of the input arrive, rather than accessing the entire input at once. This setting also enables adaptivity, where later inputs may depend on earlier intermediate outputs. We show how to prove accuracy for several examples from the differential privacy literature, both interactive and non-interactive.

    1998 ACM Subject Classification D.2.4 Software/Program Verification

    1 Introduction

    Probabilistic computations arise naturally in many areas of computer science. For instance, they are widely used in cryptography, privacy, and security for achieving goals that lie beyond the reach of deterministic programs. However, the correctness of probabilistic programs can be quite subtle, often relying on complex reasoning about probabilistic events. Accordingly, probabilistic computations present an attractive target for formal verification. A long line of research, spanning more than four decades, has focused on expressive formalisms for reasoning about general probabilistic properties, both for purely probabilistic programs and for programs that combine probabilistic and non-deterministic choice (see, e.g., [29, 34, 35]). More recent research investigates specialized formalisms that work with more restricted assertions and proof techniques, aiming to simplify formal verification. As perhaps the purest examples of this approach, some program logics prove probabilistic properties by working purely with non-probabilistic assertions; we call such systems lightweight logics. Examples include probabilistic relational Hoare logic [3] for proving the reductionist security of cryptographic constructions, and the related approximate probabilistic relational Hoare logic [4] for reasoning about differential privacy. These logics rely on the powerful abstraction of probabilistic couplings to derive probabilistic facts from non-probabilistic assertions [7].
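    The core tool and the judgment shape can be summarized in a few lines (a sketch based only on the description above; the notation is illustrative rather than the paper's exact rule format). The union bound caps the probability that any of several failure events occurs by the sum of their individual probabilities, and a union-bound logic can thread such a failure budget through sequential composition:

```latex
% Union bound: the probability that any of the bad events E_1,...,E_n
% occurs is at most the sum of their individual probabilities.
\Pr\Big[\bigcup_{i=1}^{n} E_i\Big] \;\le\; \sum_{i=1}^{n} \Pr[E_i]

% Illustrative judgment: from precondition \Phi, command c establishes the
% (non-probabilistic) assertion \Psi except with probability at most \beta.
\vdash_{\beta} \{\Phi\}\; c \;\{\Psi\}

% Sequencing then simply adds the failure probabilities of the two parts.
\frac{\vdash_{\beta_1} \{\Phi\}\; c_1 \;\{\Theta\} \qquad
      \vdash_{\beta_2} \{\Theta\}\; c_2 \;\{\Psi\}}
     {\vdash_{\beta_1 + \beta_2} \{\Phi\}\; c_1; c_2 \;\{\Psi\}}
```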

    Towards an Abstract Domain for Resource Analysis of Logic Programs Using Sized Types

    We present a novel general resource analysis for logic programs based on sized types. Sized types are representations that incorporate structural (shape) information and allow expressing both lower and upper bounds on the size of a set of terms and their subterms at any position and depth. They also allow relating the sizes of terms and subterms occurring at different argument positions in logic predicates. Using these sized types, the resource analysis can infer both lower and upper bounds on the resources used by all the procedures in a program as functions on input term (and subterm) sizes, overcoming limitations of existing analyses and enhancing their precision. Our new resource analysis has been developed within the abstract interpretation framework, as an extension of the sized types abstract domain, and has been integrated into the Ciao preprocessor, CiaoPP. The abstract domain operations are integrated with the setting up and solving of recurrence equations for inferring both size and resource usage functions. We show that the analysis is an improvement over the previous resource analysis present in CiaoPP and compares well in power to state-of-the-art systems.

    Comment: Part of WLPE 2013 proceedings (arXiv:1308.2055)
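    To make the flavor of the analysis concrete (an illustrative sketch with invented notation, not CiaoPP's actual syntax or output), a sized type can bound both the length of a list argument and the size of its elements, and the analysis then sets up recurrence equations over those size variables whose closed forms are the resource bounds:

```latex
% Illustrative sized type: a list of between \alpha and \beta elements,
% each a number between \gamma and \delta.
\mathit{list}^{(\alpha,\beta)}\big(\mathit{num}^{(\gamma,\delta)}\big)

% Recurrences for a predicate that traverses its input list once, where n
% bounds the input length and C(n) counts resolution steps:
C(0) = 1, \qquad C(n) = C(n-1) + 1 \quad\Longrightarrow\quad C(n) = n + 1

% Lower bounds are obtained analogously from the minimum size \alpha,
% yielding both lower and upper resource-usage functions of the input size.
```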

    Bounded Refinement Types

    We present a notion of bounded quantification for refinement types and show how it expands the expressiveness of refinement typing by using it to develop typed combinators for: (1) relational algebra and safe database access, (2) Floyd-Hoare logic within a state transformer monad equipped with combinators for branching and looping, and (3) using the above to implement a refined IO monad that tracks capabilities and resource usage. This leap in expressiveness comes via a translation to "ghost" functions, which lets us retain the automated and decidable SMT-based checking and inference that makes refinement typing effective in practice.

    Comment: 14 pages, International Conference on Functional Programming, ICFP 2015
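    As a rough illustration of what a bound over refinements buys (a minimal sketch in plain Haskell; the refinement annotations are written as ordinary comments in pseudo-notation, not exact Liquid Haskell syntax), consider function composition, where a bound chains the output refinement of one function into the input refinement of the next:

```haskell
-- Plain Haskell sketch; the intended refinements appear as comments in
-- illustrative pseudo-notation (not exact Liquid Haskell syntax).

-- Intended refined signature (pseudo-notation):
--   compose :: forall <p, q, r>.
--     {Chain p q r}                       -- the *bound*: a constraint over the
--                                         -- abstract refinements, roughly
--                                         --   q x y ==> p y z ==> r x z
--     (y:b -> c<p y>) -> (x:a -> b<q x>) -> (x:a -> c<r x>)
--
-- The bound is what gets translated into a "ghost" function argument, so
-- that checking stays within decidable SMT reasoning.
compose :: (b -> c) -> (a -> b) -> a -> c
compose f g x = f (g x)

-- Example: with p, q, r all instantiated to "output >= input", the Chain
-- bound is just transitivity of >=, so the composition below would carry
-- the refinement "result >= input".
incTwice :: Int -> Int
incTwice = compose (+ 2) (+ 1)

main :: IO ()
main = print (incTwice 5)  -- 8
```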

    On the Complexity of Nonrecursive XQuery and Functional Query Languages on Complex Values

    This paper studies the complexity of evaluating functional query languages for complex values such as monad algebra and the recursion-free fragment of XQuery. We show that monad algebra with equality restricted to atomic values is complete for the class TA[2^{O(n)}, O(n)] of problems solvable in linear exponential time with a linear number of alternations. The monotone fragment of monad algebra with atomic value equality but without negation is complete for nondeterministic exponential time. For monad algebra with deep equality, we establish TA[2^{O(n)}, O(n)] lower and exponential-space upper bounds. Then we study a fragment of XQuery, Core XQuery, that seems to incorporate all the features of a query language on complex values that are traditionally deemed essential. A close connection between monad algebra on lists and Core XQuery (with "child" as the only axis) is exhibited, and it is shown that these languages are expressively equivalent up to representation issues. We show that Core XQuery is just as hard as monad algebra w.r.t. combined complexity, and that it is in TC0 if the query is assumed fixed.

    Comment: Long version of PODS 2005 paper
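    The connection between monad algebra on lists and XQuery-style iteration can be conveyed informally with Haskell's list monad (a sketch of the general flavor of the correspondence, not the paper's actual encoding): monad-algebra mapping and flattening are map and concatenation, and a FLWOR-style "for ... return" is exactly their composition, i.e. monadic bind.

```haskell
-- Sketch: a "for $p in people return <name>{$p}</name>"-style query over
-- plain strings, written once monad-algebra style (map, then flatten) and
-- once with the list monad's bind; the two coincide by definition of >>=.
import Control.Monad (join)

people :: [String]
people = ["alice", "bob"]

wrapName :: String -> [String]
wrapName p = ["<name>" ++ p ++ "</name>"]

-- Monad-algebra style: map the element-wise query, then flatten the result.
namesMA :: [String]
namesMA = join (map wrapName people)

-- List-monad style: the same query as a bind.
namesBind :: [String]
namesBind = people >>= wrapName

main :: IO ()
main = print (namesMA == namesBind)  -- True
```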

    The First-Order Theory of Sets with Cardinality Constraints is Decidable

    We show the decidability of the first-order theory of the language that combines Boolean algebras of sets of uninterpreted elements with Presburger arithmetic operations. We thereby disprove a recent conjecture that this theory is undecidable. Our language allows relating the cardinalities of sets to the values of integer variables, and can distinguish finite and infinite sets. We use quantifier elimination to show the decidability and obtain an elementary upper bound on the complexity. Precise program analyses can use our decidability result to verify representation invariants of data structures that use an integer field to represent the number of stored elements.

    Comment: 18 pages
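    The kind of representation invariant mentioned at the end can be written directly in this combined language of sets and integers (a worked illustrative example, not one taken from the paper): inserting a fresh element into a container and incrementing its cached size field preserves the invariant that the field equals the cardinality of the content set.

```latex
% X is the singleton set holding the inserted element; C, C' are the content
% set before and after; s, s' are the integer size field before and after.
% The implication is valid: |C \cup X| = |C| + |X| when C and X are disjoint.
\forall C\, C'\, X\, s\, s'.\;
  \big( |X| = 1 \;\wedge\; |X \cap C| = 0 \;\wedge\; C' = C \cup X
        \;\wedge\; s = |C| \;\wedge\; s' = s + 1 \big)
  \;\rightarrow\; s' = |C'|
```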

    Well-Founded Semantics for Extended Datalog and Ontological Reasoning

    The Datalog± family of expressive extensions of Datalog has recently been introduced as a new paradigm for query answering over ontologies, which captures and extends several common description logics. It extends plain Datalog by features such as existentially quantified rule heads and, at the same time, restricts the rule syntax so as to achieve decidability and tractability. In this paper, we continue the research on Datalog±. More precisely, we generalize the well-founded semantics (WFS), as the standard semantics for nonmonotonic normal programs in the database context, to Datalog± programs with negation under the unique name assumption (UNA). We prove that for guarded Datalog± with negation under the standard WFS, answering normal Boolean conjunctive queries is decidable, and we provide precise complexity results for this problem, namely, completeness for PTIME (resp., 2-EXPTIME) in the data (resp., combined) complexity.
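    For concreteness, the two rule shapes at play can be sketched as follows (an illustrative example of guarded existential rules and default negation, not drawn from the paper itself):

```latex
% Existential (tuple-generating) rule: every employee works in some department.
\forall X\, \big(\mathit{emp}(X) \rightarrow \exists Y\, \mathit{worksIn}(X, Y)\big)

% Guarded normal rule with default negation: an employee with no recorded
% manager is treated as a manager; under the well-founded semantics such
% rules receive a three-valued (true / false / undefined) interpretation.
\mathit{manager}(X) \leftarrow \mathit{emp}(X),\; \mathit{not}\ \mathit{hasManager}(X)
```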