
    IST Austria Thesis

    This dissertation focuses on algorithmic aspects of program verification, and presents modeling and complexity advances on several problems related to the static analysis of programs, the stateless model checking of concurrent programs, and the competitive analysis of real-time scheduling algorithms. Our contributions can be broadly grouped into five categories.

    Our first contribution is a set of new algorithms and data structures for the quantitative and data-flow analysis of programs, based on the graph-theoretic notion of treewidth. It has been observed that the control-flow graphs of typical programs have special structure, and are characterized as graphs of small treewidth. We utilize this structural property to provide faster algorithms for the quantitative and data-flow analysis of recursive and concurrent programs. In most cases we give an algebraic treatment of the considered problem, where several interesting analyses, such as reachability, shortest path, and certain kinds of data-flow analysis problems, follow as special cases. We exploit the constant-treewidth property to obtain algorithmic improvements for on-demand versions of the problems, and provide data structures with various tradeoffs between the resources spent in the preprocessing and querying phases. We also improve on the algorithmic complexity of quantitative problems outside the algebraic path framework, namely of the minimum mean-payoff, minimum ratio, and minimum initial credit for energy problems.

    Our second contribution is a set of algorithms for Dyck reachability with applications to data-dependence analysis and alias analysis. In particular, we develop an optimal algorithm for Dyck reachability on bidirected graphs, which are ubiquitous in context-insensitive, field-sensitive points-to analysis. Additionally, we develop an efficient algorithm for context-sensitive data-dependence analysis via Dyck reachability, where the task is to obtain analysis summaries of library code in the presence of callbacks. Our algorithm preprocesses libraries in almost linear time, after which the contribution of the library to the complexity of the client analysis is (i) linear in the number of call sites and (ii) only logarithmic in the size of the whole library, as opposed to linear in the size of the whole library. Finally, we prove that Dyck reachability is Boolean Matrix Multiplication-hard in general, and that the hardness also holds for graphs of constant treewidth. This hardness result strongly indicates that there exist no combinatorial algorithms for Dyck reachability with truly subcubic complexity.

    Our third contribution is the formalization and algorithmic treatment of the Quantitative Interprocedural Analysis framework. In this framework, the transitions of a recursive program are annotated as good, bad, or neutral, and receive a weight which measures the magnitude of their respective effect. The Quantitative Interprocedural Analysis problem asks whether there exists an infinite run of the program in which the long-run ratio of the bad weights over the good weights is above a given threshold. We illustrate how several quantitative problems related to the static analysis of recursive programs can be instantiated in this framework, and present some case studies in this direction.
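    As a minimal formalization sketch of this decision problem (the notation is ours, not the abstract's: ρ ranges over infinite runs of the program P, bad(ρ_i) and good(ρ_i) are the bad and good weights of the i-th transition of ρ, and λ is the threshold; whether the long-run ratio is taken as a lim sup or a lim inf is a detail not visible here), the question is whether

```latex
\exists\, \rho \in \mathrm{Runs}(P) \colon\quad
\limsup_{n \to \infty}\,
\frac{\sum_{i=1}^{n} \mathit{bad}(\rho_i)}{\sum_{i=1}^{n} \mathit{good}(\rho_i)}
\;\ge\; \lambda .
```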
    Our fourth contribution is a new dynamic partial-order reduction for the stateless model checking of concurrent programs. Traditional approaches rely on the standard Mazurkiewicz equivalence between traces: they partition the trace space into equivalence classes and attempt to explore a few representatives from each class. We present a new dynamic partial-order reduction method called Data-centric Partial Order Reduction (DC-DPOR). Our algorithm is based on a new equivalence between traces, called observation equivalence. DC-DPOR explores a coarser partitioning of the trace space than any exploration method based on the standard Mazurkiewicz equivalence. Depending on the program, the new partitioning can be even exponentially coarser. Additionally, DC-DPOR spends only polynomial time in each explored class.

    Our fifth contribution is the use of automata and game-theoretic verification techniques in the competitive analysis and synthesis of real-time scheduling algorithms for firm-deadline tasks. On the analysis side, we leverage automata on infinite words to compute the competitive ratio of real-time schedulers subject to various environmental constraints. On the synthesis side, we introduce a new instance of two-player mean-payoff partial-information games, and show how the synthesis of an optimal real-time scheduler can be reduced to computing winning strategies in this new type of game.
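    For orientation on the fifth contribution, the competitive ratio mentioned above is the usual worst-case comparison of an online scheduler against a clairvoyant offline optimum; in notation not taken from the abstract (A is the scheduler, σ a job sequence, and V the cumulated value of jobs completed by their deadlines, with firm-deadline jobs contributing nothing once their deadline has passed), it can be sketched as

```latex
\mathrm{CR}(A) \;=\; \inf_{\sigma}\; \frac{V_{A}(\sigma)}{V_{\mathrm{OPT}}(\sigma)} .
```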

    Quantitative Information Flow as Safety and Liveness Hyperproperties

    We employ Clarkson and Schneider's "hyperproperties" to classify various verification problems of quantitative information flow. The results of this paper unify and extend the previous results on the hardness of checking and inferring quantitative information flow. In particular, we identify a subclass of liveness hyperproperties, which we call "k-observable hyperproperties", that can be checked relative to a reachability oracle via self composition. (Comment: In Proceedings QAPL 2012, arXiv:1207.055)
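    The self-composition step mentioned above can be illustrated with a toy sketch: to check a relational, two-run property such as noninterference, one runs two copies of the same program on inputs that agree on the public part and asks whether a state with differing public outputs is reachable. The program, input domains, and property below are hypothetical placeholders, not taken from the paper.

```python
# Toy sketch of checking noninterference via self composition.
# Hypothetical example program and inputs; the real technique poses the
# same question to a reachability oracle over the composed system.
from itertools import product

def program(high: int, low: int) -> int:
    # hypothetical program under analysis; returns its public output
    return low + (1 if high > 100 else 0)

def self_composed_violation(highs, lows) -> bool:
    """Explore the composed state space: two copies sharing the low input."""
    for h1, h2, low in product(highs, highs, lows):
        out1 = program(h1, low)   # first copy of the program
        out2 = program(h2, low)   # second copy, same public (low) input
        if out1 != out2:          # a "bad" composed state is reachable
            return True
    return False

if __name__ == "__main__":
    # exhaustive check over a small finite input domain
    print(self_composed_violation(highs=range(200), lows=range(4)))
```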

    Multidimensional hyperbolic billiards

    The theory of planar hyperbolic billiards is by now quite well developed and has achieved spectacular successes; there also exists an excellent monograph by Chernov and Markarian on the topic. In contrast, apart from a series of works culminating in Simányi's remarkable result on the ergodicity of hard ball systems, and other sporadic successes, the theory of hyperbolic billiards in dimension 3 or more is much less understood. The goal of this work is to survey the key results of this theory and to highlight some central problems which deserve particular attention and effort.

    A Temporal Logic for Hyperproperties

    Hyperproperties, as introduced by Clarkson and Schneider, characterize the correctness of a computer program as a condition on its set of computation paths. Standard temporal logics can only refer to a single path at a time, and therefore cannot express many hyperproperties of interest, including noninterference and other important properties in security and coding theory. In this paper, we investigate an extension of temporal logic with explicit path variables. We show that the quantification over paths naturally subsumes other extensions of temporal logic with operators for information flow and knowledge. The model checking problem for temporal logic with path quantification is decidable. For alternation depth 1, the complexity is PSPACE in the length of the formula and NLOGSPACE in the size of the system, as for linear-time temporal logic.
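    To give a flavour of what explicit path variables add (the formula and the atomic propositions lowIn and lowOut below are a standard illustrative example in this style, not quoted from the paper), a simple noninterference-like requirement compares two paths: if they agree on the low-security input, they must always agree on the low-observable output.

```latex
\forall \pi.\ \forall \pi'.\;
\bigl(\mathit{lowIn}_{\pi} \leftrightarrow \mathit{lowIn}_{\pi'}\bigr)
\;\rightarrow\;
\Box\, \bigl(\mathit{lowOut}_{\pi} \leftrightarrow \mathit{lowOut}_{\pi'}\bigr)
```

    Because π and π' range over computation paths of the same system, a single-path temporal logic cannot state this requirement; this is precisely the kind of hyperproperty the quantified extension is designed for.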

    Mapping the complexity of higher education in the developing world

    This repository item contains a single issue of Issues in Brief, a series of policy briefs published since 2008 by the Boston University Frederick S. Pardee Center for the Study of the Longer-Range Future. On October 27 and 28, 2009, a workshop of experts on higher education in developing countries was convened by the Pardee Center. The meeting was supported by a grant from the National Academies Keck Futures Initiative, with additional support from the Pardee Center and the Office of the Boston University Provost. It brought together experts in economics, public policy, education, development, university management, and quantitative modeling who had rich experience across the developing world. These experts offered a variety of conceptual tools with which to look at the particular complexities associated with higher education in developing countries. The meeting was convened by the authors of this paper. This policy brief builds upon and reflects on the discussion at that meeting, but is not a meeting report per se.

    The role of Computer Aided Process Engineering in physiology and clinical medicine

    This paper discusses the potential role for Computer Aided Process Engineering (CAPE) in developing engineering analysis and design approaches to biological systems across multiple levels – cell signalling networks, gene, protein and metabolic networks, cellular systems, through to physiological systems. The 21st Century challenge in the Life Sciences is to bring together widely dispersed models and knowledge in order to enable a system-wide understanding of these complex systems. This systems-level understanding should have broad clinical benefits. CAPE can bring systems approaches to (i) improving understanding of the complex chemical and physical interactions (particularly molecular transport in complex flow regimes) at multiple scales in living systems, (ii) analysing these models to help identify critical missing information and to explore the consequences for major output variables of disturbances to the system, and (iii) ‘designing’ potential interventions in in vivo systems, which can have significant beneficial, or potentially harmful, effects that need to be understood. This paper develops these three themes drawing on recent projects at UCL. The first project has modelled the effects of blood flow on endothelial cells lining arteries, taking into account cell shape changes that alter the cell skeleton and cause consequent chemical changes. The second project is building an in silico model of the human liver, tying together models from the molecular level up to the whole organ; the composite model describes glucose regulation in the liver and associated organs. Both projects involve molecular transport, chemical reactions, and complex multiscale systems, tackled with approaches from CAPE. Chemical engineers solve multiple-scale problems in manufacturing processes – from the molecular scale through unit operations to plant-wide and enterprise-wide systems – and so have an appropriate skill set for tackling problems in physiology and clinical medicine, in collaboration with life and clinical scientists.
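    To give a flavour of the kind of quantitative submodel that sits at the physiological end of such a multi-scale composition, the following is a generic glucose-insulin feedback toy in the style of textbook minimal models. It is emphatically not the UCL in silico liver model described in the paper; the equations, parameter names, and values are illustrative placeholders only.

```python
# Generic, illustrative glucose-insulin feedback toy (forward-Euler integration).
# NOT the UCL in silico liver model: equations, parameters, and units are
# hypothetical placeholders showing how a physiological-scale submodel of
# glucose regulation might be coded and simulated.

def simulate(hours: float = 12.0, dt: float = 0.01):
    G, I = 9.0, 0.0            # plasma glucose [mmol/L], insulin [arb. units]
    P0 = 1.0                   # net hepatic glucose production [mmol/L per h]
    k1, k2 = 0.05, 0.01        # insulin-independent / -dependent uptake rates
    k3, k4 = 0.5, 0.5          # insulin secretion and clearance rates [1/h]
    G_thr = 5.0                # glucose threshold for insulin secretion

    trace, t = [], 0.0
    while t <= hours:
        dG = P0 - (k1 + k2 * I) * G                # production minus uptake
        dI = k3 * max(G - G_thr, 0.0) - k4 * I     # secretion minus clearance
        G, I, t = G + dt * dG, I + dt * dI, t + dt
        trace.append((t, G, I))
    return trace

if __name__ == "__main__":
    for t, G, I in simulate()[::100]:              # roughly once per simulated hour
        print(f"t={t:5.2f} h  glucose={G:5.2f}  insulin={I:5.2f}")
```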