
    Bounded Model Checking of Industrial Code

    Bounded Model Checking (BMC) is an effective and precise static analysis technique that reduces program verification to satisfiability (SAT) solving. However, with few exceptions, BMC is not actively used in the software industry, especially when compared to dynamic analysis techniques such as fuzzing or to lightweight formal static analysis. This thesis describes our experience of applying BMC to industrial code using SEABMC, a novel BMC tool. We present three contributions. First, a case study of (re)verifying the aws-c-common library from AWS using SEABMC and KLEE. This study explores the methodology from the perspective of three research questions: (a) can proof artifacts be used across verification tools; (b) are there bugs in verified code; and (c) can specifications be improved? To study these questions, we port the verification tasks for the aws-c-common library to SEAHORN and KLEE. We show the benefits of using compiler semantics and of cross-checking specifications with different verification techniques, and we call for standardizing proof library extensions to increase specification reuse. Second, a description of SEABMC - a novel BMC engine for SEAHORN. We start with a custom IR (called SEA-IR) that purifies all memory operations by making the dependencies between them explicit. We then run program transformations and allow for generating many different styles of verification conditions. To support memory safety checking, we extend our base approach with fat pointers and shadow bits of memory to keep track of metadata, such as the size of the pointed-to object. To evaluate SEABMC, we use the aws-c-common library from AWS as a benchmark and compare with CBMC, SMACK, and KLEE. We show that SEABMC provides an order of magnitude improvement over the state of the art. Third, a case study of extending SEABMC to work with Rust - a young systems programming language. We ask three research questions: (a) can SEABMC be used to verify Rust programs easily; (b) can the specification style of aws-c-common be applied successfully to Rust programs; and (c) can verification become more efficient when using higher-level language information? We answer these questions by verifying aspects of the Rust standard library using SEAURCHIN, an extension of SEABMC for Rust.
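
    The fat-pointer idea mentioned above can be illustrated with a small, self-contained sketch (in Python, purely for exposition; SEABMC itself works on LLVM-level SEA-IR, and all names below are hypothetical): each pointer carries the base and size of its object, and every access is guarded by an explicit bounds assertion that a BMC engine can discharge symbolically.

        from dataclasses import dataclass

        @dataclass
        class FatPtr:
            base: int     # start address of the pointed-to object
            size: int     # object size in bytes (the shadow metadata)
            offset: int   # current offset of the pointer within the object

            def add(self, n: int) -> "FatPtr":
                # Pointer arithmetic preserves the metadata.
                return FatPtr(self.base, self.size, self.offset + n)

        def check_deref(p: FatPtr, nbytes: int) -> None:
            # The memory-safety verification condition: the access must lie within the object.
            assert 0 <= p.offset and p.offset + nbytes <= p.size, "out-of-bounds access"

        # Usage: an 8-byte buffer is safe to read at offset 7, but not one byte past the end.
        buf = FatPtr(base=0x1000, size=8, offset=0)
        check_deref(buf.add(7), 1)    # ok
        # check_deref(buf.add(8), 1)  # would fail the bounds assertion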

    Prioritized Unit Propagation and Extended Resolution Techniques for SAT Solvers

    NP-complete problems like the Boolean Satisfiability (SAT) problem are ubiquitous in computer science, mathematics, and engineering. Consequently, researchers have developed algorithms such as Conflict-Driven Clause-Learning (CDCL) SAT solvers, aimed at determining the satisfiability of Boolean formulas. As a result of decades of research, CDCL SAT solvers solve real-life SAT instances surprisingly quickly, performing well despite the fact that the SAT problem is believed to be intractable in general. While modern CDCL SAT solvers are efficient for many real-world applications, there is continual demand for ever more powerful heuristics for newer applications, and this demand in turn provides the impetus for research in solver heuristics. In this thesis, we address this need by proposing a new heuristic for Boolean Constraint Propagation (BCP), a key component of CDCL SAT solvers, and a novel, extensible architectural design for an Extended Resolution (ER) SAT solver, a class of solvers more powerful than CDCL solvers. The impressive performance of CDCL SAT solvers on real-life Boolean instances is, in part, made possible by a combination of logical reasoning rules and heuristics integrated into different components of the solvers. Given that such combinations are currently the most successful paradigm in SAT solving, it is natural to ask how they can be made even more efficient. We observe that there are two approaches to improving SAT solvers: modifying individual components within the SAT solving algorithm, or changing the overall structure of the algorithm. We explore both approaches in this thesis. Following the first approach, we examine a critical component of CDCL: the Boolean Constraint Propagation (BCP) algorithm, which systematically finds implications of variable assignments made by the solver. In most implementations of BCP, variable values are propagated greedily: the values of implied variables are set immediately after they are detected. This observation suggests that there could be a smarter way to perform BCP by prioritizing part of the search space rather than propagating implied variables as soon as they are encountered. In this work, we develop an algorithm that allows BCP to prioritize propagations, we choose a heuristic priority ordering of the variables, and we demonstrate a class of instances where our prioritized BCP algorithm, combined with this heuristic ordering, outperforms the traditional BCP algorithm. For the second approach, we note that solvers are fundamentally mathematical proof systems and that CDCL produces proofs in the Resolution proof system, which is theoretically weaker than Extended Resolution (ER), a related proof system. Hence, it is natural to try integrating ER techniques into the CDCL algorithm to render it more powerful. However, it is well known that automating the ER proof system deterministically can be very challenging. Instead of proposing a single set of techniques to implement the ER proof system, we develop a programmatic framework (and an associated set of techniques) that enables one to upgrade a CDCL solver into an ER-based SAT solver. More precisely, we add three new major programmatic components: extension variable addition, extension variable substitution, and extension variable deletion. These components can be easily extended to test various ER ideas and heuristics. One of the heuristics we consider is shown to be generally competitive with the baseline CDCL solver while improving upon the baseline for a specific class of cryptographic instances.
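
    As a rough illustration of prioritized propagation (a minimal Python sketch under simplifying assumptions, not the thesis's solver code; the clause representation and priority function are placeholders): pending implications are drawn from a priority queue keyed by a per-variable score instead of the usual FIFO queue, so the order in which implied variables are assigned can be steered by a heuristic.

        import heapq

        def prioritized_bcp(clauses, trail, priority):
            # clauses: list of lists of DIMACS-style literals, e.g. [[-1, 2], [-2, 3]].
            # trail: literals already decided; input unit clauses are assumed to be on it.
            # priority: dict variable -> score; higher scores are propagated first.
            # Returns a dict variable -> bool with all implied assignments, or None on conflict.
            assignment, heap = {}, []

            def value(lit):
                v = assignment.get(abs(lit))
                return None if v is None else (v if lit > 0 else not v)

            def enqueue(lit):
                var, val = abs(lit), lit > 0
                if var in assignment:
                    return assignment[var] == val          # conflict if already set oppositely
                assignment[var] = val
                heapq.heappush(heap, (-priority.get(var, 0.0), lit))
                return True

            for lit in trail:
                if not enqueue(lit):
                    return None

            while heap:
                _, lit = heapq.heappop(heap)               # highest-priority pending literal
                for clause in clauses:
                    if -lit not in clause:                 # only these clauses can become unit now
                        continue
                    if any(value(l) is True for l in clause):
                        continue                           # clause already satisfied
                    free = [l for l in clause if value(l) is None]
                    if not free:
                        return None                        # clause falsified: conflict
                    if len(free) == 1 and not enqueue(free[0]):   # unit clause: propagate
                        return None
            return assignment

        # Example: deciding 1 with clauses (-1 v 2) and (-2 v 3) implies 2 and then 3.
        print(prioritized_bcp([[-1, 2], [-2, 3]], [1], {2: 1.0, 3: 5.0}))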

    Formal Methods for Autonomous Systems

    Formal methods refer to rigorous, mathematical approaches to system development and have played a key role in establishing the correctness of safety-critical systems. The main building blocks of formal methods are models and specifications, which are analogous to behaviors and requirements in system design and give us the means to verify and synthesize system behaviors with formal guarantees. This monograph surveys the current state of the art in applications of formal methods in the autonomous systems domain. We consider correct-by-construction synthesis under various formulations, including closed systems, reactive, and probabilistic settings. Beyond synthesizing systems in known environments, we address uncertainty and use formal methods to bound the behavior of systems that employ learning. Further, we examine the synthesis of systems with monitoring, a mitigation technique for ensuring that once a system deviates from expected behavior, it has a way of returning to normal operation. We also show how learning can overcome some limitations of formal methods themselves. We conclude with future directions for formal methods in reinforcement learning, uncertainty, privacy, explainability of formal methods, and regulation and certification.
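
    A toy sketch of the monitoring idea (illustrative only; the surveyed techniques are far more general, and all names below are hypothetical): a monitor checks each state against a safety predicate and hands control to a recovery controller whenever the predicate is violated, returning to the nominal controller once the system is back in the safe set.

        def monitored_run(step, nominal_ctrl, recovery_ctrl, is_safe, x0, horizon):
            # step(x, u) -> next state; nominal_ctrl / recovery_ctrl map a state to an input;
            # is_safe(x) is the monitored predicate; returns the visited state trajectory.
            x, trace = x0, [x0]
            for _ in range(horizon):
                ctrl = nominal_ctrl if is_safe(x) else recovery_ctrl   # the mitigation switch
                x = step(x, ctrl(x))
                trace.append(x)
            return trace

        # Example: a scalar system drifting toward the unsafe region x >= 1.0;
        # the recovery controller pushes it back into the safe set.
        trace = monitored_run(
            step=lambda x, u: x + u,
            nominal_ctrl=lambda x: 0.2,
            recovery_ctrl=lambda x: -0.5,
            is_safe=lambda x: x < 1.0,
            x0=0.0,
            horizon=10,
        )
        print(trace)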

    Understanding and Enhancing CDCL-based SAT Solvers

    Modern conflict-driven clause-learning (CDCL) Boolean satisfiability (SAT) solvers routinely solve formulas from industrial domains with millions of variables and clauses, despite the Boolean satisfiability problem being NP-complete and widely regarded as intractable in general. At the same time, very small crafted or randomly generated formulas are often infeasible for CDCL solvers. A commonly proposed explanation is that these solvers somehow exploit the underlying structure inherent in industrial instances. A better understanding of the structure of Boolean formulas not only enables improvements to modern SAT solvers, but also lends insight into why solvers perform well or poorly on certain types of instances. Further, examining solvers through the lens of these underlying structures can help distinguish the behavior of different solving heuristics, both in theory and in practice. The first issue we address relates to the representation of SAT formulas. A given Boolean satisfiability problem can be represented in arbitrarily many ways, and the type of encoding can have significant effects on SAT solver performance. Further, in some cases a direct encoding to SAT may not be the best choice. We introduce a new system that integrates SAT solving with computer algebra systems (CAS) to address representation issues for several graph-theoretic problems, and we use this system to improve the bounds on several finitely verified conjectures related to these problems. We demonstrate how our approach is more appropriate for these problems than other off-the-shelf SAT-based tools. For more typical SAT formulas, a better understanding of their underlying structural properties, and of how they relate to SAT solving, can deepen our understanding of SAT. We perform a large-scale evaluation of many popular structural measures of formulas, such as community structure, treewidth, and backdoors. We investigate how these parameters correlate with CDCL solving time and whether they can effectively be used to distinguish formulas from different domains. We demonstrate how these measures can be used to understand the behavior of solvers during search. A common theme is that the solver exhibits locality during search when viewed through the lens of these underlying structures, and that the choice of solving heuristic can greatly influence this locality. We posit that this local behavior of modern SAT solvers is crucial to their performance. The remaining contributions dive deeper into two new measures of SAT formulas. We first consider a simple measure, denoted “mergeability,” which characterizes the proportion of input clause pairs that can resolve and merge. We develop a formula generator that takes a seed formula as input and creates a sequence of increasingly mergeable formulas while maintaining many of the properties of the original formula. Experiments over randomly generated industrial-like instances suggest that mergeability is strongly negatively correlated with CDCL solving time, i.e., as the mergeability of formulas increases, the solving time decreases, particularly for unsatisfiable instances. Our final contribution considers whether one of the aforementioned measures, namely backdoor size, is influenced by solver heuristics in theory. Starting from the notion of learning-sensitive (LS) backdoors, we consider various extensions of LS backdoors that incorporate different branching heuristics and restart policies. We introduce learning-sensitive with restarts (LSR) backdoors and show that, when backjumping is disallowed, LSR backdoors may be exponentially smaller than LS backdoors. We further demonstrate that the size of LSR backdoors depends on the learning scheme used during search. Finally, we present new algorithms to compute upper bounds on LSR backdoors that intrinsically rely upon restarts and can be computed with a single run of a SAT solver. We empirically demonstrate that this often produces smaller backdoors than previous approaches to computing LS backdoors.
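
    The mergeability measure can be sketched directly from its description above (a minimal Python sketch; the thesis's exact definition and normalization may differ): a clause pair is counted if it resolves on exactly one complementary literal pair and additionally shares a common literal, and the score is the fraction of such pairs among all input clause pairs.

        from itertools import combinations

        def mergeability(clauses):
            # clauses: list of sets of DIMACS-style literals, e.g. {1, -2, 3}.
            # A pair "resolves" if exactly one literal of one clause appears negated in
            # the other, and "merges" if the pair additionally shares a common literal.
            merging, total = 0, 0
            for c1, c2 in combinations(clauses, 2):
                total += 1
                clashes = {l for l in c1 if -l in c2}       # complementary literal pairs
                if len(clashes) == 1 and (c1 & c2):         # resolvable and sharing a literal
                    merging += 1
            return merging / total if total else 0.0

        # Example: {1, 2} and {-1, 2, 3} resolve on variable 1 and merge on the shared literal 2.
        print(mergeability([{1, 2}, {-1, 2, 3}, {3, 4}]))   # 1 merging pair out of 3 -> 0.333...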

    Motion planning and control: a formal methods approach

    Control of complex systems satisfying rich temporal specifications has become an increasingly important research area in fields such as robotics, control, automotive, and manufacturing. Popular specification languages include temporal logics, such as Linear Temporal Logic (LTL) and Computation Tree Logic (CTL), which extend propositional logic to capture the temporal sequencing of system properties. The focus of this dissertation is on the control of high-dimensional systems and on timed specifications that impose explicit time bounds on the satisfaction of tasks. This work proposes and evaluates methods and algorithms for synthesizing provably correct control policies that address the resulting scalability problems. Ideas and tools from formal verification, graph theory, and incremental computing are used to synthesize satisfying control strategies. Finite abstractions of the systems are generated and then composed with automata encoding the specifications. The first part of this dissertation introduces a sampling-based motion planning algorithm that combines long-term temporal logic goals with short-term reactive requirements. The specification has two parts: (1) a global specification given as an LTL formula over a set of static service requests that occur at the regions of a known environment, and (2) a local specification that requires servicing a set of dynamic requests that can be sensed locally during execution. The proposed computational framework consists of two main ingredients: (a) an off-line sampling-based algorithm for the construction of a global transition system that contains a path satisfying the LTL formula, and (b) an on-line sampling-based algorithm that generates paths servicing the local requests while ensuring that satisfaction of the global specification is not affected. The second part of the dissertation focuses on stochastic systems with temporal and uncertainty constraints. A specification language called Gaussian Distribution Temporal Logic is introduced as an extension of Boolean logic that incorporates temporal evolution and noise mitigation directly into the task specifications. A sampling-based algorithm to synthesize control policies is presented that generates a transition system in the belief space and uses local feedback controllers to break the curse of history associated with belief space planning. Switching control policies are then computed using a product Markov Decision Process between the transition system and the Rabin automaton encoding the specification. The approach is evaluated in experiments using a camera network and a ground robot. The third part of this dissertation focuses on control of multi-vehicle systems with timed specifications and charging constraints. An expressive language called Time Window Temporal Logic (TWTL) for describing time-bounded specifications is introduced, and the temporal relaxation of TWTL formulae with respect to task deadlines is discussed. The key ingredient of the solution is an algorithm that translates a TWTL formula into an annotated finite state automaton encoding all possible temporal relaxations of the given formula. The annotated automata are composed with transition systems encoding the motion of all vehicles, and with charging models, to produce control strategies for all vehicles such that the overall system satisfies the mission specification. The methods are evaluated in simulation and in experimental trials with quadrotors and charging stations.
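
    The recurring composition step, building a product of a finite abstraction with a specification automaton, can be sketched in a few lines (a simplified Python illustration, not the dissertation's implementation; a deterministic automaton, reachability-style acceptance, and all identifiers are assumptions): product states pair a system state with an automaton state, transitions are synchronized through the labels the automaton reads, and reachable accepting product states witness satisfying runs.

        from collections import deque

        def product(ts_trans, ts_label, aut_trans, ts_init, aut_init, aut_accept):
            # ts_trans: dict mapping a system state to its successor states.
            # ts_label: dict mapping a system state to the label the automaton reads (here a string).
            # aut_trans: dict mapping (automaton state, label) to the next automaton state.
            # Returns the reachable product states and the accepting ones among them.
            init = (ts_init, aut_trans[(aut_init, ts_label[ts_init])])
            seen, queue, accepting = {init}, deque([init]), set()
            while queue:
                s, q = queue.popleft()
                if q in aut_accept:
                    accepting.add((s, q))                   # witnesses a satisfying finite run
                for s2 in ts_trans.get(s, ()):
                    q2 = aut_trans.get((q, ts_label[s2]))   # synchronize on the label of s2
                    if q2 is not None and (s2, q2) not in seen:
                        seen.add((s2, q2))
                        queue.append((s2, q2))
            return seen, accepting

        # Example: reach a "goal"-labeled state; the automaton moves q0 -> q1 on "goal" and accepts in q1.
        ts_trans = {"A": ["B"], "B": ["C"], "C": []}
        ts_label = {"A": "none", "B": "none", "C": "goal"}
        aut_trans = {("q0", "none"): "q0", ("q0", "goal"): "q1",
                     ("q1", "none"): "q1", ("q1", "goal"): "q1"}
        print(product(ts_trans, ts_label, aut_trans, "A", "q0", {"q1"}))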

    VLSI Design

    This book presents some recent advances in the design of nanometer VLSI chips. The selected topics cover open problems and challenges in areas ranging from design tools, new post-silicon devices, and GPU-based parallel computing to emerging 3D integration and antenna design. The book consists of two parts, with chapters such as: VLSI design for multi-sensor smart systems on a chip, Three-dimensional integrated circuits design for thousand-core processors, Parallel symbolic analysis of large analog circuits on GPU platforms, Algorithms for CAD tools VLSI design, A multilevel memetic algorithm for large SAT-encoded problems, and others.

    Fifteenth Biennial Status Report: March 2019 - February 2021
