48 research outputs found

    Boolean Satisfiability in Electronic Design Automation

    No full text
    Boolean Satisfiability (SAT) is often used as the underlying model for a significant and increasing number of applications in Electronic Design Automation (EDA), as well as in many other fields of Computer Science and Engineering. In recent years, new and efficient algorithms for SAT have been developed, allowing much larger problem instances to be solved. SAT “packages” are currently expected to have an impact on EDA applications similar to that of BDD packages since their introduction more than a decade ago. This tutorial paper is aimed at introducing the EDA professional to the Boolean satisfiability problem. Specifically, we highlight the use of SAT models to formulate a number of EDA problems in such diverse areas as test pattern generation, circuit delay computation, logic optimization, combinational equivalence checking, bounded model checking and functional test vector generation, among others. In addition, we provide an overview of the algorithmic techniques commonly used for solving SAT, including those that have seen widespread use in specific EDA applications. We categorize these algorithmic techniques, indicating which have been shown to be best suited for which tasks.
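
    As a miniature illustration of the combinational equivalence checking formulation surveyed above, the Python sketch below builds a miter between two circuits that should be equivalent, Tseitin-encodes it into CNF, and hands the clauses to a toy DPLL search; an unsatisfiable miter means the circuits agree on all inputs. The gate encodings, variable numbering and naive solver are illustrative choices, not taken from the paper.

        # Miter-based equivalence check (illustrative; variable numbering is ours):
        # circuit 1: f1 = a AND b; circuit 2: f2 = NOT(NOT a OR NOT b).
        # DIMACS-style literals: a=1, b=2, f1=3, na=4, nb=5, o=6, f2=7, miter m=8.

        def tseitin_and(out, x, y):
            # CNF for out <-> (x AND y)
            return [[-out, x], [-out, y], [out, -x, -y]]

        def tseitin_not(out, x):
            # CNF for out <-> NOT x
            return [[-out, -x], [out, x]]

        def tseitin_or(out, x, y):
            # CNF for out <-> (x OR y)
            return [[-out, x, y], [out, -x], [out, -y]]

        def tseitin_xor(out, x, y):
            # CNF for out <-> (x XOR y): the miter gate
            return [[-out, x, y], [-out, -x, -y], [out, x, -y], [out, -x, y]]

        def dpll(clauses):
            """Naive DPLL: returns a model {var: bool} or None if unsatisfiable."""
            if not clauses:
                return {}                    # every clause satisfied
            if any(len(c) == 0 for c in clauses):
                return None                  # conflict: empty clause
            lit = clauses[0][0]              # branch on the first literal seen
            for choice in (lit, -lit):
                reduced = [[l for l in c if l != -choice]
                           for c in clauses if choice not in c]
                model = dpll(reduced)
                if model is not None:
                    model[abs(choice)] = choice > 0
                    return model
            return None

        cnf  = tseitin_and(3, 1, 2)                      # f1 = a AND b
        cnf += tseitin_not(4, 1) + tseitin_not(5, 2)     # na, nb
        cnf += tseitin_or(6, 4, 5) + tseitin_not(7, 6)   # f2 = NOT(na OR nb)
        cnf += tseitin_xor(8, 3, 7) + [[8]]              # assert the outputs differ

        print("equivalent" if dpll(cnf) is None else "NOT equivalent")

    Production solvers add unit propagation, clause learning and branching heuristics on top of this skeleton; those are the algorithmic techniques the tutorial categorizes.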

    Coverage of Compositional Property Sets for Hardware and Hardware-dependent Software in Formal System-on-Chip Verification

    Get PDF
    Divide-and-conquer is a common strategy to manage the complexity of system design and verification. In the context of System-on-Chip (SoC) design verification, an SoC system is decomposed into several modules and every module is verified separately. Usually an SoC module is reactive: it interacts with its environment modules. This interaction is normally modeled by environment constraints, which are applied when verifying the SoC module. Environment constraints are assumed to be always true when verifying the individual modules of a system, so the correctness of environment constraints is very important for module verification. Environment constraints are also very important for coverage analysis. Coverage analysis in formal verification measures whether or not the property set fully describes the functional behavior of the design under verification (DuV). If a set of properties describes every functional behavior of a DuV, the set of properties is called complete. To verify the correctness of environment constraints, assume-guarantee reasoning rules can be employed. However, state-of-the-art assume-guarantee reasoning rules cannot be applied to environment constraints specified in an industrial standard property language such as SystemVerilog Assertions (SVA). This thesis proposes a new assume-guarantee reasoning rule that can be applied to environment constraints specified in a property language such as SVA. In addition, this thesis proposes two efficient plausibility checks for constraints that can be conducted without a concrete implementation of the considered environment. Furthermore, this thesis provides a compositional reasoning framework determining that a system is completely verified if all modules are verified with Complete Interval Property Checking (C-IPC) under environment constraints. At present, there is a trend of shifting more of the functionality in SoCs from the hardware to the hardware-dependent software (HWDS), which is a crucial component in an SoC, since other software layers, such as the operating system, are built on it. Therefore, there is an increasing need to apply formal verification to HWDS, especially for safety-critical systems. The interactions between HW and HWDS are often reactive and happen in a temporal order. This requires new property languages to specify the reactive behavior at the HW and SW interfaces. This thesis introduces a new property language, called Reactive Software Property Language (RSPL), to specify the reactive interactions between the HW and the HWDS. Furthermore, a method for checking the completeness of software properties specified in RSPL is presented in this thesis. This method is motivated by the approach of checking the completeness of hardware properties.
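
    A minimal sketch of the plausibility-checking idea above: before an environment constraint is used as an assumption, one can test that at least one bounded input trace satisfies it, so the assumption is not self-contradictory (a contradictory assumption would make every module vacuously "verified"). The trace encoding and the example handshake constraint below are invented for illustration and are far simpler than SVA.

        from itertools import product

        def constraint_holds(trace):
            # Invented environment constraint: once req is raised, it must
            # stay high until ack is observed.
            pending = False
            for req, ack in trace:
                if pending and not req and not ack:
                    return False             # req dropped before ack
                pending = (pending or req) and not ack
            return True

        def plausible(constraint, depth):
            # Plausibility without any implementation: does SOME bounded
            # trace satisfy the constraint? If none does, it is contradictory.
            steps = list(product([False, True], repeat=2))   # (req, ack) pairs
            return any(constraint(t) for t in product(steps, repeat=depth))

        print(plausible(constraint_holds, depth=4))          # True: satisfiable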

    New Data Structures and Algorithms for Logic Synthesis and Verification

    Get PDF
    The strong interaction between Electronic Design Automation (EDA) tools and Complementary Metal-Oxide Semiconductor (CMOS) technology contributed substantially to the advancement of modern digital electronics. The continuous downscaling of CMOS Field Effect Transistor (FET) dimensions enabled the semiconductor industry to fabricate digital systems with higher circuit density at reduced costs. To keep pace with technology, EDA tools are challenged to handle both digital designs with growing functionality and device models of increasing complexity. Nevertheless, whereas the downscaling of CMOS technology requires ever more complex physical design models, the logic abstraction of a transistor as a switch has not changed even with the introduction of 3D FinFET technology. As a consequence, modern EDA tools are fine-tuned for CMOS technology, and the underlying design methodologies are based on CMOS logic primitives, i.e., negative unate logic functions. While it is clear that CMOS logic primitives will be the ultimate building blocks for digital systems in the next ten years, there is no evidence that CMOS logic primitives are also the optimal basis for EDA software. In EDA, the efficiency of methods and tools is measured by different metrics, such as (i) the result quality, for example the performance of a digital circuit, (ii) the runtime, and (iii) the memory footprint on the host computer. With the aim of optimizing these metrics, adherence to a specific logic model is no longer important. Indeed, the key to the success of an EDA technique is the expressive power of the logic primitives used to handle and solve the problem, which determines the capability to reach better metrics. In this thesis, we investigate new logic primitives for electronic design automation tools. We improve the efficiency of logic representation, manipulation and optimization tasks by taking advantage of majority and biconditional logic primitives. We develop synthesis tools exploiting majority and biconditional expressiveness. Our tools show strong results compared to state-of-the-art academic and commercial synthesis tools; indeed, we produce the best results for several public benchmarks. On top of the enhanced synthesis power, our methods are the natural and native logic abstraction for circuit design in emerging nanotechnologies, where majority and biconditional logic are the primitive gates for physical implementation. We accelerate formal methods by (i) studying properties of logic circuits and (ii) developing new frameworks for logic reasoning engines. We prove non-trivial dualities for the property checking problem in logic circuits. Our findings enable sensible speed-ups in solving circuit satisfiability. We develop an alternative Boolean satisfiability framework based on majority functions. We prove that the general problem is still intractable, but we show practical restrictions that can be solved efficiently. Finally, we focus on reversible logic, where we propose a new equivalence checking approach. We exploit the invertibility of computation and the functionality of reversible gates in the formulation of the problem. This enables a one-order-of-magnitude speed-up compared to the state-of-the-art solution. We argue that new approaches to solving EDA problems are necessary, as we have reached a point of technology where keeping pace with design goals is tougher than ever.
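
    To make the majority primitive concrete, the sketch below verifies by exhaustive truth-table checking that AND and OR are special cases of the three-input majority function, and checks the identity M(x, x, y) = x that majority-based rewriting relies on. This is a generic illustration of majority and biconditional logic, not code from the thesis.

        from itertools import product

        def maj(x, y, z):
            # Three-input majority: true iff at least two inputs are true
            return (x and y) or (x and z) or (y and z)

        def xnor(x, y):
            # Biconditional primitive: true iff x and y agree
            return x == y

        for a, b in product([False, True], repeat=2):
            assert maj(a, b, False) == (a and b)    # M(a, b, 0) = AND(a, b)
            assert maj(a, b, True)  == (a or b)     # M(a, b, 1) = OR(a, b)
            assert xnor(a, b) == (not (a != b))     # biconditional = NOT XOR

        for x, y in product([False, True], repeat=2):
            assert maj(x, x, y) == x                # majority identity M(x, x, y) = x

        print("all majority/biconditional identities verified")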

    Doctor of Philosophy

    Get PDF
    With the spread of the internet and mobile devices, transferring information safely and securely has become more important than ever. Finite fields have widespread applications in such domains, including cryptography and error-correcting codes, among many others. In most finite field applications, the field size, and therefore the bit-width of the operands, can be very large. The high complexity of arithmetic operations over such large fields requires circuits to be (semi-)custom designed. This raises the potential for errors/bugs in the implementation, which can be maliciously exploited and can compromise the security of such systems. Formal verification of finite field arithmetic circuits has therefore become an imperative. This dissertation targets the problem of formal verification of hardware implementations of combinational arithmetic circuits over finite fields of the type F_2^k. Two specific problems are addressed: i) verifying the correctness of a custom-designed arithmetic circuit implementation against a given word-level polynomial specification over F_2^k; and ii) gate-level equivalence checking of two different arithmetic circuit implementations. This dissertation proposes polynomial abstractions over finite fields to model and represent the circuit constraints. Subsequently, decision procedures based on modern computer algebra techniques, notably Gröbner bases-related theory and technology, are engineered to solve the verification problem efficiently. The arithmetic circuit is modeled as a polynomial system in the ring F_2^k[x1, x2, ..., xd], and computer algebra-based results (Hilbert's Nullstellensatz) over finite fields are exploited for verification. Using our approach, experiments are performed on a variety of custom-designed finite field arithmetic benchmark circuits. The results are also compared against contemporary methods based on SAT and SMT solvers, BDDs, and AIG-based methods. Our tools can verify the correctness of, and detect bugs in, up to 163-bit circuits in F_2^163, whereas contemporary approaches are infeasible beyond 48-bit circuits.
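
    A toy version of this Gröbner-basis flow can be reproduced with SymPy (an assumption for illustration; the dissertation engineers its own decision procedures). Each gate of a four-NAND XOR circuit becomes a polynomial over GF(2), field polynomials v^2 + v restrict variables to Boolean values, and membership of the specification polynomial in the circuit ideal certifies correctness.

        from sympy import symbols, groebner

        a, b, n1, n2, n3, s = symbols('a b n1 n2 n3 s')

        # XOR from four NANDs; over GF(2), NAND(x, y) = 1 + x*y.
        gates = [
            n1 + 1 + a*b,      # n1 = NAND(a, b)
            n2 + 1 + a*n1,     # n2 = NAND(a, n1)
            n3 + 1 + b*n1,     # n3 = NAND(b, n1)
            s  + 1 + n2*n3,    # s  = NAND(n2, n3)
        ]
        fields = [v**2 + v for v in (a, b, n1, n2, n3, s)]   # Boolean-value polynomials

        # Lex order with internal signals before inputs lets reduction
        # rewrite every gate output in terms of a and b alone.
        G = groebner(gates + fields, s, n3, n2, n1, a, b, order='lex', modulus=2)

        # Specification: s = a XOR b, i.e. s + a + b lies in the circuit ideal.
        _, remainder = G.reduce(s + a + b)
        print("circuit matches spec" if remainder == 0 else "bug found")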

    Sciduction: Combining Induction, Deduction, and Structure for Verification and Synthesis

    Full text link
    Even with impressive advances in automated formal methods, certain problems in system verification and synthesis remain challenging. Examples include the verification of quantitative properties of software involving constraints on timing and energy consumption, and the automatic synthesis of systems from specifications. The major challenges include environment modeling, incompleteness in specifications, and the complexity of underlying decision problems. This position paper proposes sciduction, an approach to tackle these challenges by integrating inductive inference, deductive reasoning, and structure hypotheses. Deductive reasoning, which leads from general rules or concepts to conclusions about specific problem instances, includes techniques such as logical inference and constraint solving. Inductive inference, which generalizes from specific instances to yield a concept, includes algorithmic learning from examples. Structure hypotheses are used to define the class of artifacts, such as invariants or program fragments, generated during verification or synthesis. Sciduction constrains inductive and deductive reasoning using structure hypotheses, and actively combines inductive and deductive reasoning: for instance, deductive techniques generate examples for learning, and inductive reasoning is used to guide the deductive engines. We illustrate this approach with three applications: (i) timing analysis of software, (ii) synthesis of loop-free programs, and (iii) controller synthesis for hybrid systems. Some future applications are also discussed.
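
    The interplay described above is easiest to see in a counterexample-guided loop: an inductive learner proposes a candidate consistent with the examples collected so far, and a deductive verifier (here a brute-force checker standing in for a real decision procedure) either certifies the candidate or returns a counterexample. The affine candidate space, which plays the role of a structure hypothesis, and the specification below are invented for illustration.

        # Counterexample-guided loop in the spirit of sciduction (illustrative).
        DOMAIN = range(256)

        def spec(x):
            # The specification the synthesizer must match (hidden from the learner).
            return (2 * x + 3) % 256

        # Structure hypothesis: candidates are affine maps x -> (a*x + b) mod 256.
        CANDIDATES = [(a, b) for a in range(256) for b in range(256)]

        def run(cand, x):
            a, b = cand
            return (a * x + b) % 256

        def verify(cand):
            # Deductive step (brute force stands in for a decision procedure):
            # certify the candidate or produce a counterexample input.
            for x in DOMAIN:
                if run(cand, x) != spec(x):
                    return x
            return None

        examples = []                        # accumulated (input, output) pairs
        while True:
            # Inductive step: any candidate consistent with the examples so far.
            cand = next(c for c in CANDIDATES
                        if all(run(c, x) == y for x, y in examples))
            cex = verify(cand)
            if cex is None:
                print("synthesized: x -> (%d*x + %d) mod 256" % cand)
                break
            examples.append((cex, spec(cex)))   # learn from the counterexample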

    Doctor of Philosophy

    Get PDF
    Formal verification of hardware designs has become an essential component of the overall system design flow. Designs are generally modeled as finite state machines, on which property and equivalence checking problems are solved for verification. Reachability analysis forms the core of these techniques. However, the increasing size and complexity of circuits causes the state explosion problem. Abstraction is the key to tackling these scalability challenges. This dissertation presents new techniques for word-level abstraction with applications in sequential design verification. By bundling together k bit-level state variables into one word-level constraint expression, the state space is construed as the solutions (variety) of a set of polynomial constraints (ideal), modeled over the finite (Galois) field of 2^k elements. Subsequently, techniques from algebraic geometry, notably Gröbner basis theory and technology, are researched to perform reachability analysis and verification of sequential circuits. This approach adds a "word-level dimension" to state-space abstraction and verification, making the process more efficient. While algebraic geometry provides powerful abstraction and reasoning capabilities, the algorithms exhibit high computational complexity. In the dissertation, we show that by analyzing the constraints, it is possible to obtain more insight into the polynomial ideals, which can be exploited to overcome the complexity. Using our algorithm design and implementations, we demonstrate how to perform reachability analysis of finite-state machines purely at the word level. Using this concept, we perform scalable verification of sequential arithmetic circuits. As contemporary approaches make use of resolution proofs and unsatisfiable cores for state-space abstraction, we introduce the algebraic geometry analog of unsatisfiable cores and present algorithms to extract and refine unsatisfiable cores of polynomial ideals. Experiments are performed to demonstrate the efficacy of our approaches.
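
    Since reachability analysis is the core computation here, the sketch below shows the classic explicit-state least-fixpoint loop; the dissertation's contribution is, roughly, to perform the same image computation symbolically over GF(2^k) rather than state by state. The toy FSM is invented.

        # Explicit-state reachability fixpoint: the computation that the
        # dissertation lifts to the word level via polynomial ideals.
        # Toy FSM: a 3-bit counter that wraps from 5 to 0 (states 6, 7 unreachable).

        def image(states):
            return {(s + 1) % 6 for s in states}

        reached = frontier = {0}              # initial state
        while frontier:
            new = image(frontier) - reached   # one image computation
            reached |= new
            frontier = new

        print(sorted(reached))                # [0, 1, 2, 3, 4, 5]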

    Formal verification of automotive embedded UML designs

    Get PDF
    Software applications are increasingly dominating safety-critical domains, domains where the failure of any application could impact human lives. Software application safety was overlooked for quite some time, but more focus and attention are now directed to this area due to the exponential growth of software-embedded applications. Software systems have continuously faced challenges in managing the complexity associated with functional growth, flexibility of systems so that they can be easily modified, scalability of solutions across several product lines, quality and reliability of systems, and finally the ability to detect defects early in the design phases. AUTOSAR was established to develop open standards to address these challenges. ISO-26262, the automotive functional safety standard, aims to ensure the functional safety of automotive systems by providing requirements and processes that govern the software lifecycle. Each functional system needs to be classified in terms of safety goals, risks and an Automotive Safety Integrity Level (ASIL: A, B, C and D), with ASIL D denoting the most stringent safety level. As the risk of the system increases, the ASIL level increases, and the standard mandates more stringent methods to ensure safety. ISO-26262 mandates that systems classified as ASIL C and D utilize walkthrough, semi-formal verification, inspection, control flow analysis, data flow analysis, static code analysis and semantic code analysis techniques to verify software unit design and implementation. Ensuring software specification compliance via formal methods has remained an academic endeavor for quite some time. Several factors discourage the adoption of formal methods in industry, a major one being their complexity. Software specification compliance in automotive remains, for the most part, heavily dependent on traceability matrices, human-based reviews, and testing activities conducted either on actual production software or at simulation level. The ISO-26262 automotive safety standard recommends, although not strongly, using formal notations in automotive systems that exhibit high risk in case of failure, yet the industry still relies heavily on semi-formal notations such as UML. The use of semi-formal notations leaves specification compliance heavily dependent on manual processes and testing efforts. In this research, we propose a framework where UML finite state machines are compiled into formal notations, specification requirements are mapped into formal model theorems, and SAT/SMT solvers are utilized to validate implementation compliance with the specification. The framework allows semi-formal verification of AUTOSAR UML designs via an automated formal framework backbone, enabling automotive software to comply with the ISO-26262 ASIL C and D unit design and implementation formal verification guideline. Semi-formal UML finite state machines are automatically compiled into formal notations based on the Symbolic Analysis Laboratory (SAL) formal notation. Requirements are captured in the UML design and compiled automatically into theorems. Model checkers are run against the compiled formal model and theorems to detect counterexamples that violate the requirements in the UML model. Semi-formal verification of the design allows us to uncover issues that were previously detected only in the testing and production stages.
    The methodology is applied to several automotive systems to show how the framework automates the verification of UML-based designs, the de-facto standard for automotive systems design, based on an implicit formal methodology while hiding the complexity that has discouraged the industry from using formal methods. Additionally, the framework automates the ISO-26262 system design verification guideline, which would otherwise be verified via error-prone manual approaches.
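
    The model checking step of the proposed flow can be pictured with a small explicit-state check: a breadth-first search over the state machine's reachable states either proves a requirement or returns a shortest counterexample trace. The traffic-light machine and its deliberately violated requirement below are invented stand-ins for a compiled UML design and its theorems.

        from collections import deque

        # A state machine as it might look after compilation from UML.
        TRANSITIONS = {
            'Red':       ['RedYellow'],
            'RedYellow': ['Green'],
            'Green':     ['Yellow'],
            'Yellow':    ['Red', 'Green'],    # injected bug: Yellow may revert to Green
        }

        def violates(src, dst):
            # Requirement (theorem): Green is never entered directly from Yellow.
            return src == 'Yellow' and dst == 'Green'

        def check(initial):
            # BFS over reachable states: returns a shortest violating trace or None.
            queue, seen = deque([[initial]]), {initial}
            while queue:
                path = queue.popleft()
                for nxt in TRANSITIONS[path[-1]]:
                    if violates(path[-1], nxt):
                        return path + [nxt]
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(path + [nxt])
            return None

        cex = check('Red')
        if cex:
            print("counterexample:", " -> ".join(cex))
        else:
            print("requirement holds")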

    The verification of MDG algorithms in the HOL theorem prover

    Get PDF
    Formal verification of digital systems is achieved, today, using one of two main approaches: state exploration (mainly model checking and equivalence checking) or deductive reasoning (theorem proving). Indeed, the combination of the two approaches, state exploration and deductive reasoning, promises to overcome the limitations and enhance the capabilities of each. Our research is motivated by this goal. In this thesis, we provide the entire necessary infrastructure (data structures + algorithms) to define high-level state exploration in the HOL theorem prover, named the MDG-HOL platform. While related work has tackled the same problem by representing primitive Binary Decision Diagram (BDD) operations as inference rules added to the core of the theorem prover, we base our approach on Multiway Decision Graphs (MDGs). MDGs generalize ROBDDs to represent and manipulate a subset of first-order logic formulae. With MDGs, a data value is represented by a single variable of an abstract type, and operations on data are represented in terms of uninterpreted functions. Considering MDGs instead of BDDs raises the abstraction level of what can be verified using state exploration within a theorem prover. The MDG embedding is based on the logical formulation of an MDG as a Directed Formula (DF). The DF syntax is defined as HOL built-in data types. We formalize the basic MDG operations using this syntax within HOL, following a deep embedding approach; such an approach ensures the consistency of our embedding. Then, we derive the correctness proof for each basic MDG operator. Based on this platform, MDG reachability analysis is defined in HOL as a conversion that uses the MDG theory within HOL. We then demonstrate the effectiveness of our platform on four case studies. The results show that this verification framework offers a considerable gain in terms of automation without sacrificing CPU time and memory usage compared to automatic model checking tools. Finally, we propose a reduction technique to improve MDG model checking based on the MDG-HOL platform. The idea is to prune the transition relation of the circuits using pre-proved theorems and lemmas from the specification given at the system level. We also use the consistency of the specifications to verify whether the reduced model is faithful to the original one. We provide two case studies: the first is the SAT-MDG reduction of an Island Tunnel Controller, and the second is the MDG-HOL assume-guarantee reduction of the Look-Aside Interface. The results of our approach offer a considerable gain in terms of the correctness of heuristics and reduction techniques compared to commercial model checkers, though a small penalty is paid in terms of CPU time and memory usage.
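
    The decision-diagram machinery being embedded can be sketched in miniature: the hash-consed mk constructor and the Shannon-expansion apply below form the core of an ROBDD package, the structure that MDGs generalize by allowing abstract-typed variables and uninterpreted functions. This is a generic textbook construction, not the thesis's HOL embedding.

        # Minimal ROBDD kernel: hash-consed nodes plus Shannon-expansion apply.
        unique = {}                            # one node object per (var, low, high)

        def mk(var, low, high):
            if low == high:                    # redundant-test reduction rule
                return low
            key = (var, low, high)
            return unique.setdefault(key, key) # duplicate-node reduction rule

        def bddvar(i):
            return mk(i, 0, 1)                 # diagram for the single variable x_i

        def apply_op(op, u, v):
            # Combine two diagrams with a Boolean op given on the terminals 0/1.
            if u in (0, 1) and v in (0, 1):
                return op(u, v)
            uvar = u[0] if u not in (0, 1) else float('inf')
            vvar = v[0] if v not in (0, 1) else float('inf')
            top = min(uvar, vvar)              # expand on the top-most variable
            u0, u1 = (u[1], u[2]) if uvar == top else (u, u)
            v0, v1 = (v[1], v[2]) if vvar == top else (v, v)
            return mk(top, apply_op(op, u0, v0), apply_op(op, u1, v1))

        AND = lambda a, b: a & b
        OR  = lambda a, b: a | b
        NOT = lambda u: apply_op(lambda a, _: 1 - a, u, 0)

        x1, x2 = bddvar(1), bddvar(2)
        f = apply_op(AND, x1, x2)
        g = NOT(apply_op(OR, NOT(x1), NOT(x2)))   # De Morgan form of the same function
        print(f is g)                             # True: canonicity reduces equality to a pointer test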