Leveraging Groebner bases and SAT for hardware/software verification
Author affiliation: University of Utah
An infrastructure for RTL validation and verification
With the increase in size and complexity of digital designs, it has become imperative to address critical validation and verification issues at early stages of the design cycle. This requires robust, automated verification tools at higher (behavioural or register-transfer) levels of abstraction. This dissertation describes tools and techniques to assist validation and symbolic verification of high-level or RTL descriptions of digital designs. In particular, a comprehensive infrastructure has been developed that assists in: (i) validation of the descriptions via simulation, and (ii) their functional equivalence verification. A prototype system has been built around a hardware description language compiler to automate the process of validating and verifying RTL descriptions. The validation part of the infrastructure consists of Satisfiability (SAT) solvers based on Binary Decision Diagrams (BDDs) that automatically generate functional vectors to simulate the design. BDD-based SAT solvers suffer from the memory-explosion problem. To overcome this limitation, two SAT solvers have been developed that employ elements of the unate recursive paradigm to control the growth of BDD size while quickly searching for solutions. Experiments carried out over a wide range of designs, from random Boolean logic to regular array structures such as multipliers and shifters, demonstrate the robustness of these techniques. The verification part of the framework consists of equivalence-checking tools that can verify the equivalence of RTL descriptions of digital designs. RTL descriptions represent high-level computations in abstract, symbolic forms from which low-level (binary) details are difficult to extract; the implementation details of logic blocks are not always available. Contemporary canonical representations lack the scalability and versatility to represent RTL descriptions efficiently in compact form.
For this reason, a new representation called Taylor Expansion Diagrams (TEDs) has been developed to assist in functional equivalence verification of high-level descriptions of digital designs. TEDs are a compact, canonical, graph-based representation based upon a general non-binary decomposition principle using the Taylor series expansion. RTL computations are viewed as polynomials of finite degree, and TEDs are constructed for them. A set of reduction rules is applied to the diagram to make it canonical. TEDs can also represent word-level algebraic computations in abstract symbolic form, which allows the equivalence-checking problem for digital designs to be solved efficiently. The theoretical fundamentals behind TEDs are discussed and their efficient implementation is described. The robustness of the TED representation is analyzed by carrying out equivalence verification experiments over both equivalent and non-equivalent designs. It is shown that TEDs are exceptionally suitable for verifying large designs that contain not only algebraic (arithmetic) datapaths but also model their interaction with Boolean variables.
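The core idea behind the TED decomposition can be illustrated with a small sketch. This is not the dissertation's TED package: it shows only a single level of the Taylor split of a polynomial with respect to one variable (f = f0 + x·f1 + x²·f2 + …, where for polynomials the Taylor terms are exact coefficient polynomials), and it demonstrates how two syntactically different RTL computations of the same word-level function yield identical Taylor terms.

```python
# Sketch only (not the dissertation's implementation): a polynomial is a dict
# mapping exponent tuples (one position per variable) to integer coefficients.

def add(p, q):
    r = dict(p)
    for e, c in q.items():
        r[e] = r.get(e, 0) + c
        if r[e] == 0:
            del r[e]
    return r

def mul(p, q):
    r = {}
    for e1, c1 in p.items():
        for e2, c2 in q.items():
            e = tuple(a + b for a, b in zip(e1, e2))
            r[e] = r.get(e, 0) + c1 * c2
    return r

def taylor_decompose(poly, var):
    """One level of the TED decomposition with respect to `var`:
    f = f0 + x*f1 + x^2*f2 + ...; fk collects the coefficient of x^k."""
    terms = {}
    for exps, coef in poly.items():
        k = exps[var]
        rest = exps[:var] + (0,) + exps[var + 1:]
        sub = terms.setdefault(k, {})
        sub[rest] = sub.get(rest, 0) + coef
    return terms

# Variables (A, B, C) represented as exponent tuples.
A, B, C = {(1, 0, 0): 1}, {(0, 1, 0): 1}, {(0, 0, 1): 1}

# Two syntactically different RTL computations of the same function:
f1 = add(mul(A, C), mul(B, C))   # A*C + B*C
f2 = mul(add(A, B), C)           # (A+B)*C

same = taylor_decompose(f1, 0) == taylor_decompose(f2, 0)  # True
```

A full TED recurses this decomposition over an ordered list of variables and applies reduction rules to obtain a canonical DAG; the sketch shows only the top-level split with respect to A.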
Guiding CNF-SAT Search by Analyzing Constraint-Variable Dependencies and Clause Lengths
The type of decision strategy employed for CNF-SAT has a profound effect on the efficiency and performance of SAT engines. Over the years, a variety of decision heuristics have been proposed; each has its own achievements and limitations. This paper revisits the issue of decision heuristics and engineers a new approach that takes an integrated view of the overall problem structure. Our approach qualitatively analyzes clause-variable dependencies by accounting for variable/literal activity, clause connectivity, the distribution of variables among clauses of different lengths, and correlation among variables, to derive an initial static ordering for SAT search. To account for conflict clauses and their resolution, a corresponding dynamic variable-order update strategy is also presented. Quantitative metrics are proposed and used to devise an algorithmic approach to guide the overall SAT search. Experimental results demonstrate that our strategy significantly outperforms conventional approaches.
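The abstract does not give the paper's exact metrics, but the flavour of a combined clause-length/connectivity static ordering can be sketched. The scoring below is hypothetical: it uses a Jeroslow-Wang-style weight (short clauses contribute more, 2^-|c|) as the activity term, and the number of distinct variables a variable shares a clause with as the connectivity term.

```python
from collections import defaultdict

def static_order(clauses, alpha=1.0, beta=0.5):
    """Hypothetical static branching order combining clause-length-weighted
    activity with clause connectivity (alpha/beta are illustrative weights,
    not taken from the paper)."""
    activity = defaultdict(float)
    neighbours = defaultdict(set)
    for clause in clauses:
        weight = 2.0 ** -len(clause)        # short clauses count more
        vars_in = {abs(lit) for lit in clause}
        for v in vars_in:
            activity[v] += weight
            neighbours[v] |= vars_in - {v}  # connectivity via shared clauses
    score = {v: alpha * activity[v] + beta * len(neighbours[v]) for v in activity}
    return sorted(score, key=score.get, reverse=True)

# DIMACS-style clauses: integers are literals, the sign is the polarity.
cnf = [[1, 2], [-1, 3], [2, 3, 4], [-4]]
order = static_order(cnf)
# variables 2 and 3 (well connected and present in short clauses) lead;
# variable 1 ends up last
```

In a real solver such an order would seed the first decisions, with the dynamic update strategy taking over as conflict clauses accumulate.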
Dynamic Analysis of Constraint-Variable Dependencies to Guide SAT Diagnosis
An important aspect of the Boolean Satisfiability problem is to derive an ordering of variables such that branching on that order results in a faster, more efficient search. Contemporary techniques employ either variable-activity or clause-connectivity based heuristics, but not both, to guide the search. This paper advocates simultaneous analysis of variable activity and clause connectivity to derive an order for SAT search. Preliminary results demonstrate that the variable order derived by our approach can significantly expedite the search. As the search proceeds, the clause database is updated with added conflict clauses, so the variable activity and connectivity information changes dynamically. Our technique analyzes this information and re-computes the variable order whenever the search is restarted. Preliminary experiments show that such a dynamic analysis of constraint-variable relationships significantly improves the performance of SAT solvers. The analysis itself is very fast: its runtime is negligible (milliseconds) even for instances that contain a large number of variables and constraints. This paper presents preliminary experiments, analyzes the results, and comments on future research directions.
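The restart-time re-ordering step can be sketched as follows. The formula is hypothetical (the abstract does not give one): at each restart, every variable is re-scored over the full clause database, with learned (conflict) clauses bumped so that recently derived structure dominates the next branching order.

```python
from collections import defaultdict

def restart_order(original, learned, bump=2.0):
    """Hypothetical restart-time re-ordering: score variables over both the
    original and the learned clauses, weighting learned clauses by `bump` so
    conflict-derived structure steers the next search phase."""
    score = defaultdict(float)
    for weight, database in ((1.0, original), (bump, learned)):
        for clause in database:
            w = weight * 2.0 ** -len(clause)   # short clauses count more
            for lit in clause:
                score[abs(lit)] += w
    return sorted(score, key=score.get, reverse=True)

original = [[1, 2], [3, 4]]
learned = [[-3], [-4, 3]]          # conflict clauses added during search
order = restart_order(original, learned)
# variable 3 appears in both short learned clauses, so it now leads the order
```

The point of recomputing rather than incrementally decaying is that both activity and connectivity of the updated database are taken into account at once; as the abstract notes, this recomputation is cheap relative to the search itself.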