
    Synthesis and Optimization of Reversible Circuits - A Survey

    Full text link
    Reversible logic circuits have been historically motivated by theoretical research in low-power electronics as well as by practical improvements to bit-manipulation transforms in cryptography and computer graphics. Recently, reversible circuits have attracted interest as components of quantum algorithms, as well as in photonic and nano-computing technologies where some switching devices offer no signal gain. Research in generating reversible logic distinguishes between circuit synthesis, post-synthesis optimization, and technology mapping. In this survey, we review algorithmic paradigms (search-based, cycle-based, transformation-based, and BDD-based) as well as specific algorithms for reversible synthesis, both exact and heuristic. We conclude the survey by outlining key open challenges in the synthesis of reversible and quantum logic, as well as the most common misconceptions. Comment: 34 pages, 15 figures, 2 tables.
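    The transformation-based paradigm mentioned in this abstract can be illustrated with a short sketch. The following is a minimal, illustrative Python rendering of the basic idea (walking the truth table in input order and applying NOT/CNOT/Toffoli gates on the output side until the table becomes the identity); it is not code from the survey, and the function and variable names are assumptions.

```python
# Minimal sketch of transformation-based reversible synthesis (illustrative only).
# A gate is (controls_mask, target_bit): a multiple-controlled NOT.

def transformation_based_synthesis(perm, n):
    """perm: list of length 2**n giving the reversible function's truth table."""
    f = list(perm)
    gates = []

    def apply_gate(controls, target):
        # Apply the gate to every output value of the (evolving) truth table.
        for i, v in enumerate(f):
            if v & controls == controls:
                f[i] = v ^ (1 << target)
        gates.append((controls, target))

    for i in range(2 ** n):
        v = f[i]
        # Set bits that are 1 in i but 0 in v; controls = current 1-bits of v,
        # whose mask is >= i, so already-fixed rows are untouched.
        for j in range(n):
            if i & (1 << j) and not v & (1 << j):
                apply_gate(v, j)
                v |= 1 << j
        # Clear bits that are 1 in v but 0 in i; controls = the 1-bits of i.
        for j in range(n):
            if v & (1 << j) and not i & (1 << j):
                apply_gate(i, j)
                v &= ~(1 << j)
    # Gates were applied on the output side; reversing them yields the circuit.
    return list(reversed(gates))

if __name__ == "__main__":
    perm = [1, 0, 3, 2, 5, 7, 4, 6]          # an arbitrary 3-bit permutation
    circuit = transformation_based_synthesis(perm, 3)

    # Sanity check: simulate the returned circuit on every input.
    def run(circuit, x):
        for controls, target in circuit:
            if x & controls == controls:
                x ^= 1 << target
        return x
    assert all(run(circuit, x) == perm[x] for x in range(8))
    print(len(circuit), "gates:", circuit)
```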

    Design for testability method at register transfer level

    Get PDF
    The testing of sequential circuits is more complex than that of combinational circuits because it requires a sequence of vectors to detect a fault, and the test cost increases with the complexity of the sequential circuit-under-test (CUT). The design-for-testability (DFT) concept has therefore been introduced to reduce testing complexity, as well as to improve testing effectiveness and efficiency. The scan technique is one of the most widely used DFT methods; however, it incurs area overhead from the multiplexer added to each flip-flop and test-application-time overhead from shifting test patterns. This research is motivated to introduce a non-scan DFT method at the register transfer level (RTL) in order to reduce test cost. DFT at the RTL is performed based on functional information of the CUT and the connectivity of CUT registers; chaining one register to another is more effective in terms of area overhead and test application time. The first contribution of this work is a non-scan DFT method at the RTL that considers the controllability and observability information that can be extracted from the RTL description. Simulations show that, for most benchmark circuits, the proposed method achieves higher fault coverage of around 90%, shorter test application time, shorter test generation time, and a 10% reduction in area overhead compared to other methods in the literature. The second contribution of this work is a built-in self-test (BIST) method at the RTL that uses multiple input signature registers (MISRs) as BIST components instead of concurrent built-in logic block observers (CBILBOs); the MISRs are selected as test registers using an extended minimum feedback vertex set algorithm. This new BIST method lowers area overhead by about 32.9% while achieving fault coverage similar to the concurrent BIST method. The non-scan DFT insertion at the RTL is performed before logic synthesis, so testability violations can be fixed without repeating the logic synthesis process.
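    As a rough illustration of the MISR-based response compaction behind the proposed BIST method, the following sketch shows a generic multiple input signature register in Python; the register width, feedback taps, and function names are assumptions, not the thesis's implementation.

```python
# Illustrative MISR: an LFSR that folds the circuit-under-test's parallel
# outputs into its state each cycle, compacting the response stream into
# a single signature.  Width and feedback taps are assumptions.

def misr_signature(responses, width=8, taps=(7, 5, 4, 3)):
    """responses: iterable of `width`-bit output vectors, one per test cycle."""
    state = 0
    for r in responses:
        feedback = 0
        for t in taps:                     # XOR of the tapped state bits
            feedback ^= (state >> t) & 1
        state = ((state << 1) | feedback) & ((1 << width) - 1)
        state ^= r & ((1 << width) - 1)    # fold in the parallel outputs
    return state

# BIST-style usage: compare the compacted signature against the fault-free one.
if __name__ == "__main__":
    good = misr_signature([0x3A, 0x1F, 0x00, 0xE7])
    faulty = misr_signature([0x3A, 0x1F, 0x01, 0xE7])   # one flipped output bit
    print(hex(good), hex(faulty), "fault detected:", good != faulty)
```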

    A framework for fine-grain synthesis optimization of operational amplifiers

    Get PDF
    This thesis presents a cell-level framework for operational amplifier synthesis (OASYN) that couples circuit design and layout. For circuit design, the tool applies corner-driven optimization, accounting for on-chip performance variations: by exploring the process, voltage, and temperature variation space, it extracts the worst-case design solution. The tool applies sensitivity analysis along with Pareto-optimality to achieve the required specifications. For the layout phase, OASYN generates a DRC-proven automated layout based on a sized circuit-level description. Murata et al. (1996) introduced an elegant representation of block placement called the sequence pair (SP) for general floorplans. Like TCG and BSG, but unlike O-tree, B*-tree, and CBL, SP is P-admissible. Unlike SP, TCG supports incremental update during operation and keeps the information of the boundary modules, as well as their relative positions, in the representation. Block placement algorithms based on SP use heuristic optimization, e.g., simulated annealing, in which a large number of sequence pairs must be generated; a fast algorithm is therefore needed to generate sequence pairs after each solution perturbation. The thesis presents a new, simple, and efficient O(n)-runtime algorithm for fast incremental update of the cost evaluation. The algorithm integrates the advantages of the sequence pair and the transitive closure graph into TCG-S*, a superior topology-update scheme that facilitates the search for the desired optimum floorplan. Experiments show that TCG-S* is better than existing works in terms of area utilization and convergence speed. Routing-aware placement is implemented in OASYN, handling symmetry constraints (e.g., interdigitation and common centroid) along with congestion elimination and enhancement of placement routability.
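    To illustrate why each annealing move needs a fast sequence-pair evaluation, the sketch below shows the standard way a sequence pair is decoded into a placement (a simple O(n^2) pass, not the O(n) TCG-S* update proposed in the thesis); the module names and sizes are made up for the example.

```python
# Sequence-pair decoding sketch: module a is left of b if it precedes b in
# both sequences, and below b if it follows b in Gamma+ but precedes it in
# Gamma-.  Coordinates are accumulated as longest paths over those relations.

def evaluate_sequence_pair(gp, gn, sizes):
    """gp, gn: the sequence pair (lists of module names); sizes: {name: (w, h)}."""
    pos_p = {m: i for i, m in enumerate(gp)}
    pos_n = {m: i for i, m in enumerate(gn)}
    x = {m: 0 for m in gp}
    y = {m: 0 for m in gp}
    for b in gn:                       # Gamma- order: predecessors are final
        for a in gn:
            if pos_n[a] >= pos_n[b]:
                continue
            if pos_p[a] < pos_p[b]:    # a left of b
                x[b] = max(x[b], x[a] + sizes[a][0])
            else:                      # a below b
                y[b] = max(y[b], y[a] + sizes[a][1])
    width = max(x[m] + sizes[m][0] for m in gp)
    height = max(y[m] + sizes[m][1] for m in gp)
    return (x, y), width * height      # placement and bounding-box area

# Annealing-style usage: perturb the pair, re-evaluate, keep the smaller area.
if __name__ == "__main__":
    sizes = {"a": (4, 6), "b": (3, 7), "c": (5, 3), "d": (6, 4)}
    _, area = evaluate_sequence_pair(["a", "b", "c", "d"],
                                     ["c", "a", "d", "b"], sizes)
    print("bounding-box area:", area)
```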

    Submicron Systems Architecture Project: Semiannual Technical Report

    Get PDF
    No abstract available

    A SAT Based Test Generation Method for Delay Fault Testing of Macro Based Circuits

    Full text link

    Symbolic Model Checking of Concurrent Programs Using Partial Orders and On-the-Fly Transactions

    Full text link
    The state explosion problem is one of the core bottlenecks in the model checking of concurrent software. We show how to ameliorate the problem by combining the ability of partial order techniques to reduce the state space of the concurrent program with the power of symbolic model checking to explore large state spaces. Our new verification methodology involves translating the given concurrent program into a circuit-based model, which gives us the flexibility to then employ any model checking technique of choice, either SAT- or BDD-based, for verifying a broad range of linear time properties, not just safety. The reduction in the explored state space is obtained by statically augmenting the symbolic encoding of the program with additional constraints. These constraints restrict the scheduler to choose from a minimal conditional stubborn set of transitions at each state. Another key contribution of the paper is a new method for detecting transactions on-the-fly, which takes into account patterns of lock acquisition and yields better reductions than existing methods that rely on a lockset-based analysis. Moreover, unlike existing techniques, identifying on-the-fly transactions does not require the program to follow a lock discipline in accessing shared variables. We have applied our techniques to the Daisy test bench and shown the existence of several bugs.
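    The scheduler restriction described in this abstract is encoded symbolically in the paper; purely as an illustration of the underlying partial-order idea, the toy explicit-state sketch below restricts the scheduler to a single thread whenever that thread's next action is local. The thread programs, state encoding, and names are assumptions for illustration, not the paper's benchmark or encoding.

```python
# Toy partial-order reduction: if some thread's next action touches no shared
# variable, explore only that thread (a singleton "stubborn set") instead of
# every interleaving.  Each thread is a list of ("local", name) or
# ("write", shared_var) actions; a state is (program counters, written vars).

THREADS = [
    [("local", "t0_a"), ("write", "x"), ("local", "t0_b")],
    [("local", "t1_a"), ("write", "x")],
]

def successors(state, reduced=True):
    pcs, shared = state
    enabled = [t for t in range(len(THREADS)) if pcs[t] < len(THREADS[t])]
    if reduced:
        # A purely local action commutes with every other action, so the
        # other interleavings from this state are redundant.
        local = [t for t in enabled if THREADS[t][pcs[t]][0] == "local"]
        if local:
            enabled = local[:1]
    for t in enabled:
        kind, arg = THREADS[t][pcs[t]]
        new_pcs = tuple(pc + 1 if i == t else pc for i, pc in enumerate(pcs))
        new_shared = shared | {arg} if kind == "write" else shared
        yield (new_pcs, frozenset(new_shared))

def explore(reduced):
    init = ((0, 0), frozenset())
    seen, stack = {init}, [init]
    while stack:
        for s in successors(stack.pop(), reduced):
            if s not in seen:
                seen.add(s)
                stack.append(s)
    return len(seen)

print("full:", explore(False), "states;  reduced:", explore(True), "states")
```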

    Design, Analysis and Test of Logic Circuits under Uncertainty.

    Full text link
    Integrated circuits are increasingly susceptible to uncertainty caused by soft errors, inherently probabilistic devices, and manufacturing variability. As device technologies scale, these effects become detrimental to circuit reliability. In order to address this, we develop methods for analyzing, designing, and testing circuits subject to probabilistic effects. Our main contributions are: 1) a fast soft-error rate (SER) analyzer that uses functional-simulation signatures to capture error effects, 2) novel design techniques that improve reliability with little area and performance overhead, 3) a matrix-based reliability-analysis framework that captures many types of probabilistic faults, and 4) test-generation/compaction methods aimed at probabilistic faults in logic circuits. SER analysis must account for the main error-masking mechanisms in ICs: logic, timing, and electrical masking. We relate logic masking to node testability of the circuit and utilize functional-simulation signatures, i.e., partial truth tables, to efficiently compute testability (signal probability and observability). To account for timing masking, we compute error-latching windows (ELWs) from timing analysis information. Electrical masking is incorporated into our estimates through derating factors for gate error probabilities. The SER of a circuit is computed by combining the effects of all three masking mechanisms within our SER analyzer, called AnSER. Using AnSER, we develop several low-overhead techniques that increase reliability, including: 1) an SER-aware design method that uses redundancy already present within the circuit, 2) a technique that resynthesizes small logic windows to improve area and reliability, and 3) a post-placement gate-relocation technique that increases timing masking by decreasing ELWs. We develop the probabilistic transfer matrix (PTM) modeling framework to analyze effects beyond soft errors. PTMs are compressed into algebraic decision diagrams (ADDs) to improve computational efficiency, and several ADD algorithms are developed to extract reliability and error-susceptibility information from PTMs representing circuits. We propose new algorithms for circuit testing under probabilistic faults, which require a reformulation of existing test techniques: a test vector may need to be repeated many times to detect a fault, and different vectors detect the same fault with different probabilities. We develop test generation methods that account for these differences, and integer linear programming (ILP) formulations to optimize test sets.
    Ph.D. thesis, Computer Science & Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/61584/1/smita_1.pd
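    As a rough illustration of the PTM framework (without the ADD compression used in the thesis), the sketch below builds small gate PTMs in Python and composes them: serial composition as a matrix product and parallel composition as a Kronecker product. The choice of gates and the error probability are assumptions made for the example.

```python
# Probabilistic transfer matrix (PTM) sketch: each gate is a 2^n x 2^m matrix
# of output probabilities; cascaded gates multiply, side-by-side gates combine
# via a Kronecker product.

import numpy as np

def faulty_gate_ptm(truth_table, p):
    """PTM of a single-output gate that flips its output with probability p."""
    rows = []
    for out in truth_table:                  # one row per input combination
        rows.append([1 - p, p] if out == 0 else [p, 1 - p])
    return np.array(rows)

p = 0.05
AND = faulty_gate_ptm([0, 0, 0, 1], p)       # 4x2 PTM of a noisy AND
NOT = faulty_gate_ptm([1, 0], p)             # 2x2 PTM of a noisy inverter

# NAND built as AND followed by NOT: serial composition = matrix product.
NAND = AND @ NOT

# Two independent inverters side by side: parallel composition = Kronecker product.
TWO_NOTS = np.kron(NOT, NOT)

# Probability of a correct NAND output for each input pattern
# (fault-free outputs are 1, 1, 1, 0 for inputs 00, 01, 10, 11).
correct = [NAND[i, out] for i, out in enumerate([1, 1, 1, 0])]
print(np.round(correct, 4))
```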