2 research outputs found

    Reducing Library Characterization Time for Cell-aware Test while Maintaining Test Quality

    Cell-aware test (CAT) explicitly targets faults caused by defects inside library cells to improve test quality, compared with conventional automatic test pattern generation (ATPG) approaches, which target faults only at the boundaries of library cells. The CAT methodology consists of two stages. In Stage 1, library characterization based on dedicated analog simulation identifies, per cell, which cell-level test pattern detects which cell-internal defect; this detection information is encoded in a defect detection matrix (DDM). In Stage 2, with the DDMs as inputs, cell-aware ATPG generates chip-level test patterns for each circuit design, which is built up of interconnected instances of library cells. This paper focuses on Stage 1, library characterization, as both test quality and cost are determined by the set of cell-internal defects identified and simulated in the CAT tool flow. With the aim of achieving the best test quality, we first propose an approach to identify a comprehensive set, referred to as the full set, of potential open- and short-defect locations based on the cell layout. However, the full set of defects can be large even for a single cell, which makes the defect simulation time in Stage 1 unaffordable. To reduce the simulation time, we therefore collapse the full set into a compact set of defects, which serves as the input to the defect simulation; the full set is retained for diagnosis and failure analysis. By inspecting the simulation results, we propose a method to verify the test quality achieved with the compact set of defects and, if necessary, to compensate it to the same level as that achieved with the full set. For 351 combinational library cells in Cadence’s GPDK045 45 nm library, we need to simulate only 5.4% of the defects in the full set to reach the same test quality as simulating the full set. In total, the simulation time, estimated via linear extrapolation per cell, would be reduced by 96.4% compared with simulating the full set of defects.
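
    The central data structure here is the DDM: per cell, a binary matrix with one row per cell-level test pattern and one column per cell-internal defect, where an entry of 1 means the pattern detects the defect in analog simulation. The Python sketch below illustrates two operations the abstract relies on: measuring defect coverage for a set of patterns and collapsing defects into a compact representative set. All names are illustrative rather than a real tool API, and the signature-based collapsing shown is only a generic stand-in; the paper's collapsing is layout-based and runs before any simulation.

        # Illustrative sketch only: a defect detection matrix (DDM) as a 0/1
        # NumPy array, rows = cell-level test patterns, columns = cell-internal
        # defects. Function names are hypothetical, not from the paper.
        import numpy as np

        def defect_coverage(ddm, pattern_ids):
            """Fraction of defects detected by at least one chosen pattern."""
            detected = ddm[pattern_ids].any(axis=0)
            return detected.sum() / ddm.shape[1]

        def collapse_by_signature(ddm):
            """Keep one representative per group of defects with identical
            detection signatures (a generic collapsing criterion; the paper
            instead collapses based on layout, before simulation)."""
            reps = {}
            for j in range(ddm.shape[1]):
                reps.setdefault(tuple(ddm[:, j]), j)
            return sorted(reps.values())

        # Toy DDM: 3 patterns x 5 defects.
        ddm = np.array([[1, 0, 0, 1, 1],
                        [1, 1, 0, 0, 1],
                        [0, 0, 1, 0, 0]])
        print(defect_coverage(ddm, [0, 2]))  # 0.8: patterns 0 and 2 detect 4/5
        print(collapse_by_signature(ddm))    # [0, 1, 2, 3]: defect 4 repeats 0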

    Efficient algorithms for fundamental statistical timing analysis problems in delay test applications of VLSI circuits

    Tremendous advances in semiconductor process technology are creating new challenges for the delay test of today’s digital VLSI circuits. The complexity of state-of-the-art manufacturing processes not only leads to greater process variability, it also makes today's integrated circuits more prone to defects such as resistive shorts and opens. As a consequence, some of the manufactured circuits do not meet the timing requirements set by the design specification. These circuits must be identified by delay testing and sorted out to ensure the quality of shipped products. Due to the increasing process variability, key transistor and interconnect parameters must be modelled as random variables. These random variables capture the uncertainty caused by process variability, but also the impact of modelling errors and of variations in the operating conditions of the circuits, such as temperature or supply voltage. The important consequence for delay testing is that a particular delay test detects a delay fault of fixed size in only a subset of all manufactured circuits, which inevitably leads to the shipment of defective products. Although this problem is well understood, today's delay test generation methods are unable to account for the distortion of delay test results caused by process variability. To analyse and predict the effectiveness of delay tests in a population of circuits that are functionally identical but have varying timing properties, statistical timing analysis is necessary. Although the large runtime of statistical timing analysis is a well-known problem, little progress has been made on efficient statistical timing analysis algorithms for variability-aware delay test generation and delay fault simulation. This dissertation proposes novel and efficient statistical timing analysis algorithms for variability-aware delay test generation and delay fault simulation in the presence of large delay variations. For the detection of path delay faults, a novel probabilistic sensitization analysis is presented which analyses the impact of process variations on the sensitization of the target paths. Furthermore, an efficient method for approximating the probability of detecting small delay faults is presented. Beyond that, efficient statistical SUM and MAX operations are proposed, which form the fundamental basis of block-based statistical timing analysis. The experimental results demonstrate the high efficiency of the proposed algorithms.
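
    Two of the building blocks named in this abstract admit a compact illustration: the statistical SUM and MAX operations of block-based statistical timing analysis, and the probability that a delay fault of fixed size is detected at a given test clock period. The sketch below assumes Gaussian delay models and uses Clark's classic moment-matching approximation for MAX; it is the textbook baseline, not the dissertation's algorithms, which target large and not necessarily Gaussian variations, and all names are hypothetical.

        # Illustrative sketch, assuming Gaussian delay models; Clark's (1961)
        # moment matching is the standard MAX approximation, shown here only
        # as a baseline for block-based statistical timing analysis.
        from math import sqrt
        from scipy.stats import norm

        def stat_sum(mu1, s1, mu2, s2, rho=0.0):
            """Exact mean/std of X + Y for (possibly correlated) Gaussians."""
            return mu1 + mu2, sqrt(s1**2 + s2**2 + 2*rho*s1*s2)

        def stat_max(mu1, s1, mu2, s2, rho=0.0):
            """Gaussian moment-matched approximation of max(X, Y)."""
            theta = sqrt(max(s1**2 + s2**2 - 2*rho*s1*s2, 1e-12))
            a = (mu1 - mu2) / theta
            mu = mu1*norm.cdf(a) + mu2*norm.cdf(-a) + theta*norm.pdf(a)
            ex2 = ((mu1**2 + s1**2)*norm.cdf(a)
                   + (mu2**2 + s2**2)*norm.cdf(-a)
                   + (mu1 + mu2)*theta*norm.pdf(a))
            return mu, sqrt(max(ex2 - mu**2, 0.0))

        def detection_prob(mu_path, s_path, fault_size, t_clk):
            """P(path delay + fault size exceeds the test period), i.e. the
            fraction of manufactured circuits in which this test detects a
            delay fault of the given fixed size."""
            return norm.sf((t_clk - mu_path - fault_size) / s_path)

        # Latest arrival time where two variable paths converge, then the
        # probability of detecting a 0.2 ns fault at a 1.8 ns test period.
        mu, s = stat_max(*stat_sum(1.0, 0.10, 0.50, 0.05), 1.40, 0.12)
        print(mu, s)
        print(detection_prob(mu, s, fault_size=0.2, t_clk=1.8))

    The example makes the abstract's central point concrete: because the path delay is a random variable, the detection probability is strictly between 0 and 1, so a single test of fixed timing detects the fault in only part of the manufactured population.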