7 research outputs found

    Integrated circuit outlier identification by multiple parameter correlation

    Semiconductor manufacturers must ensure that chips conform to their specifications before they are shipped to customers. This is achieved by testing various parameters of a chip to determine whether it is defective. Separating defective chips from fault-free ones is relatively straightforward for functional or other Boolean tests that produce a go/no-go result. However, making this distinction is extremely challenging for parametric tests: because parameters are continuously distributed, any pass/fail threshold results in yield loss and/or test escapes. Continuing advances in process technology, increased process variation, and inaccurate fault models all make this worse. The pass/fail thresholds for such tests are usually set from prior experience or by a combination of visual inspection and engineering judgment. Many chips have parameters that exceed certain thresholds but still pass Boolean tests. Owing to the imperfect nature of tests, determining whether these chips (called "outliers") are indeed defective is nontrivial. To avoid wasted investment in packaging or further testing, it is important to screen defective chips early in a test flow. Moreover, if the seemingly strange behavior of outlier chips can be explained with the help of certain process parameters or by correlating additional test data, such chips can be retained in the test flow rather than being discarded before they are proved to be fatally flawed. In this research, we investigate several methods to separate true outliers (defective chips, or chips that lead to functional failure) from apparent outliers (seemingly defective, but fault-free chips). The outlier identification methods in this research primarily rely on wafer-level spatial correlation, but also use additional test parameters. These methods are evaluated and validated using industrial test data, and their potential to reduce burn-in is discussed.
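    As a rough illustration of the wafer-level spatial correlation idea, the sketch below flags dies whose parameter value deviates strongly from the median of their immediate neighbors on the wafer map. This is a simplified stand-in, not the paper's actual method: the function names, the 8-neighbor window, and the robust threshold are all illustrative assumptions.

```python
import numpy as np

def neighbor_residuals(wafer):
    """For each die, residual = measured value minus the median of its
    (up to 8) spatial neighbors; large residuals suggest spatial outliers."""
    rows, cols = wafer.shape
    res = np.zeros_like(wafer, dtype=float)
    for i in range(rows):
        for j in range(cols):
            neigh = [wafer[r, c]
                     for r in range(max(0, i - 1), min(rows, i + 2))
                     for c in range(max(0, j - 1), min(cols, j + 2))
                     if (r, c) != (i, j)]
            res[i, j] = wafer[i, j] - np.median(neigh)
    return res

def spatial_outliers(wafer, k=3.0):
    """Flag dies whose neighbor residual exceeds k robust sigmas,
    using the MAD as a robust spread estimate."""
    res = neighbor_residuals(wafer)
    med = np.median(res)
    mad = np.median(np.abs(res - med))
    sigma = 1.4826 * mad if mad > 0 else 1e-12
    return np.abs(res - med) > k * sigma
```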

    Variance reduction and outlier identification for IDDQ testing of integrated chips using principal component analysis

    Integrated circuits manufactured in current technology consist of millions of transistors with dimensions shrinking into the nanometer range. These small transistors have quiescent (leakage) currents that are increasingly sensitive to process variations, which have increased the variation in good-chip quiescent current and consequently reduced the effectiveness of IDDQ testing. This research proposes the use of a multivariate statistical technique, principal component analysis, for variance reduction. Outlier analysis is applied to the reduced leakage current values, as well as to the good-chip leakage current estimate, to identify defective chips. The proposed idea is evaluated using IDDQ values from multiple wafers of an industrial chip fabricated in 130 nm technology. The proposed method achieves significant variance reduction and identifies many outliers that escape other established techniques. For example, it identifies many of the absolute outliers in bad neighborhoods that are not detected by the Nearest Neighbor Residual and Nearest Current Ratio methods, as well as many of the spatial outliers that pass the Current Ratio test. The proposed method also identifies both active and passive defects.
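    A minimal sketch of the general approach: remove the leading principal components, which tend to capture common process variation, from a chips-by-vectors IDDQ matrix, then flag chips with unusually large residual leakage. The matrix layout, component count, scoring rule, and robust threshold are assumptions for illustration, not the paper's actual procedure.

```python
import numpy as np

def pca_residual_outliers(iddq, n_components=1, k=3.0):
    """Subtract the leading principal components (common process variation)
    from a chips-x-vectors IDDQ matrix, then flag chips whose residual
    leakage deviates strongly from the population (MAD-based threshold)."""
    X = iddq - iddq.mean(axis=0)                 # center each test vector
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    # chip-to-chip variation captured by the top components
    common = U[:, :n_components] * S[:n_components] @ Vt[:n_components]
    resid = X - common                           # variance-reduced leakage
    score = np.abs(resid).max(axis=1)            # worst-vector residual per chip
    med = np.median(score)
    mad = np.median(np.abs(score - med))
    sigma = 1.4826 * mad if mad > 0 else 1e-12
    return score > med + k * sigma
```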

    A Case-Based Reasoning Method for Remanufacturing Process Planning

    Remanufacturing is a practice of growing importance owing to its environmental and economic benefits. Process planning plays a critical role in realizing a successful remanufacturing strategy. This paper presents a case-based reasoning method for remanufacturing process planning that allows a process planner to rapidly retrieve, reuse, revise, and retain the solutions to past process problems. In the proposed method, influence factors including essential characteristics, failure characteristics, and remanufacturing processing characteristics are identified; the local similarity of the influence factors between the new case and past cases is determined by a nearest-neighbor matching method; and a vector of correction factors for the local similarities is then used in the nearest-neighbor algorithm to improve the accuracy and effectiveness of case retrieval. To assess the usefulness and practicality of the proposed method, an illustrative example is given and the results are discussed.
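    The retrieval step described above can be sketched as a correction-weighted nearest-neighbor match. The representation of cases as numeric attribute vectors, the per-factor similarity function, and all names below are illustrative assumptions rather than details from the paper.

```python
import numpy as np

def local_similarity(new, old, spans):
    """Per-factor similarity in [0, 1]: 1 minus the range-normalized distance."""
    return 1.0 - np.abs(np.asarray(new, float) - np.asarray(old, float)) / np.asarray(spans, float)

def retrieve(new_case, case_base, spans, weights):
    """Return (index, score) of the most similar past case using a
    correction-weighted nearest-neighbor score over local similarities."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                       # normalize the correction factors
    best, best_score = -1, -1.0
    for idx, old in enumerate(case_base):
        score = float(w @ local_similarity(new_case, old, spans))
        if score > best_score:
            best, best_score = idx, score
    return best, best_score
```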

    Reliability Analysis of Nanocrystal Embedded High-k Nonvolatile Memories

    The evolution of MOSFET technology has been driven by aggressive shrinkage of the device size to improve device performance and increase circuit density. Much current research has demonstrated that the continuous polycrystalline silicon film in the floating-gate dielectric can be replaced with a nanocrystal (nc) embedded high-k thin film to minimize the charge loss caused by a defective thin tunnel dielectric layer. This research deals with both the statistical and electrical aspects of reliability characterization. In this study, Zr-doped HfO2 (ZrHfO) high-k MOS capacitors, separately containing nanocrystalline zinc oxide (nc-ZnO), silicon (nc-Si), indium tin oxide (nc-ITO), and ruthenium (nc-Ru), are studied with respect to their memory properties, charge transport mechanism, ramp-relax tests, accelerated life tests, failure rate estimation, and the effect of temperature on these reliability properties. C-V hysteresis results show that the amount of charge trapped in the nanocrystal-embedded films is in the order nc-ZnO > nc-Ru > nc-Si ~ nc-ITO, which is probably influenced by the EOT of each sample. In addition, all the results show that the nc-ZnO embedded ZrHfO nonvolatile memory capacitor has the best memory properties and reliability. The optimal burn-in time for this kind of device has also been investigated with nonparametric Bayesian analysis; the results show that the optimal burn-in period for the nc-ZnO embedded high-k device is 5470 s, which maximizes the one-year mission reliability.

    EXPLOITATION OF SMALL INTERFERING RNA METHODOLOGY TO IDENTIFY NOVEL ANTICANCER TREATMENTS

    The majority of current pharmacological treatments for cancer target rapidly dividing cells, a characteristic of most cancer cells. Unfortunately, these treatments also affect normal cells that divide at a rapid rate, such as cells of the digestive tract, hair follicles, and bone marrow, which limits the efficacy of chemotherapy owing to toxic side effects. Reducing the drug dose to evade these side effects, however, often impairs efficacy and encourages drug resistance. Therefore, new unbiased approaches are required to identify drug combinations with existing effective cancer chemotherapeutics. I therefore exploited data from a short interfering RNA (siRNA) high-throughput screen targeting 5,520 unique druggable genes, that is, gene products that are theoretically good targets for drug development. I used the siRNA screening methodology to identify novel combination chemotherapies for the treatment of glioblastoma multiforme (GBM), the most common and aggressive form of human primary brain tumor. My hypothesis is that unrecognized chemosensitivity nodes exist for the microtubule-destabilizing agent vinblastine. GBM cells were treated with a sub-lethal concentration of vinblastine, and gene products that sensitized the cells to vinblastine were identified. Using a series of statistical methods, followed by target identification assays, I found gene products that sensitized GBM cells to vinblastine, implicating siRNA screening technology as an efficient, unbiased method for identifying potentially novel anticancer treatments.

    Essays on Estimation of Inflation Equation

    This dissertation improves upon the estimation of the inflation equation, using additional measures of the distribution of price changes and an optimal choice of instrumental variables. Measures of dispersion and skewness of the cross-sectional distribution of price changes have been used in empirical analyses of inflation. In the first essay, we find that an independent kurtosis effect can play a significant role in the approximation of the inflation rate, in addition to dispersion and skewness; the kurtosis measure can improve the approximation of inflation in terms of goodness of fit. The second essay complements the first. It is well known that classical moment measures are sensitive to outliers. The essay examines the presence of outliers in relative price changes and considers several robust alternative measures of dispersion and skewness. We find a significant relationship between inflation and the robust measures of dispersion and skewness; in particular, the medcouple, a robust measure of skewness, is very useful in predicting inflation. The third essay estimates the Hybrid Phillips Curve using an optimal set of instrumental variables. Instrumental variables are usually selected from a large number of valid instruments on an ad hoc basis, and it has been recognized in the literature that the estimates are sensitive to the choice of instrumental variables and to the choice of inflation measure. This essay uses the L2-boosting method to select the best instruments from a large number of valid, weakly exogenous instruments. We find that boosted instruments produce more comparable parameter estimates across different measures of inflation and a higher joint precision of the estimates. Instruments boosted from principal components tend to give slightly better results than instruments from observed variables, but no significant difference is found between ordinary and generalized principal components.
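    For reference, the medcouple mentioned above is a robust skewness measure computed from pairs of observations lying on either side of the median. Below is a naive O(n^2) sketch; the full definition includes a special kernel for observations tied at the median, which is omitted here for simplicity.

```python
import numpy as np

def medcouple(x):
    """Naive O(n^2) medcouple: the median of the kernel
    h(xi, xj) = ((xj - m) - (m - xi)) / (xj - xi)
    over pairs with xi <= m <= xj (ties at the median not handled)."""
    x = np.sort(np.asarray(x, dtype=float))
    m = np.median(x)
    lo = x[x <= m]          # observations at or below the median
    hi = x[x >= m]          # observations at or above the median
    h = [((xj - m) - (m - xi)) / (xj - xi)
         for xi in lo for xj in hi if xj > xi]
    return float(np.median(h))
```

    The medcouple is 0 for symmetric data, positive for right-skewed data, and negative for left-skewed data, and it tolerates up to 25% outliers.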

    Evaluation of Effectiveness of Median of Absolute Deviations Outlier Rejection-based IDDQ Testing for Burn-in Reduction

    CMOS chips with high leakage are observed to have a high burn-in fallout rate, and IDDQ testing has been considered an alternative to burn-in. However, increased subthreshold leakage current in deep sub-micron technologies limits the use of IDDQ testing in its present form. In this work, a statistical outlier rejection technique known as the median of absolute deviations (MAD) is evaluated as a means to screen early failures using IDDQ data. MAD is compared with the delta IDDQ and current signature methods. The results of the analysis of the SEMATECH data are presented.
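    A MAD-based screen of the kind evaluated here can be sketched as follows; the scale factor 1.4826 makes the MAD consistent with the standard deviation for normally distributed data. The threshold k and the function name are illustrative assumptions.

```python
import numpy as np

def mad_outliers(iddq, k=3.0):
    """Median-of-absolute-deviations screen: flag chips whose IDDQ
    deviates from the population median by more than k robust sigmas."""
    x = np.asarray(iddq, dtype=float)
    med = np.median(x)
    mad = np.median(np.abs(x - med))
    sigma = 1.4826 * mad          # MAD -> std-dev scale for a normal population
    return np.abs(x - med) > k * sigma
```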