Handling Discontinuous Effects in Modeling Spatial Correlation of Wafer-level Analog/RF Tests
Abstract: In an effort to reduce the cost of specification testing in analog/RF circuits, spatial correlation modeling of wafer-level measurements has recently attracted increased attention. Existing approaches for capturing and leveraging such correlation, however, rely on the assumption that spatial variation is smooth and continuous. This, in turn, limits their effectiveness on actual production data, which often exhibit localized, discontinuous spatial effects. In this work, we propose a novel approach that enables spatial correlation modeling of wafer-level analog/RF tests to handle such effects and, thereby, drastically reduce prediction error for measurements exhibiting discontinuous spatial patterns. The core of the proposed approach is a k-means algorithm that partitions a wafer into k clusters reflecting these discontinuous effects. Individual correlation models are then constructed within each cluster, revoking the assumption that spatial patterns must be smooth and continuous across the entire wafer. The effectiveness of the proposed approach is evaluated on industrial probe test data from more than 3,400 wafers, revealing a significant error reduction over existing approaches.
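The cluster-then-model idea in the abstract can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: it builds a synthetic wafer with a step discontinuity, runs a plain k-means (in normalized coordinate-plus-measurement space) to split the wafer into two regions, and fits a separate linear spatial model inside each cluster. The plane model, the feature choice, and all numbers are illustrative assumptions.

```python
import numpy as np

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means clustering: returns a cluster label per point."""
    rng = np.random.default_rng(seed)
    centroids = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = points[labels == j].mean(axis=0)
    return labels

# Synthetic wafer: a smooth trend plus a step discontinuity along x = 5
rng = np.random.default_rng(1)
xy = rng.uniform(0, 10, size=(400, 2))           # die coordinates
step = 3.0 * (xy[:, 0] > 5)                      # localized discontinuous effect
meas = 0.2 * xy[:, 0] + 0.1 * xy[:, 1] + step + rng.normal(0, 0.05, 400)

# Cluster in normalized (x, y, measurement) space so the step separates clusters
feats = np.column_stack([xy, meas])
feats = (feats - feats.mean(axis=0)) / feats.std(axis=0)
labels = kmeans(feats, k=2)

# Fit a separate linear spatial model (a plane) inside each cluster
rms = []
for j in range(2):
    m = labels == j
    A = np.column_stack([xy[m], np.ones(m.sum())])
    coef, *_ = np.linalg.lstsq(A, meas[m], rcond=None)
    resid = meas[m] - A @ coef
    rms.append(float(np.sqrt((resid ** 2).mean())))
    print(f"cluster {j}: n = {m.sum()}, RMS residual = {rms[-1]:.3f}")
```

A single plane fitted across the whole wafer would have to average over the step and leave residuals on the order of the step height; per-cluster models avoid that, which is the point the abstract makes.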
Data Analytics in Test: Recognizing and Reducing Subjectivity
Applying data analytics in production test has become a widely adopted industrial practice in recent years. As the complexity of semiconductor devices scales and the amount of available test data continues to grow, research in this field is shifting away from solving specific problems with ad hoc approaches and toward a deeper understanding of the fundamental issues. Two data-driven test applications where this shift is apparent are production yield optimization and defect screening, whose underlying data analytics approaches are correlation analysis and outlier analysis, respectively. A core issue in both approaches stems from the subjectivity that is inherent to data analytics. This dissertation examines how subjectivity manifests itself and what can be done to reduce it in these two test applications.

Outlier analysis is an approach for identifying anomalies. Its main goal in test is to capture statistically outlying parts, with the hope that their abnormal behavior is attributable to some defect. When creating an outlier model, decisions about outlying behavior in the existing data are made using known failures and the test engineer's best judgment. In practice, outlier screening methods simply transform data into an outlier-score space; even when outlier analysis techniques can successfully classify a dataset into inliers and outliers, the resulting models still require thresholds to be chosen. A concept called Consistency is introduced to provide an objective, data-driven way to evaluate outlier models using all available data. The key observation underlying this concept is that outlier analysis should be immune to noise introduced by sources of systematic variation.

Correlation analysis is a process comprising a search for related variables. The application to production yield optimization involves searching for correlation between yield and various controllable parameters, with the goal of uncovering parameters that, when adjusted, can improve yield. This analytics process is subjective to the perspective of the analyst, and the quality of the result depends heavily on the analyst's previous experience. To reduce subjectivity in this application, a process mining methodology is introduced to learn from the experiences of analysts. The key advantage of this methodology is that, in addition to recording and reproducing these analyses, it can also generalize to analytics processes not contained in the learned experiences.
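The abstract's point about transforming data into an outlier-score space and then choosing a threshold can be made concrete with a generic example. The sketch below is not the dissertation's Consistency metric; it uses a robust z-score (median/MAD) as the score transform and an engineer-chosen cut in score space, both standard and both illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
# Parametric measurements for 500 parts; the last 5 are shifted off-distribution
data = np.concatenate([rng.normal(1.0, 0.05, 495), rng.normal(1.6, 0.05, 5)])

# Robust z-score: transform the measurements into an outlier-score space
med = np.median(data)
mad = np.median(np.abs(data - med)) * 1.4826   # MAD scaled to ~sigma for normal data
scores = np.abs(data - med) / mad

threshold = 6.0                                # engineer-chosen cut in score space
outliers = np.flatnonzero(scores > threshold)
print(len(outliers))                           # expect the 5 shifted parts
```

The score transform itself is objective, but the threshold is where subjectivity enters: a different engineer might pick 4.0 or 8.0 and capture a different part population, which is exactly the gap an objective evaluation criterion aims to close.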
Test cost reduction through performance prediction using virtual probe
Abstract: The virtual probe (VP) technique, based on recent breakthroughs in compressed sensing, has demonstrated its ability to accurately predict spatial variations from a small set of measurement data. In this paper, we explore its application to cost reduction in production testing. For a number of test items, measurement data from a small subset of chips can be used to accurately predict the performance of other chips on the same wafer without explicit measurement. Depending on their statistical characteristics, test items can be classified into three categories: highly predictable, predictable, and unpredictable. A case study of an industrial RF radio transceiver with more than 50 production test items shows that a good fraction of these (39 out of 51) are predictable or highly predictable. In this example, the 3σ error of VP prediction is less than 12% for predictable or highly predictable test items. Applying the VP technique can on average replace 59% of test measurements with prediction and, consequently, reduce the overall test time.
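The predict-from-a-subset idea behind virtual probe can be illustrated with a toy wafer. Note the simplification: published VP work recovers sparse DCT coefficients from random samples via ℓ1-minimization, whereas the sketch below substitutes ordinary least squares with an assumed low-frequency support, so it stays dependency-free while showing the same sample-then-reconstruct flow. The 16×16 grid, the 25% sampling rate, and the chosen coefficients are all illustrative.

```python
import numpy as np

def dct_basis(n):
    """Orthonormal 1-D DCT-II basis matrix; row k is the k-th cosine mode."""
    j = np.arange(n)
    B = np.cos(np.pi * (2 * j[None, :] + 1) * np.arange(n)[:, None] / (2 * n))
    B[0] /= np.sqrt(n)
    B[1:] *= np.sqrt(2.0 / n)
    return B

n = 16                                       # wafer modeled as an n x n die grid
B = dct_basis(n)

# Smooth spatial variation: sparse in the 2-D DCT domain by construction
coef = np.zeros((n, n))
coef[0, 0], coef[0, 1], coef[1, 0], coef[2, 1] = 4.0, 1.5, -1.0, 0.5
field = B.T @ coef @ B                       # inverse 2-D DCT

# "Probe" only a random 25% of the dies
rng = np.random.default_rng(0)
sampled = rng.random(n * n) < 0.25

# Linear map from DCT coefficients to die measurements (row-major flattening)
M = np.kron(B.T, B.T)
lowfreq = [i * n + j for i in range(4) for j in range(4)]  # assumed support
A = M[sampled][:, lowfreq]                   # probed dies x low-frequency modes
c_hat, *_ = np.linalg.lstsq(A, field.ravel()[sampled], rcond=None)

pred = M[:, lowfreq] @ c_hat                 # predict every die on the wafer
max_err = np.abs(pred - field.ravel())[~sampled].max()
print(f"max error on unprobed dies: {max_err:.2e}")
```

Because the field is genuinely smooth (its DCT support lies inside the assumed low-frequency block) and the ~64 probed dies comfortably overdetermine the 16 unknowns, the reconstruction of the unprobed dies is essentially exact; the "unpredictable" category in the abstract corresponds to test items where no such sparse spatial structure exists.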