
    High Confidence Testing for Instrumentation System-on-Chip with Unknown-Good-Yield

    SoCs are generally built from embedded IP cores, each procured from a different IP provider with no prior information on known-good-yield (KGY). In practice, partial testing is a practical choice for assuring product yield under stringent time-to-market requirements, so a proper sampling technique is key to high-confidence testing and cost effectiveness. Building on previous research, this paper proposes a novel statistical testing technique for increasingly hybrid integrated systems fabricated on a single silicon die with no a priori empirical yield data, a problem referred to as the unknown-good-yield (UKGY) problem. The proposed testing method, referred to as regressive testing (RegT), takes an indirect route: parameters called assistant variables (AVs) are used to evaluate the yields of randomly sampled SoCs, and the good yield is then estimated by regression analysis with associated confidence intervals. Numerous simulations demonstrate the efficiency and effectiveness of the proposed RegT in comparison to characterization-based testing methods.
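As an illustrative sketch only (not the paper's actual estimator), the RegT idea of combining random sampling, an assistant variable, and a regression-based yield estimate with a confidence interval might look like this on synthetic data; the AV model and the linear-probability fit here are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: for n randomly sampled SoCs we observe an assistant
# variable (AV) x and a pass/fail test outcome y (1 = good). The AV is
# chosen to correlate with yield, as the abstract describes.
n = 200
x = rng.normal(0.0, 1.0, n)
p_good = 1 / (1 + np.exp(-(1.5 - 0.8 * x)))   # true yield varies with the AV
y = (rng.random(n) < p_good).astype(float)

# Least-squares regression of the outcome on the AV (a linear-probability
# stand-in for the paper's regression analysis).
A = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)

# Point estimate of the good yield plus a 95% normal-approximation CI.
yield_hat = y.mean()
se = np.sqrt(yield_hat * (1 - yield_hat) / n)
ci = (yield_hat - 1.96 * se, yield_hat + 1.96 * se)
print(f"AV slope {beta[1]:+.3f}, yield {yield_hat:.3f}, "
      f"95% CI [{ci[0]:.3f}, {ci[1]:.3f}]")
```

The fitted slope confirms the AV carries yield information; a tighter interval than plain random sampling would come from exploiting that correlation, which is the point of the method.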

    Environmental-Based Characterization of SoC-Based Instrumentation Systems for Stratified Testing

    This paper proposes a novel environment-based method for evaluating the good yield rate (GYR) of systems-on-chip (SoC) during fabrication. Testing and yield evaluation at high confidence are two of the most critical issues for the success of SoC as a viable technology. The proposed method relies on distinct features of fabrication, quantified by so-called fabrication environmental parameters (EPs). Because EPs can be highly correlated with yield, they are analyzed using statistical methods to improve the accuracy of the yield estimate and ultimately to direct the test process toward efficient execution. The novel contributions of the proposed method are: 1) to establish an adequate theoretical foundation for understanding the fabrication process of SoCs together with an assurance of the yield at a high confidence level, and 2) to provide a realistic approach to SoC testing with accurate yield evaluation. Simulations demonstrate that the proposed method significantly improves the confidence interval of the estimated yield compared with existing testing methodologies such as random testing (RT).
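A minimal sketch of why EP-based stratification tightens the estimate, assuming hypothetical strata (the weights, per-stratum yields, and sample sizes below are invented for illustration): chips are grouped by an EP value, each stratum is sampled proportionally, and the stratified estimator's variance is summed per stratum.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical strata defined by a fabrication environmental parameter (EP):
# each stratum has its own true yield and population weight.
weights = np.array([0.5, 0.3, 0.2])        # fraction of chips per stratum
true_yield = np.array([0.95, 0.85, 0.60])  # yield varies with the EP

# Draw a proportional sample from each stratum and test it.
n = 300
means, variances = [], []
for w, p in zip(weights, true_yield):
    k = int(w * n)                         # stratum sample size
    s = rng.random(k) < p                  # simulated pass/fail results
    means.append(s.mean())
    variances.append(s.var(ddof=1) / k)

# Stratified estimator: weighted mean, variance accumulated per stratum.
yhat = float(np.dot(weights, means))
se = float(np.sqrt(np.dot(weights**2, variances)))
print(f"stratified yield estimate {yhat:.3f} +/- {1.96 * se:.3f}")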

    GOALI/IUCP: Prediction of Wood Pulp K-Number with Neural Networks

    Lignin holds wood fibers together and must be removed to produce high-strength pulp for kraft paper. The Kappa- or K-number indicates the degree of lignin removal achieved by a pulping process and is arguably the key quality variable in this process. One difficulty is that it is an off-line measurement. More importantly, there is usually a four-hour process delay between when raw materials enter a pulping digester and when the K-number is measured, which makes modeling and control difficult. This Grant Opportunity for Academic Liaison with Industry project uses neural network models to predict the K-number as a function of more readily available process parameters, a first step toward improving the control and responsiveness of this process to changes in chip feedstock. The research team from the University of Maine and the S.D. Warren Company will develop characterization and prediction models using data from an operating plant and compare their long-term predictive capability when integrated into digester operations. Throughout, seminars and workshops form part of the technology transfer and model improvement. The impact of this research will be more uniform pulp quality, even with variable feedstock, and more uniform quality in subsequent bleaching and papermaking processes.
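The modeling task described above, regressing an off-line quality number onto readily available process variables with a neural network, can be sketched as follows. The input variables, data-generating function, and network size here are all hypothetical stand-ins; the project would use actual digester measurements.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical standardized process inputs (e.g. chip moisture, cook
# temperature, alkali charge) and a synthetic K-number response with an
# interaction term, so a linear model would not suffice.
X = rng.normal(size=(400, 3))
y = 25 - 4 * X[:, 1] - 2 * X[:, 2] + 1.5 * X[:, 0] * X[:, 1] \
    + rng.normal(0, 0.5, 400)

# One-hidden-layer regression network trained by full-batch gradient
# descent on squared error (a minimal stand-in for the project's models).
W1 = rng.normal(0, 0.5, (3, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, 16);      b2 = 0.0
lr = 0.02
for _ in range(3000):
    h = np.tanh(X @ W1 + b1)          # hidden activations
    err = h @ W2 + b2 - y             # prediction error
    # backpropagate the mean squared error
    gW2 = h.T @ err / len(y); gb2 = err.mean()
    dh = np.outer(err, W2) * (1 - h**2)
    gW1 = X.T @ dh / len(y); gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

rmse = np.sqrt(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(f"training RMSE: {rmse:.2f} K-number units")
```

In practice the four-hour process delay means the inputs must be lagged to align with the K-number they produced, and held-out data would be needed to judge the long-term predictive capability the project aims to compare.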

    Calibration and characterization of a low-cost wireless sensor for applications in CNC end milling

    Central to creating a smart machining system is the challenge of collecting detailed information about the milling process at the tool tip. This work discusses the design, static calibration, dynamic characterization, and implementation of a low-cost wireless sensor for end milling. Our novel strain-based sensor, called the Smart Tool, is shown to perform well in a laboratory setting, with accuracy and dynamic behavior comparable to those of the Kistler 3-axis force dynamometer. The Smart Tool can measure static loads with a total measurement uncertainty of less than 3 percent of full scale, but it has a natural frequency of approximately 630 Hz, so signal conditioning of the strain signal is required when vibrations are large. Several signal-processing techniques are investigated to show that the sensor is useful for force estimation, chatter prediction, force model calibration, and dynamic parameter identification. The presented techniques include the Kalman filter and Wiener filter for signal enhancement, linear predictive coding for system identification, model-based filtering for force estimation, and sub-optimal linear filters for removing forced vibrations.
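To illustrate the kind of signal conditioning involved, here is a minimal scalar Kalman filter applied to a synthetic force trace contaminated by ringing near the sensor's ~630 Hz natural frequency. The signal model, sample rate, and noise variances are assumptions for the sketch, not the paper's tuned filter:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical cutting-force trace: a slowly varying load plus
# tool-vibration ringing near the sensor's ~630 Hz resonance.
fs = 10_000                                    # sample rate, Hz (assumed)
t = np.arange(0, 0.2, 1 / fs)
force = 100 + 20 * np.sin(2 * np.pi * 5 * t)   # true quasi-static load (N)
meas = force + 15 * np.sin(2 * np.pi * 630 * t) + rng.normal(0, 2, t.size)

# Scalar random-walk Kalman filter: the ringing is lumped into the
# measurement noise so the filter tracks only the slow load.
q, r = 0.05, 120.0          # process / measurement noise variances (tuned)
x, p = meas[0], 1.0
est = np.empty_like(meas)
for i, z in enumerate(meas):
    p += q                  # predict: random-walk state, variance grows
    k = p / (p + r)         # Kalman gain
    x += k * (z - x)        # update with the new strain sample
    p *= 1 - k
    est[i] = x

rms_err = np.sqrt(np.mean((est - force) ** 2))
print(f"RMS error raw: {np.std(meas - force):.1f} N, filtered: {rms_err:.1f} N")
```

The trade-off visible in `q` and `r` is the usual one: a smaller gain suppresses more of the 630 Hz ringing but lags the genuine load variation, which is why the paper also considers model-based filters that use knowledge of the vibration dynamics instead of treating them as noise.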

    caCORRECT2: Improving the accuracy and reliability of microarray data in the presence of artifacts

    © 2011 Moffitt et al.; licensee BioMed Central Ltd. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. DOI: 10.1186/1471-2105-12-383. Background: In previous work, we reported the development of caCORRECT, a novel microarray quality control system built to identify and correct spatial artifacts commonly found on Affymetrix arrays. We have made recent improvements to caCORRECT, including a model-based data-replacement strategy and integration with typical microarray workflows via caCORRECT's web portal and caBIG grid services. In this report, we demonstrate that caCORRECT improves the reproducibility and reliability of experimental results across several common Affymetrix microarray platforms. caCORRECT represents an advance over state-of-the-art quality control methods such as Harshlighting, and improves gene expression calculation techniques such as PLIER, RMA, and MAS5.0, because it incorporates spatial information into outlier detection as well as outlier information into probe normalization. The ability of caCORRECT to recover accurate gene expressions from low-quality probe intensity data is assessed using a combination of real and synthetic artifacts with PCR follow-up confirmation and the affycomp spike-in data. The caCORRECT tool can be accessed at http://cacorrect.bme.gatech.edu. Results: We demonstrate that (1) caCORRECT's artifact-aware normalization avoids the undesirable global data warping that occurs when damaged chips are processed without caCORRECT; (2) when used upstream of RMA, PLIER, or MAS5.0, caCORRECT's data imputation generally improves the accuracy of microarray gene expression in the presence of artifacts more than Harshlighting or no quality control at all; and (3) biomarkers selected from artifactual microarray data that have undergone caCORRECT's quality control procedures are more likely to be reliable, as shown by both spike-in and PCR validation experiments. Finally, we present a case study of the use of caCORRECT to reliably identify biomarkers for renal cell carcinoma, yielding two diagnostic biomarkers with potential clinical utility, PRKAB1 and NNMT. Conclusions: caCORRECT improves the accuracy of gene expression and the reproducibility of experimental results in clinical application. This study suggests that caCORRECT will be useful for cleaning up possible artifacts in new as well as archived microarray data.
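The core idea of spatial artifact detection followed by imputation can be sketched in a few lines. This is a deliberately simplified stand-in for caCORRECT (a 3x3 median filter with a residual threshold, not its actual model-based replacement), on a synthetic probe grid with a scratch-like artifact:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical probe-intensity grid (log2 scale) with a one-row scratch,
# mimicking the spatial artifacts on Affymetrix arrays.
grid = rng.normal(8.0, 0.3, (64, 64))
grid[20, 10:50] += 3.0                       # bright scratch artifact

# Local 3x3 median computed from shifted views of an edge-padded copy.
pad = np.pad(grid, 1, mode="edge")
stack = np.stack([pad[i:i + 64, j:j + 64]
                  for i in range(3) for j in range(3)])
local_med = np.median(stack, axis=0)

# Flag probes that deviate strongly from their spatial neighborhood,
# then impute them with the local median (a crude stand-in for
# caCORRECT's model-based data replacement).
resid = grid - local_med
mask = np.abs(resid) > 4 * resid.std()
cleaned = np.where(mask, local_med, grid)
print(f"flagged {mask.sum()} probes")
```

The point the abstract makes is visible even here: detection uses spatial context (the residual against neighbors), and the flagged outliers are then excluded from, rather than allowed to distort, downstream normalization.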

    Survey Expectations

    This paper focuses on survey expectations and discusses their uses for the testing and modeling of expectations. Alternative models of expectations formation are reviewed, and the importance of allowing for heterogeneity of expectations is emphasized. A weak form of the rational expectations hypothesis, which focuses on average rather than individual expectations, is advanced. Other models of expectations formation, such as the adaptive expectations hypothesis, are briefly discussed. Testable implications of rational and extrapolative models of expectations are reviewed, and the importance of the loss function for the interpretation of test results is discussed. The paper then provides an account of the various surveys of expectations, reviews alternative methods of quantifying qualitative surveys, and discusses the use of aggregate and individual survey responses in the analysis of expectations and for forecasting.
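One standard method for quantifying qualitative survey responses of the kind discussed here is the Carlson–Parkin probability approach: given the shares of respondents expecting a rise and a fall, and assuming normally distributed expectations with a symmetric indifference interval, the implied mean and dispersion can be backed out. The half-width `delta` and the example shares below are illustrative assumptions:

```python
from statistics import NormalDist

def carlson_parkin(up, down, delta=0.5):
    """Mean and dispersion of expectations implied by the survey shares
    expecting a rise (`up`) and a fall (`down`), under normality and a
    symmetric indifference interval of half-width `delta` (pct. points)."""
    inv = NormalDist().inv_cdf
    a = inv(1 - up)     # standardized upper threshold: (delta - mu) / sigma
    b = inv(down)       # standardized lower threshold: (-delta - mu) / sigma
    mu = -delta * (a + b) / (a - b)
    sigma = 2 * delta / (a - b)
    return mu, sigma

# Hypothetical month: 55% expect prices to rise, 10% expect a fall.
mu, sigma = carlson_parkin(0.55, 0.10)
print(f"implied mean expectation {mu:.2f}%, dispersion {sigma:.2f}")
```

Since more respondents expect a rise than a fall, the implied mean is positive; the sensitivity of `mu` to the assumed `delta` is one reason the paper stresses how quantification choices affect tests of expectations hypotheses.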

    Custom Integrated Circuits

    Contains reports on twelve research projects. Sponsors: Analog Devices, Inc.; International Business Machines, Inc.; Joint Services Electronics Program (Contract DAAL03-86-K-0002); Joint Services Electronics Program (Contract DAAL03-89-C-0001); U.S. Air Force - Office of Scientific Research (Grant AFOSR 86-0164); Rockwell International Corporation; OKI Semiconductor, Inc.; U.S. Navy - Office of Naval Research (Contract N00014-81-K-0742); Charles Stark Draper Laboratory; National Science Foundation (Grant MIP 84-07285); National Science Foundation (Grant MIP 87-14969); Battelle Laboratories; National Science Foundation (Grant MIP 88-14612); DuPont Corporation; Defense Advanced Research Projects Agency/U.S. Navy - Office of Naval Research (Contract N00014-87-K-0825); American Telephone and Telegraph; Digital Equipment Corporation; National Science Foundation (Grant MIP-88-58764)