
    Warranty Data Analysis: A Review

    Warranty claims and supplementary data contain useful information about product quality and reliability. Analysing such data can therefore benefit manufacturers by providing early warnings of abnormalities in their products, supplying information about failure modes to aid design modification, estimating product reliability to inform warranty policy, and forecasting the future warranty claims needed for fiscal planning. In the last two decades, considerable research has been conducted in warranty data analysis (WDA) from several different perspectives. This article summarises and reviews the research and developments in WDA, with emphasis on models, methods and applications, and concludes with a brief discussion of current practices and possible future trends in WDA.

    Sliced rotated sphere packing designs

    Space-filling designs are popular choices for computer experiments. A sliced design is a design that can be partitioned into several subdesigns. We propose a new type of sliced space-filling design called sliced rotated sphere packing designs, whose full designs and subdesigns are both rotated sphere packing designs. They are constructed by rescaling, rotating, translating and extracting the points of a sliced lattice. We provide two fast algorithms to generate such designs. Furthermore, we propose a strategy for using sliced rotated sphere packing designs adaptively: initial runs are uniformly distributed in the design space, follow-up runs are added by incorporating information gained from the initial runs, and the combined design is space-filling for any local region. Examples are given to illustrate its potential applications.
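    As a rough illustration of the lattice-based construction described above, the sketch below builds a small two-dimensional design by rotating, rescaling, translating and extracting points of an integer lattice, with slices formed from residue classes of the lattice. The grid size, scaling factor and slicing rule are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def sliced_rotated_lattice(n_grid=20, n_slices=2, spacing=0.1, seed=0):
    """Toy 2-D sliced design from a rotated, rescaled integer lattice."""
    rng = np.random.default_rng(seed)
    g = np.arange(-n_grid, n_grid + 1)
    pts = np.array([(i, j) for i in g for j in g], dtype=float)
    # Residue classes of i + j define the slices (an illustrative sublattice rule).
    slice_id = (pts.sum(axis=1) % n_slices).astype(int)
    theta = rng.uniform(0, 2 * np.pi)                  # random rotation angle
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    # Rotate, rescale and randomly translate, then extract points in [0, 1)^2.
    pts = pts @ R.T * spacing + rng.uniform(0, spacing, size=2)
    keep = np.all((pts >= 0) & (pts < 1), axis=1)
    return pts[keep], slice_id[keep]

design, slices = sliced_rotated_lattice()
print(design.shape[0], "points; slice sizes:", np.bincount(slices))
```

    Because each slice is itself a rotated, rescaled lattice, the subdesigns inherit the space-filling structure of the full design.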

    Technology and Technometrics approaches

    Technological innovation is nowadays one of the most important determinants of the wealth of nations. Souder and Shrivastava said, “we can’t begin to make decisions about technology until we understand it. And we can’t begin to really understand it until we can measure it”. For this reason, a new branch of economics called Technometrics was born: a theoretical framework for the conception and measurement of technological change with important policy implications (Sahal, 1985). The aim of this paper is, after introducing the concepts of technological innovation as used by economists during the nineteenth and twentieth centuries, to show the historical evolution of the several approaches used to measure and evaluate technology and technological change from 1930 to 2004. A discussion of these approaches shows the methodological difficulties and their potential.
    Keywords: Technometrics, technology, technological change, patterns of technological innovation, history of economic thought, systemic approach, innovation diffusion

    Rotated sphere packing designs

    We propose a new class of space-filling designs called rotated sphere packing designs for computer experiments. The approach starts from the asymptotically optimal positioning of identical balls that covers the unit cube. Properly scaled, rotated, translated and extracted, such designs are excellent under the maximin distance criterion, low in discrepancy, good in projective uniformity, and thus useful for both prediction and numerical integration. We provide a fast algorithm to construct such designs for any number of dimensions and points, with R code available online. Theoretical and numerical results are also provided.
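    To make the maximin distance criterion concrete, the sketch below compares the smallest pairwise distance (larger is better) of an i.i.d. uniform design against a hexagonal lattice clipped to the unit square, the two-dimensional analogue of the optimal ball covering the construction starts from. The lattice spacing and sample sizes are illustrative assumptions.

```python
import numpy as np
from scipy.spatial.distance import pdist

def min_pairwise_distance(x):
    """Maximin criterion value: the smallest distance between any two points."""
    return pdist(x).min()

rng = np.random.default_rng(0)
uniform_design = rng.uniform(size=(64, 2))           # i.i.d. uniform baseline

s = 0.12                                             # lattice spacing (assumed)
i, j = np.meshgrid(np.arange(-2, 12), np.arange(-2, 12))
hex_pts = np.stack([s * (i + 0.5 * (j % 2)),         # offset alternate rows
                    s * j * np.sqrt(3) / 2], axis=-1).reshape(-1, 2)
hex_design = hex_pts[np.all((hex_pts >= 0) & (hex_pts <= 1), axis=1)]

print("uniform  :", min_pairwise_distance(uniform_design))
print("hexagonal:", min_pairwise_distance(hex_design))
```

    The lattice design's minimum distance equals the spacing s, whereas a random design typically places some points much closer together, which is exactly what the maximin criterion penalises.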

    Quantifying uncertainties on excursion sets under a Gaussian random field prior

    We focus on the problem of estimating and quantifying uncertainties on the excursion set of a function under a limited evaluation budget. We adopt a Bayesian approach in which the objective function is assumed to be a realization of a Gaussian random field. In this setting, the posterior distribution on the objective function gives rise to a posterior distribution on excursion sets. Several approaches exist to summarize the distribution of such sets based on random closed set theory. While the recently proposed Vorob'ev approach exploits analytical formulae, further notions of variability require Monte Carlo estimators relying on Gaussian random field conditional simulations. In the present work we propose a method to choose Monte Carlo simulation points and obtain quasi-realizations of the conditional field at fine designs through affine predictors. The points are chosen optimally in the sense that they minimize the posterior expected distance in measure between the excursion set and its reconstruction. The proposed method reduces the computational costs due to Monte Carlo simulations and enables the computation of quasi-realizations on fine designs in large dimensions. We apply this reconstruction approach to obtain realizations of an excursion set on a fine grid, which allow us to give a new measure of uncertainty based on the distance transform of the excursion set. Finally, we present a safety engineering test case where the simulation method is employed to compute a Monte Carlo estimate of a contour line.
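    As a minimal one-dimensional sketch of the Bayesian excursion-set setup (a toy analogue, not the paper's method): a Gaussian-process posterior gives, at each grid point, the probability that the objective exceeds a threshold, and thresholding that coverage function at 1/2 yields a simple plug-in estimate of the excursion set, in the spirit of the Vorob'ev summaries the abstract mentions. The kernel, data and threshold below are assumptions.

```python
import numpy as np
from scipy.stats import norm

def k(a, b, ell=0.2):
    """Squared-exponential kernel with unit prior variance (assumed)."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

def f(x):                                      # toy "unknown" objective
    return np.sin(6 * x)

X = np.array([0.05, 0.3, 0.55, 0.8, 0.95])     # limited evaluation budget
y = f(X)
grid = np.linspace(0, 1, 200)

Kinv = np.linalg.inv(k(X, X) + 1e-8 * np.eye(len(X)))
mu = k(grid, X) @ Kinv @ y                     # posterior mean on the grid
var = 1.0 - np.einsum('ij,jk,ik->i', k(grid, X), Kinv, k(grid, X))

t = 0.5                                        # excursion threshold {f >= t}
coverage = norm.sf((t - mu) / np.sqrt(np.maximum(var, 1e-12)))
excursion_estimate = grid[coverage >= 0.5]     # plug-in excursion-set estimate
print(f"estimated excursion measure: {excursion_estimate.size / grid.size:.2f}")
```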

    Predicted Residual Error Sum of Squares of Mixed Models: An Application for Genomic Prediction.

    Genomic prediction is a statistical method to predict phenotypes of polygenic traits using high-throughput genomic data. Most diseases and behaviors in humans and animals are polygenic traits, and the majority of agronomic traits in crops are also polygenic. Accurate prediction of these traits can help medical professionals diagnose acute diseases and help breeders increase food production, and can therefore contribute significantly to human health and global food security. The best linear unbiased prediction (BLUP) is an important tool for analyzing high-throughput genomic data for prediction. However, to judge the efficacy of the BLUP model with a particular set of predictors for a given trait, one must have an unbiased mechanism for evaluating predictability. Cross-validation (CV) is an essential tool to achieve this goal: a sample is partitioned into K parts of roughly equal size, one part is predicted using parameters estimated from the remaining K - 1 parts, and eventually every part is predicted using a sample excluding that part. Such a CV is called the K-fold CV. Unfortunately, CV substantially increases the computational burden. We developed an alternative method, the HAT method, to replace CV. The new method corrects the estimated residual errors from the whole-sample analysis using the leverage values of a hat matrix of the random effects to obtain the predicted residual errors. Properties of the HAT method were investigated using seven agronomic and 1000 metabolomic traits of an inbred rice population. Results showed that the HAT method is a very good approximation of the CV method. The method was also applied to 10 traits in 1495 hybrid rice with 1.6 million SNPs, and to human height of 6161 subjects with roughly 0.5 million SNPs from the Framingham heart study data. Predictabilities of the HAT and CV methods were all similar. The HAT method allows us to easily evaluate the predictabilities of genomic prediction for large numbers of traits in very large populations.
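    The leverage correction at the heart of the HAT method is easiest to see in ordinary least squares, where it is exact: the leave-one-out (PRESS) residual equals the whole-sample residual divided by 1 - h_ii, with h_ii the leverage from the hat matrix H = X(X'X)^{-1}X'. The sketch below verifies this identity on simulated data; the mixed-model version in the abstract uses the leverages of the random effects instead, where the correction is an approximation rather than exact.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 5
X = rng.normal(size=(n, p))
y = X @ rng.normal(size=p) + rng.normal(size=n)    # simulated linear model

H = X @ np.linalg.solve(X.T @ X, X.T)              # hat matrix H = X(X'X)^{-1}X'
e = y - H @ y                                      # whole-sample residuals
press_hat = e / (1 - np.diag(H))                   # leverage-corrected residuals

# Brute-force leave-one-out residuals for comparison.
press_loo = np.array([
    y[i] - X[i] @ np.linalg.lstsq(np.delete(X, i, 0),
                                  np.delete(y, i), rcond=None)[0]
    for i in range(n)
])
print(np.allclose(press_hat, press_loo))           # True: the shortcut is exact
```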

    Calculation of Weibull strength parameters and Batdorf flaw-density constants for volume- and surface-flaw-induced fracture in ceramics

    The calculation of the shape and scale parameters of the two-parameter Weibull distribution is described using least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics, with complete and censored samples. Detailed procedures are given for evaluating 90 percent confidence intervals for maximum likelihood estimates of the shape and scale parameters, the unbiased estimates of the shape parameters, and the Weibull mean values and corresponding standard deviations. Furthermore, the necessary steps are described for detecting outliers and for calculating the Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull distribution. The report also shows how to calculate the Batdorf flaw-density constants by using the Weibull distribution statistical parameters. The techniques described were verified with several example problems from the open literature and were coded in the Structural Ceramics Analysis and Reliability Evaluation (SCARE) design program.
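    For a flavour of the basic fitting step, the illustrative Python sketch below (not the SCARE program) performs a maximum-likelihood fit of the two-parameter Weibull shape and scale to a complete strength sample, followed by the Kolmogorov-Smirnov goodness-of-fit check the abstract describes. The simulated strengths, parameter values and sample size are assumptions.

```python
import numpy as np
from scipy.stats import weibull_min, kstest

rng = np.random.default_rng(0)
true_shape, true_scale = 8.0, 350.0     # assumed Weibull modulus and MPa-scale
strengths = weibull_min.rvs(true_shape, scale=true_scale,
                            size=60, random_state=rng)

# Two-parameter MLE: fix the location at zero so only shape and scale are fit.
shape_hat, _, scale_hat = weibull_min.fit(strengths, floc=0)
print(f"shape (Weibull modulus) = {shape_hat:.2f}, scale = {scale_hat:.1f}")

# Kolmogorov-Smirnov goodness-of-fit test against the fitted distribution.
stat, pval = kstest(strengths, weibull_min(shape_hat, scale=scale_hat).cdf)
print(f"Kolmogorov-Smirnov: D = {stat:.3f}, p = {pval:.3f}")
```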
    • …