
    Warranty Data Analysis: A Review

    Warranty claims and supplementary data contain useful information about product quality and reliability. Analysing such data can therefore benefit manufacturers by identifying early warnings of abnormalities in their products, providing useful information about failure modes to aid design modification, estimating product reliability to inform warranty policy, and forecasting the future warranty claims needed to prepare fiscal plans. In the last two decades, considerable research has been conducted in warranty data analysis (WDA) from several different perspectives. This article attempts to summarise and review the research and developments in WDA, with emphasis on models, methods and applications. It concludes with a brief discussion of current practices and possible future trends in WDA.

    Technology and Technometrics approaches

    Technological innovation is nowadays one of the most important determinants of the wealth of nations. Souder and Shrivastava said: “we can’t begin to make decisions about technology until we understand it. And we can’t begin to really understand it until we can measure it”. For this reason a new branch of economics called Technometrics was born: a new theoretical framework for the conception and measurement of technological change, with important policy implications (Sahal, 1985). The aim of this paper is, after introducing the concepts of technological innovation as used by economists during the nineteenth and twentieth centuries, to show the historical evolution of the several approaches used to measure and evaluate technology and technological change from 1930 to 2004. A discussion of these approaches shows their methodological difficulties and their potential.
    Keywords: Technometrics, Technology, Technological Change, Patterns of technological innovation, history of economic thought, systemic approach, innovation diffusion

    Calculation of Weibull strength parameters and Batdorf flaw-density constants for volume- and surface-flaw-induced fracture in ceramics

    The calculation of the shape and scale parameters of the two-parameter Weibull distribution is described using least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. Detailed procedures are given for evaluating 90 percent confidence intervals for maximum likelihood estimates of shape and scale parameters, the unbiased estimates of the shape parameters, and the Weibull mean values and corresponding standard deviations. Furthermore, the necessary steps are described for detecting outliers and for calculating the Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull distribution. It is also shown how to calculate the Batdorf flaw-density constants using the Weibull distribution statistical parameters. The techniques described were verified with several example problems from the open literature and were coded in the Structural Ceramics Analysis and Reliability Evaluation (SCARE) design program.
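    To make the described calculation concrete, here is a minimal sketch that fits a two-parameter Weibull distribution to fracture-strength data by maximum likelihood and computes the Weibull mean, standard deviation and Kolmogorov-Smirnov statistic. It uses SciPy rather than the SCARE program, and the strength values are hypothetical placeholders, not data from the paper.

        # Minimal sketch (not the SCARE program): two-parameter Weibull MLE fit
        # plus the Kolmogorov-Smirnov goodness-of-fit statistic. Data are made up.
        import numpy as np
        from scipy import stats
        from scipy.special import gamma

        strengths = np.array([312.0, 355.0, 298.0, 341.0, 327.0,
                              366.0, 305.0, 333.0, 349.0, 320.0])  # MPa, hypothetical

        # Fixing the location at zero (floc=0) reduces the three-parameter
        # Weibull to the two-parameter form with shape and scale only.
        shape, loc, scale = stats.weibull_min.fit(strengths, floc=0)

        # Weibull mean and standard deviation implied by the fitted parameters.
        mean = scale * gamma(1.0 + 1.0 / shape)
        std = scale * np.sqrt(gamma(1.0 + 2.0 / shape) - gamma(1.0 + 1.0 / shape) ** 2)

        # Kolmogorov-Smirnov statistic of the data against the fitted model.
        ks = stats.kstest(strengths, "weibull_min", args=(shape, loc, scale))

        print(f"shape m = {shape:.2f}, scale = {scale:.1f} MPa")
        print(f"Weibull mean = {mean:.1f} MPa, std = {std:.1f} MPa")
        print(f"KS statistic = {ks.statistic:.3f}, p-value = {ks.pvalue:.3f}")

    Confidence intervals, censored samples, outlier detection and the Batdorf flaw-density constants require the fuller procedures the paper details.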

    Minimizing Negative Transfer of Knowledge in Multivariate Gaussian Processes: A Scalable and Regularized Approach

    Recently there has been increasing interest in the multivariate Gaussian process (MGP), which extends the Gaussian process (GP) to deal with multiple outputs. One approach to constructing the MGP and accounting for non-trivial commonalities amongst outputs employs a convolution process (CP). The CP is based on the idea of sharing latent functions across several convolutions. Despite the elegance of the CP construction, it raises new challenges that have yet to be tackled. First, even with a moderate number of outputs, model building is extremely prohibitive due to the huge increase in computational demands and in the number of parameters to be estimated. Second, negative transfer of knowledge may occur when some outputs do not share commonalities. In this paper we address these issues. We propose a regularized pairwise modeling approach for the MGP established using the CP. The key feature of our approach is to distribute the estimation of the full multivariate model into a group of bivariate GPs which are individually built. Interestingly, pairwise modeling turns out to possess unique characteristics, which allow us to tackle the challenge of negative transfer by penalizing the latent function that facilitates information sharing in each bivariate model. Predictions are then made by combining predictions from the bivariate models within a Bayesian framework. The proposed method has excellent scalability when the number of outputs is large and minimizes the negative transfer of knowledge between uncorrelated outputs. Statistical guarantees for the proposed method are studied and its advantageous features are demonstrated through numerical studies.
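    A heavily simplified sketch of the pairwise idea follows: the target output is paired with every other output, one small bivariate GP is fit per pair, and the per-pair predictions are pooled. This is not the paper's method: the convolution-process kernel is replaced by an ordinary RBF kernel over the input augmented with a 0/1 output index, the regularization that suppresses negative transfer is omitted, and the Bayesian combination is approximated by inverse-variance weighting. Data and names are illustrative.

        # Pairwise sketch: one bivariate GP per (target, other) pair, predictions
        # pooled by inverse-variance weighting. A stand-in, not the paper's model.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(0)
        x = rng.uniform(0, 5, size=(30, 1))
        outputs = [np.sin(x).ravel() + 0.05 * rng.normal(size=30),       # target
                   np.sin(x + 0.1).ravel() + 0.05 * rng.normal(size=30), # correlated
                   rng.normal(size=30)]                                  # unrelated

        def fit_pair(x, y_target, y_other):
            """Fit one bivariate GP on the stacked target/other data, with an
            appended 0/1 column marking which output each row belongs to."""
            X = np.vstack([np.hstack([x, np.zeros_like(x)]),
                           np.hstack([x, np.ones_like(x)])])
            y = np.concatenate([y_target, y_other])
            return GaussianProcessRegressor(RBF([1.0, 1.0]) + WhiteKernel(0.01)).fit(X, y)

        x_new = np.array([[2.5]])
        X_new = np.hstack([x_new, np.zeros_like(x_new)])  # query the target output
        means, variances = [], []
        for y_other in outputs[1:]:
            m, s = fit_pair(x, outputs[0], y_other).predict(X_new, return_std=True)
            means.append(m[0]); variances.append(s[0] ** 2)

        # Pool the bivariate predictions; less certain pairs get less weight.
        w = 1.0 / np.array(variances)
        print("combined prediction:", np.sum(w * np.array(means)) / np.sum(w))

    Because each pair is fit independently, the per-pair models can be built in parallel, which is the source of the scalability the abstract claims for large numbers of outputs.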

    Kernelized design of experiments

    This paper describes an approach for selecting instances in regression problems in cases where observations x are readily available but obtaining labels y is hard. Given a database of observations, an algorithm inspired by statistical design of experiments and kernel methods is presented that selects a set of k instances to be labelled in order to maximize the prediction performance of a support vector machine. It is shown that the algorithm significantly outperforms related approaches on a number of real-world datasets.
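    As a rough illustration of such an algorithm, the sketch below picks k instances with a D-optimal-style greedy rule in kernel space (maximizing the log-determinant of the selected kernel submatrix), then requests labels only for those instances and trains a support vector regressor. The paper's exact selection criterion may differ; this is a generic kernelized design-of-experiments rule on made-up data.

        # Greedy kernel-space D-optimal selection, then label and train an SVR.
        # A generic sketch of the idea, not the paper's exact algorithm.
        import numpy as np
        from sklearn.metrics.pairwise import rbf_kernel
        from sklearn.svm import SVR

        rng = np.random.default_rng(1)
        X = rng.uniform(-3, 3, size=(200, 2))                 # observations, labels unknown
        K = rbf_kernel(X, gamma=0.5) + 1e-6 * np.eye(len(X))  # jitter for stability

        def greedy_d_optimal(K, k):
            """Greedily add the instance that most increases log det K[S, S]."""
            selected = []
            for _ in range(k):
                best, best_val = None, -np.inf
                for i in range(len(K)):
                    if i in selected:
                        continue
                    S = selected + [i]
                    val = np.linalg.slogdet(K[np.ix_(S, S)])[1]
                    if val > best_val:
                        best, best_val = i, val
                selected.append(best)
            return selected

        idx = greedy_d_optimal(K, k=10)
        y_idx = np.sin(X[idx, 0]) * X[idx, 1]                 # oracle: label only the chosen k
        model = SVR(kernel="rbf", gamma=0.5).fit(X[idx], y_idx)
        print("trained SVR on", len(idx), "selected instances")

    The log-determinant objective rewards sets of instances that are mutually dissimilar under the kernel, which is why the selected design spreads over the input space rather than clustering.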