The evaluation of the process capability index CL for exponentiated Fréchet lifetime products under progressive type I interval censoring
We present likelihood inferences on the lifetime performance index CL to evaluate the lifetime performance of products following the skewed exponentiated Fréchet distribution in many manufacturing industries. This research relates to the topic of skewed probability distributions and applications across disciplines. The exponentiated Fréchet distribution is a generalization of several lifetime distributions. The maximum likelihood estimator of CL for lifetimes with the exponentiated Fréchet distribution is derived to develop a computational testing procedure, so that experimenters can test whether the lifetime performance has reached the pre-assigned level with a given lower specification limit under progressive type I interval censoring. Finally, two examples demonstrate the implementation of the algorithm for the proposed computational testing procedure.
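The abstract does not give the distribution's functional form. As a minimal sketch, the code below uses a common parameterization of the exponentiated Fréchet distribution (shape `alpha`, Fréchet shape `lam`, scale `sigma`) and shows inverse-transform sampling of lifetimes; the parameter values are illustrative assumptions, not the authors' settings.

```python
import math
import random

# Exponentiated Fréchet distribution in a common parameterization
# (an assumption here, since the abstract does not state the exact form):
#   F(x) = 1 - [1 - exp(-(sigma/x)**lam)]**alpha,  x > 0

def ef_cdf(x, alpha, lam, sigma):
    return 1.0 - (1.0 - math.exp(-((sigma / x) ** lam))) ** alpha

def ef_invcdf(u, alpha, lam, sigma):
    # Invert F step by step: (1-u)**(1/alpha) = 1 - exp(-(sigma/x)**lam)
    t = -math.log(1.0 - (1.0 - u) ** (1.0 / alpha))
    return sigma * t ** (-1.0 / lam)

def ef_sample(n, alpha, lam, sigma, rng=random):
    # Inverse-transform sampling of n lifetimes
    return [ef_invcdf(rng.random(), alpha, lam, sigma) for _ in range(n)]

# Round trip: the inverse CDF maps a quantile back through the CDF
x = ef_invcdf(0.5, alpha=2.0, lam=1.5, sigma=1.0)
print(abs(ef_cdf(x, 2.0, 1.5, 1.0) - 0.5) < 1e-9)  # True
```

Sampled lifetimes like these would then be grouped into intervals to mimic progressive type I interval censoring before estimating CL.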
A Theoretical Foundation for the Development of Process Capability Indices and Process Parameters Optimization under Truncated and Censoring Schemes
Process capability indices (PCIs) provide a measure of the output of an in-control process that conforms to a set of specification limits. These measures, which assume that process output is approximately normally distributed, are intended for measuring process capability in manufacturing systems. After inspection, however, non-conforming products are typically scrapped when units fail to meet the specification limits; hence, the actual distribution of shipped products that customers perceive is truncated. In this research, a set of customer-perceived PCIs is developed based on the truncated normal distribution, as an extension of traditional manufacturer-based indices. Comparative studies and numerical examples reveal considerable differences between the traditional PCIs and the proposed PCIs. The comparison results suggest using the proposed PCIs for capability analyses when non-conforming products are scrapped prior to shipping to customers. Confidence interval approximations for the proposed PCIs are also developed, and a simulation technique is implemented to compare the proposed PCIs with their traditional counterparts across multiple performance scenarios. Robust parameter design (RPD), a systematic method for determining the optimum operating conditions that achieve quality improvement goals, is also studied within the realm of censored data. Data censoring occurs in time-oriented observations when some data are unmeasurable outside a predetermined study period. The underlying conceptual basis of current RPD studies is random sampling from a normal distribution, assuming that all data points are uncensored. However, censoring schemes are widely implemented in lifetime testing, survival analysis, and reliability studies. As such, this study develops detailed guidelines for a new RPD method that incorporates type I-right censoring concepts.
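A small simulation makes the truncation effect concrete: once non-conforming units are scrapped, an index computed naively from the shipped (truncated) sample overstates capability relative to the full process. This is a minimal sketch, not the dissertation's proposed indices; the process mean, standard deviation, and specification limits below are assumed illustrative values.

```python
import random
import statistics

random.seed(0)

# Illustrative in-control process: normal output with assumed spec limits
MU, SIGMA, L, U = 10.0, 1.0, 8.0, 12.0

output = [random.gauss(MU, SIGMA) for _ in range(100_000)]
shipped = [x for x in output if L <= x <= U]   # non-conforming units scrapped

def cp(sample):
    # Traditional Cp = (U - L) / (6 * sigma_hat), using the sample stdev
    return (U - L) / (6.0 * statistics.stdev(sample))

# Truncation shrinks the spread, so Cp estimated from shipped units
# is inflated relative to Cp estimated from the full process output.
print(cp(output) < cp(shipped))  # True
```

The proposed customer-perceived PCIs correct for exactly this distortion by modeling the shipped distribution as truncated normal rather than normal.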
The response functions are developed using nonparametric methods, including the Kaplan-Meier estimator, Greenwood's formula, and Cox proportional hazards regression. Various response-surface-based robust parameter design optimization models are proposed and demonstrated through a numerical example. Further, a process capability index for type I-right censored data using the nonparametric methods is also developed for assessing the performance of a product based on its lifetime.
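Of the nonparametric tools named above, the Kaplan-Meier product-limit estimator is the workhorse. A minimal sketch on made-up right-censored data (the `(time, event)` pairs are illustrative, with `event=0` marking a censored observation):

```python
# Kaplan-Meier product-limit estimator for right-censored lifetimes.
# Illustrative data only: event=1 is an observed failure, event=0 is
# a type I-right censored unit.
data = [(2, 1), (3, 0), (4, 1), (5, 0), (6, 1)]

def kaplan_meier(data):
    """Return [(t, S(t))] at each observed failure time."""
    data = sorted(data)
    at_risk = len(data)
    surv, curve = 1.0, []
    for t, event in data:
        if event:  # failure observed at t
            surv *= 1.0 - 1.0 / at_risk
            curve.append((t, surv))
        at_risk -= 1  # failures and censored units both leave the risk set
    return curve

print(kaplan_meier(data))
# S drops to 4/5 at t=2, 8/15 at t=4, and 0 at t=6
```

Greenwood's formula would then attach a variance estimate to each S(t), which is what the censored-data PCI and its confidence intervals build on.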
Statistical Emulation for Environmental Sustainability Analysis
The potential effects of climate change on the environment and society are many. To effectively quantify the uncertainty associated with these effects, highly complex simulation models are run with detailed representations of ecosystem processes. These models are computationally expensive and can require runs of several days to produce their outputs. Computationally cheaper models can be obtained from large ensembles of simulations using statistical emulation.
The purpose of this thesis is to construct cheaper computational models (emulators) from simulation outputs of Lund-Potsdam-Jena managed Land (LPJmL), a dynamic global vegetation and crop model. This research is part of the ERMITAGE project, which links together several key component models in a common framework to better understand how the management and interaction of land, water and the Earth's climate system could be improved.
The thesis focuses specifically on emulation of major outputs from the LPJmL model: carbon fluxes (NPP, carbon loss due to heterotrophic respiration, and fire carbon) and potential crop yields (cereal, rice, maize and oil crops). Future decadal changes in carbon fluxes and crop yields are modelled as linear functions of climate change and other relevant variables. The emulators are constructed using a combination of statistical techniques: stepwise least squares regression, principal component analysis, weighted least squares regression, censored regression and Gaussian process regression.
Further modelling involves sensitivity analyses, using the Sobol global sensitivity method, to identify the relative contribution of each input variable to the total output variance. The data cover the period 2001-2100 and comprise climate scenarios from several GCMs and RCPs. Under cross-validation, the percentage of variance explained ranges from 52-96% for carbon fluxes, 60-88% for the rainfed crops and 62-93% for the irrigated crops, averaged over climate scenarios.
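The core emulation idea above can be sketched in a few lines: run an expensive simulator over an ensemble of inputs, then fit a cheap statistical surrogate to the runs. The toy "simulator" and its linear response to temperature and precipitation anomalies are assumptions for illustration only; the thesis combines least squares with stepwise selection, PCA, censored regression and Gaussian processes rather than this bare linear fit.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulator(temp, precip, rng):
    # Stand-in for an expensive model such as LPJmL (illustrative only):
    # pretend NPP responds roughly linearly to climate anomalies plus noise.
    return 5.0 + 0.8 * temp + 1.5 * precip + rng.normal(0.0, 0.1, np.shape(temp))

# Ensemble of "simulator runs" over sampled climate inputs
temp = rng.uniform(-2, 2, size=200)
precip = rng.uniform(-1, 1, size=200)
y = simulator(temp, precip, rng)

# Cheap linear emulator y ~ b0 + b1*temp + b2*precip, fitted by least squares
X = np.column_stack([np.ones_like(temp), temp, precip])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

resid = y - X @ beta
r2 = 1.0 - resid.var() / y.var()
print(np.round(beta, 2), round(r2, 2))  # coefficients near [5.0, 0.8, 1.5]
```

Once fitted, the emulator evaluates in microseconds instead of days, which is what makes ensemble-wide uncertainty quantification and Sobol sensitivity analysis tractable.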