
    Implementing lifetime performance index of products from type-II right-censored data using Lomax distribution

    Process capability analysis has been widely applied in the manufacturing industry to monitor the performance of industrial processes. The lifetime performance index CL, where L is the lower specification limit, is used to assess the performance and potential of a process. When the product lifetime follows a two-parameter Lomax distribution, this study applies a transformation technique to construct the maximum likelihood estimator (MLE) of CL based on type-II right-censored data. The MLE of CL is then used to develop a new hypothesis testing procedure for the case where the lower specification limit is known. Finally, an example and a Monte Carlo simulation are given to assess the behavior of the proposed method at a given significance level.
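The transformation approach in this abstract can be sketched as follows, assuming two standard results (not stated in the abstract itself): if X is Lomax with shape alpha and scale beta, then Y = ln(1 + X/beta) is exponential with rate alpha; and for an exponential lifetime with rate lambda and known lower limit L, the lifetime performance index is CL = 1 - lambda*L, with the type-II censored MLE of lambda being r divided by the total time on test. Function names are illustrative, not from the paper.

```python
import math

def mle_rate_type2(censored_sorted, n):
    """MLE of the exponential rate from the r smallest of n lifetimes
    (type-II right censoring): rate = r / total time on test, where the
    n - r censored units each contribute the largest observed value."""
    r = len(censored_sorted)
    ttt = sum(censored_sorted) + (n - r) * censored_sorted[-1]
    return r / ttt

def cl_hat_lomax(x_censored_sorted, n, beta, L):
    """Sketch of the MLE of CL for Lomax(alpha, beta) lifetimes with known
    beta and known lower specification limit L on the transformed scale.
    Transform X -> Y = ln(1 + X/beta), which is exponential with rate alpha,
    then use CL = 1 - alpha * L for exponential lifetimes."""
    y = [math.log(1.0 + x / beta) for x in x_censored_sorted]
    alpha_hat = mle_rate_type2(y, n)
    return 1.0 - alpha_hat * L
```

For instance, with the three smallest of five transformed lifetimes equal to 1, 2, 3, the total time on test is 1 + 2 + 3 + 2*3 = 12 and the rate estimate is 3/12 = 0.25.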

    A Multiple Dependent State Repetitive Sampling Plan Based on Performance Index for Lifetime Data with Type II Censoring

    In this paper, a multiple dependent state repetitive (MDSR) sampling plan based on the lifetime performance index CL is proposed for lifetime data with type II censoring, when the lifetime of a product follows the exponential or Weibull distribution. The optimal parameters of the proposed plan are determined by minimizing the average sample number while satisfying the producer's risk and the consumer's risk at the corresponding quality levels. In addition, the performance of the proposed plan is compared with that of an existing lifetime sampling plan in terms of the average sample number and the operating characteristic curve. Two illustrative examples are given to demonstrate the proposed plan.
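The risk constraints used to optimize the plan above can be illustrated on a much simpler object: a generic single-sampling plan (n, c), not the MDSR plan itself. The sketch below searches for the smallest sample size whose operating characteristic satisfies both the producer's risk at the acceptable quality level and the consumer's risk at the limiting quality level; all parameter values are illustrative.

```python
from math import comb

def accept_prob(n, c, p):
    """OC curve of a single-sampling plan (n, c):
    P(accept) = P(at most c defectives in a sample of n), binomial."""
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(c + 1))

def smallest_plan(aql, lql, alpha=0.05, beta=0.10, n_max=500):
    """Smallest n (with some acceptance number c) meeting both risks:
    P(accept | p = aql) >= 1 - alpha  (producer's risk alpha)
    P(accept | p = lql) <= beta       (consumer's risk beta)."""
    for n in range(1, n_max + 1):
        for c in range(n + 1):
            if (accept_prob(n, c, aql) >= 1 - alpha
                    and accept_prob(n, c, lql) <= beta):
                return n, c
    return None
```

The MDSR plan adds repetitive sampling and dependence on neighbouring lots, which is what allows it to reduce the average sample number below what a single-sampling plan of this kind needs.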

    A family of group chain acceptance sampling plans based on truncated life test

    Acceptance sampling is a statistical quality control procedure used to accept or reject a lot based on the inspection results of a sample. For high-quality products, a zero acceptance number is used and the life test is often terminated at a specified time, hence the name truncated life test. A plan with a zero acceptance number is deemed unfair to producers, because the probability of lot acceptance drops drastically at a very small proportion defective. To overcome this problem, chain sampling, which uses information from preceding and succeeding lots, was introduced. In ordinary chain sampling plans, only one product is inspected at a time, although in practice testers can accommodate multiple products simultaneously. In this situation, a group chain sampling plan with a small sample size is preferred because it saves inspection time and cost. It is therefore worthwhile to develop various types of chain sampling plans in the context of group testing. This research aims to develop new group chain (GChSP), modified group chain (MGChSP), two-sided group chain (TS-GChSP) and modified two-sided group chain (TS-MGChSP) sampling plans using the Pareto distribution of the 2nd kind. These four plans are also generalized based on several pre-specified values of the proportion defective. The study involves four phases: identifying combinations of design parameters; developing the procedures; obtaining operating characteristic functions; and measuring performance using both simulated and real lifetime data. The constructed plans are evaluated under various design parameters and compared with the established plan in terms of the minimum number of groups and the probability of lot acceptance. The findings show that all the proposed plans require a smaller minimum number of groups and give a lower probability of lot acceptance than the established plan. All the plans reduce inspection time and cost, and are better at protecting customers from receiving defective products. This would be very beneficial to practitioners, especially those involved in destructive testing of high-quality products.
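Two building blocks of the plans above can be sketched directly: the probability that an item fails before the truncation time under the Pareto distribution of the 2nd kind (Lomax), and the lot acceptance probability of a plain zero-acceptance group plan. The chain variants developed in the research modify the second formula by conditioning on neighbouring lots; the version below is only the non-chain baseline, and the parameter values in the test are illustrative.

```python
def pareto2_fail_prob(t, alpha, beta):
    """Probability an item fails by the truncation time t when lifetime
    follows a Pareto distribution of the 2nd kind (Lomax):
    F(t) = 1 - (1 + t/beta)**(-alpha)."""
    return 1.0 - (1.0 + t / beta) ** (-alpha)

def accept_prob_zero(g, r, p):
    """Lot acceptance probability for a zero-acceptance group plan with
    g groups of r items each: the lot is accepted iff none of the g*r
    items fails before the truncation time."""
    return (1.0 - p) ** (g * r)
```

For example, with alpha = beta = 1 and truncation time t = 1, an item fails before termination with probability 1 - (1 + 1)^(-1) = 0.5.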

    Order-statistics-based inferences for censored lifetime data and financial risk analysis

    This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. The thesis focuses on applying order-statistics-based inferences to lifetime analysis and financial risk measurement. The first problem arises from fitting the Weibull distribution to progressively censored and accelerated life-test data. A new order-statistics-based inference is proposed for both parameter and confidence interval estimation. The second problem can be summarised as adapting the inference used in the first problem to fitting the generalised Pareto distribution, especially when the sample size is small. With some modifications, the proposed inference is compared with classical methods and several relatively new methods that have emerged from the recent literature. The third problem studies a distribution-free approach to forecasting financial volatility, which is essentially the standard deviation of financial returns. Classical models of this approach use the interval between two symmetric extreme quantiles of the return distribution as a proxy for volatility. Two new models are proposed, which use intervals of expected shortfalls and of expectiles instead of intervals of quantiles. The different models are compared on empirical stock index data. Finally, attention is drawn to heteroskedastic quantile regression. The proposed joint modelling approach, which makes use of the parametric link between quantile regression and the asymmetric Laplace distribution, provides estimates of the regression quantile and of the log-linear heteroskedastic scale simultaneously. Furthermore, the use of the expectation of the check function as a measure of quantile deviation is discussed.
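The expectile-interval proxy mentioned in the abstract rests on the sample expectile, which, unlike a quantile, weights squared deviations asymmetrically. A minimal sketch (not the thesis's own estimator) computes a tau-expectile by iteratively reweighted means: observations above the current value get weight tau, those at or below it weight 1 - tau, so tau = 0.5 recovers the ordinary mean.

```python
def expectile(xs, tau, iters=100):
    """Sample tau-expectile by iteratively reweighted means.
    Fixed point of m = sum(w_i * x_i) / sum(w_i), where
    w_i = tau if x_i > m else 1 - tau."""
    m = sum(xs) / len(xs)
    for _ in range(iters):
        w = [tau if x > m else 1.0 - tau for x in xs]
        m_new = sum(wi * xi for wi, xi in zip(w, xs)) / sum(w)
        if abs(m_new - m) < 1e-12:
            break
        m = m_new
    return m
```

A volatility proxy of the kind described would then be the width of the interval between a high-tau and a low-tau expectile of the return sample.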

    VI Workshop on Computational Data Analysis and Numerical Methods: Book of Abstracts

    The VI Workshop on Computational Data Analysis and Numerical Methods (WCDANM) will be held on June 27-29, 2019, in the Department of Mathematics of the University of Beira Interior (UBI), Covilhã, Portugal. It is a unique opportunity to disseminate scientific research related to Mathematics in general, with particular relevance to Computational Data Analysis and Numerical Methods in theoretical and/or practical fields, using new techniques and giving special emphasis to applications in Medicine, Biology, Biotechnology, Engineering, Industry, Environmental Sciences, Finance, Insurance, Management and Administration. The meeting will provide a forum for discussing and debating ideas of interest to the scientific community at large. New scientific collaborations among colleagues, namely in Masters and PhD projects, are expected to emerge from the meeting. The event is open to the entire scientific community (with or without a communication/poster).

    Volatility forecasting

    Volatility has been one of the most active and successful areas of research in time series econometrics and economic forecasting in recent decades. This chapter provides a selective survey of the most important theoretical developments and empirical insights to emerge from this burgeoning literature, with a distinct focus on forecasting applications. Volatility is inherently latent, and Section 1 begins with a brief intuitive account of various key volatility concepts. Section 2 then discusses a series of different economic situations in which volatility plays a crucial role, ranging from the use of volatility forecasts in portfolio allocation to density forecasting in risk management. Sections 3, 4 and 5 present a variety of alternative procedures for univariate volatility modeling and forecasting based on the GARCH, stochastic volatility and realized volatility paradigms, respectively. Section 6 extends the discussion to the multivariate problem of forecasting conditional covariances and correlations, and Section 7 discusses volatility forecast evaluation methods in both univariate and multivariate cases. Section 8 concludes briefly. JEL Classification: C10, C53, G1
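The GARCH paradigm surveyed in Section 3 can be illustrated in its simplest form, GARCH(1,1), where tomorrow's conditional variance is a weighted combination of a constant, today's squared return, and today's variance. The sketch below only runs the forecast recursion for given parameters (omega, alpha, beta); in practice those parameters would be estimated by maximum likelihood, and the initialisation at the sample variance is one common convention among several.

```python
def garch11_forecast(returns, omega, alpha, beta):
    """One-step-ahead conditional variance from a GARCH(1,1) recursion:
    sigma2_{t+1} = omega + alpha * r_t**2 + beta * sigma2_t,
    initialised at the sample variance of the return series."""
    n = len(returns)
    mean = sum(returns) / n
    sigma2 = sum((r - mean) ** 2 for r in returns) / n
    for r in returns:
        sigma2 = omega + alpha * r ** 2 + beta * sigma2
    return sigma2
```

Stationarity of the recursion requires alpha + beta < 1, in which case the long-run variance is omega / (1 - alpha - beta).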

    Data Analysis and Experimental Design for Accelerated Life Testing with Heterogeneous Group Effects

    In accelerated life tests (ALTs), complete randomization is hardly achievable because of economic and engineering constraints. Typical experimental protocols in ALTs, such as subsampling or random blocks, produce a grouped structure, which leads to correlated lifetime observations. In this dissertation, a generalized linear mixed model (GLMM) approach is proposed to analyze ALT data and to find the optimal ALT design while accounting for heterogeneous group effects. Two types of ALTs are used to demonstrate the data analysis. First, constant-stress ALT (CSALT) data with a Weibull failure-time distribution are modeled by a GLMM. The marginal likelihood of the observations is approximated by a quadrature rule, and the maximum likelihood (ML) estimation method is applied in an iterative fashion to estimate the unknown parameters, including the variance component of the random effect. Second, step-stress ALT (SSALT) data with random group effects are analyzed in a similar manner, but under the assumption of exponentially distributed failure times within each stress step. Two parameter estimation methods, from the frequentist and Bayesian points of view, are applied, and they are compared with traditional models through a simulation study and a real example of heterogeneous SSALT data. The proposed random-effect model is superior in terms of reducing bias and variance in the estimation of the life-stress relationship. The GLMM approach is particularly useful for the optimal experimental design of ALTs that take random group effects into account. Specifically, planning ALTs under a nested design structure with random test-chamber effects is studied. A greedy two-phased approach shows that different assignments of test chambers to stress conditions substantially impact the estimation of the unknown parameters. The D-optimal test plan with two test chambers is then constructed by applying the quasi-likelihood approach. Lastly, the optimal ALT planning is extended to the case of multiple sources of random effects, so that the crossed design structure is considered along with the nested structure.
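The grouped structure this dissertation models can be made concrete by simulating it: each group (for example, a test chamber) shares one random effect on the log-scale of the lifetime distribution, so lifetimes within a group are correlated. The sketch below is a generic generator of that structure, not the dissertation's model; all names, the log-link form, and the parameter values are illustrative assumptions.

```python
import math
import random

def simulate_grouped_alt(n_groups, per_group, stress_levels,
                         b0, b1, sigma_u, shape, seed=0):
    """Simulate constant-stress ALT lifetimes with a shared random effect
    per group, GLMM-style:
        log(scale_g) = b0 + b1 * stress_g + u_g,  u_g ~ N(0, sigma_u**2),
        lifetime ~ Weibull(shape, scale_g).
    Returns (group_id, stress, lifetime) tuples."""
    rng = random.Random(seed)
    data = []
    for g in range(n_groups):
        u = rng.gauss(0.0, sigma_u)              # group-level random effect
        s = stress_levels[g % len(stress_levels)]
        scale = math.exp(b0 + b1 * s + u)        # log-link for the scale
        for _ in range(per_group):
            t = rng.weibullvariate(scale, shape) # scale, then shape
            data.append((g, s, t))
    return data
```

Fitting a GLMM to such data then means integrating the group effects u_g out of the likelihood, e.g. by the quadrature rule mentioned in the abstract.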