
    Measuring process capability for bivariate non-normal process using the bivariate burr distribution

    As is well known, process capability analysis for more than one quality variable is a complicated and sometimes contentious area, with several quality measures vying for recognition. When these variables exhibit non-normal characteristics, the situation becomes even more complex. The aim of this paper is to measure Process Capability Indices (PCIs) for a bivariate non-normal process using the bivariate Burr distribution. The univariate Burr distribution has been shown to improve the accuracy of estimates of PCIs for univariate non-normal distributions (see, for example, [7] and [16]). Here, we estimate the PCIs of bivariate non-normal distributions using the bivariate Burr distribution. These PCIs are obtained in a series of steps involving estimation of the unknown process parameters by maximum likelihood coupled with simulated annealing. Finally, the Proportion of Non-Conformance (PNC) obtained using this method is compared with that obtained when the variables follow bivariate Beta, Weibull, Gamma and Weibull-Gamma distributions.
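
    The sketch below illustrates the general workflow described in the abstract, not the paper's implementation: it assumes a Durling-type bivariate Burr XII density, uses scipy's dual_annealing as the simulated-annealing step for maximum likelihood, and estimates the PNC over an illustrative specification rectangle.

```python
# Sketch (not the paper's method): fit an assumed Durling-type bivariate
# Burr XII density by maximum likelihood using a simulated-annealing-style
# optimiser, then estimate the Proportion of Non-Conformance (PNC) as the
# probability mass outside an illustrative specification rectangle.
import numpy as np
from scipy.optimize import dual_annealing
from scipy.integrate import dblquad

rng = np.random.default_rng(0)
# Illustrative positive, right-skewed data standing in for two quality variables.
x = rng.weibull(1.5, 500) * 2.0
y = 0.5 * x + rng.gamma(2.0, 0.5, 500)

def neg_log_lik(theta, x, y):
    """Negative log-likelihood of the assumed bivariate Burr XII form
    f(x, y) = c1*c2*k*(k+1) * x**(c1-1) * y**(c2-1) * (1 + x**c1 + y**c2)**-(k+2)."""
    c1, c2, k = theta
    z = 1.0 + x**c1 + y**c2
    ll = (np.log(c1) + np.log(c2) + np.log(k) + np.log(k + 1)
          + (c1 - 1) * np.log(x) + (c2 - 1) * np.log(y)
          - (k + 2) * np.log(z))
    return -np.sum(ll)

# Simulated annealing (scipy's dual_annealing) searches the parameter box.
bounds = [(0.1, 10.0)] * 3
res = dual_annealing(neg_log_lik, bounds, args=(x, y), maxiter=200)
c1, c2, k = res.x

def pdf(yv, xv):  # dblquad integrates f(y, x)
    return (c1 * c2 * k * (k + 1) * xv**(c1 - 1) * yv**(c2 - 1)
            * (1 + xv**c1 + yv**c2) ** (-(k + 2)))

# Illustrative specification rectangle [L1, U1] x [L2, U2].
L1, U1, L2, U2 = 0.05, 6.0, 0.05, 5.0
p_conform, _ = dblquad(pdf, L1, U1, lambda _: L2, lambda _: U2)
print("fitted (c1, c2, k):", res.x)
print("estimated PNC:", 1.0 - p_conform)
```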

    Capability Indices for Non-Normal Distribution using Gini’s Mean Difference as Measure of Variability

    This paper investigates the efficiency of Gini's mean difference (GMD) as a measure of variability in two commonly used process capability indices (PCIs), Cp and Cpk. A comparison is carried out to evaluate the performance of GMD-based PCIs against the quantile-based PCIs of Pearn and Chen under low, moderate and high asymmetry using the Weibull distribution. The simulation results indicate that, under low and moderate asymmetry, the GMD-based PCIs are closer to the target values than the quantile-based ones. Besides point estimation, nonparametric bootstrap confidence intervals (standard, percentile, and bias-corrected percentile) and their coverage probabilities are also calculated. Under the quantile approach, the bias-corrected percentile bootstrap (BCPB) method is more effective for both Cp and Cpk, whereas for GMD both the BCPB and the percentile bootstrap methods can be used to estimate the confidence intervals of Cp and Cpk, respectively.
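
    A minimal sketch of the idea above: GMD as the spread measure in Cp and Cpk, scaled by the usual normal-theory convention sigma ≈ (sqrt(pi)/2)·GMD, plus a percentile bootstrap interval. The Weibull sample and specification limits are illustrative, not the paper's simulation settings.

```python
# Sketch: Gini's mean difference (GMD) as the variability measure in Cp/Cpk,
# with a nonparametric percentile bootstrap interval for Cpk.
import numpy as np

rng = np.random.default_rng(42)
data = rng.weibull(2.0, 200) * 10.0       # illustrative skewed process data
LSL, USL = 1.0, 20.0                      # illustrative specification limits

def gmd(x):
    """Gini's mean difference: average absolute difference over all pairs."""
    x = np.asarray(x)
    n = len(x)
    return np.abs(x[:, None] - x[None, :]).sum() / (n * (n - 1))

def gmd_pcis(x, lsl, usl):
    sigma = np.sqrt(np.pi) / 2.0 * gmd(x)  # GMD-based spread (normal-theory scaling)
    mu = x.mean()
    cp = (usl - lsl) / (6.0 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3.0 * sigma)
    return cp, cpk

cp_hat, cpk_hat = gmd_pcis(data, LSL, USL)

# Percentile bootstrap interval for Cpk.
boot = np.array([gmd_pcis(rng.choice(data, size=len(data), replace=True),
                          LSL, USL)[1] for _ in range(2000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"Cp = {cp_hat:.3f}, Cpk = {cpk_hat:.3f}, 95% CI for Cpk: ({lo:.3f}, {hi:.3f})")
```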

    Forecasting Long-Term Government Bond Yields: An Application of Statistical and AI Models

    This paper evaluates several artificial intelligence and classical algorithms on their ability to forecast the monthly yield of US 10-year Treasury bonds from a set of four economic indicators. Due to the complexity of the prediction problem, the task represents a challenging test for the algorithms under evaluation. At the same time, the study is of particular significance because of the important and paradigmatic role played by the US market in the world economy. Four data-driven artificial intelligence approaches are considered: a manually built fuzzy logic model, a machine-learned fuzzy logic model, a self-organising map model and a multi-layer perceptron model. Their performance is compared with that of two classical approaches, namely a statistical ARIMA model and an econometric error-correction model. The algorithms are evaluated on a complete series of end-of-month US 10-year Treasury bond yields and economic indicators from 1986:1 to 2004:12. In terms of prediction accuracy and reliability of the modelling procedure, the best results are obtained by the three parametric regression algorithms, namely the econometric, the statistical and the multi-layer perceptron models. Owing to the sparseness of the learning data samples, the manual and automatic fuzzy logic approaches fail to follow the range of variation of the US 10-year Treasury bonds with adequate precision. For similar reasons, the self-organising map model gives an unsatisfactory performance. Analysis of the results indicates that the econometric model has a slight edge over the statistical and multi-layer perceptron models. This suggests that pure data-driven induction may not fully capture the complicated mechanisms governing changes in interest rates. Overall, the prediction accuracy of the best models is only marginally better than that of a basic one-step lag predictor. This result highlights the difficulty of the modelling task and, more generally, of building reliable predictors for financial markets.
    Keywords: interest rates; forecasting; neural networks; fuzzy logic.
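
    For illustration only (synthetic data, not the paper's bond-yield series), the sketch below compares a multi-layer perceptron fed with lagged indicators against the naive one-step-lag benchmark mentioned in the abstract; scikit-learn's MLPRegressor stands in for the paper's neural model.

```python
# Sketch: compare an MLP regressor on lagged indicators with the naive
# one-step-lag predictor used as a benchmark. The series is synthetic;
# it is not the US 10-year Treasury data used in the paper.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
T = 240                                                  # 20 years of monthly data
yields = 5 + np.cumsum(rng.normal(0, 0.15, T))           # synthetic yield level
indicators = np.column_stack([np.roll(yields, 1) + rng.normal(0, 0.1, T)
                              for _ in range(4)])        # 4 noisy lagged indicators

# Predict the yield at t from indicators observed one month earlier.
X, y = indicators[1:-1], yields[2:]
split = int(0.8 * len(y))
X_tr, X_te, y_tr, y_te = X[:split], X[split:], y[:split], y[split:]

mlp = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
mlp.fit(X_tr, y_tr)
rmse_mlp = mean_squared_error(y_te, mlp.predict(X_te)) ** 0.5

# Naive benchmark: forecast this month's yield with last month's value.
rmse_lag = mean_squared_error(y_te, yields[1:-1][split:]) ** 0.5
print(f"MLP RMSE: {rmse_mlp:.3f}  one-step-lag RMSE: {rmse_lag:.3f}")
```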

    Process capability index Cpk for monitoring the thermal performance in the distribution of refrigerated products

    The temperature of refrigerated products along the cold chain must be kept within pre-defined limits to ensure adequate safety levels and high product quality. Because temperature largely influences microbial activity, continuous monitoring of the time-temperature history over the distribution process usually allows adequate control of product quality along both short- and medium-distance distribution routes. Time-Temperature Indicators (TTI) consist of temperature measurements taken at various time intervals and are used to feed analytic models that monitor the impact of temperature on product quality. Process Capability Indices (PCIs) are then calculated from the TTI series to evaluate whether the thermal characteristics of the process are within the specified range. In this application, a refrigerated food delivery route is investigated using a simulated annealing algorithm that considers alternative delivery schemes. The objective is to minimize the distance traveled while maintaining the vehicle temperature within the prescribed capability level.
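
    A minimal sketch of the Cpk calculation applied to a TTI temperature series; the temperature band and the simulated readings are illustrative, not taken from the study.

```python
# Sketch: compute Cp/Cpk for a Time-Temperature Indicator (TTI) series against
# prescribed temperature limits, as a capability check on the cold chain.
import numpy as np

rng = np.random.default_rng(7)
# Simulated in-vehicle temperature readings (deg C), one per time interval.
tti = rng.normal(loc=3.0, scale=0.8, size=120)

LSL, USL = 0.0, 7.0                 # illustrative prescribed temperature band

mu, sigma = tti.mean(), tti.std(ddof=1)
cp = (USL - LSL) / (6 * sigma)
cpk = min(USL - mu, mu - LSL) / (3 * sigma)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")  # Cpk >= 1.33 is a common capability target
```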

    Disaster management in industrial areas: perspectives, challenges and future research

    Purpose: In most countries, the development, growth and sustenance of industrial facilities are given utmost importance because of their influence on the socio-economic development of the country. Special economic zones, industrial areas or industrial cities are therefore developed to provide the services required for the sustained operation of such facilities. These facilities not only provide prolonged economic support to the country but also contribute socially by providing livelihoods to thousands of people. Consequently, a disaster at any facility in an industrial area has a significant impact on the population, the other facilities and the economy, and threatens the sustainability of operations. This paper reviews the literature on the theory and practice of disaster management in industrial cities. Design/methodology/approach: A content analysis method is used to elicit insights from the available literature. The methodology comprises literature search, segregation of the literature, and synthesis of current knowledge on the different phases of industrial disaster management. Findings: Research has been carried out in all phases of disaster management, namely the preventive, reactive and corrective phases. Within each phase, the research focuses on four main aspects: facilities, resources, support systems and modeling. Nevertheless, research specific to industrial cities is scarce. Moreover, the modeling work does not explicitly consider the nature of industrial cities, where many chemicals and chemical processes can be highly flammable and thus create a very large disaster impact. Some research focuses on an individual plant and is then scaled up to industrial cities. The modeling work is weak in terms of comprehensively analyzing and assisting disaster management in industrial cities. Originality/value: A comprehensive review of disaster management using content analysis is presented. The review helps researchers understand the gaps in the literature and extend research on disaster management in large-scale industrial cities.

    An Analytic Hierarchy Process approach to assess health service quality

    While improving quality in health care is currently at the forefront of professional, political and managerial attention, the key dimensions constituting health-care quality are not yet fully understood, and few valid approaches have been proposed for measuring health-care quality. In this research, the Analytic Hierarchy Process (AHP) is applied to study the structure of health-care quality and to deduce relative importance weights for each of the quality elements. A statistical quality model is derived to assess medical equipment quality, which is an important component of overall health-care quality. Finally, the application of the AHP model to assessing health-care quality is demonstrated through a scenario.
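
    A minimal sketch of the AHP mechanics referred to above: priority weights obtained from the principal eigenvector of a pairwise comparison matrix, with Saaty's consistency ratio as a check. The 4x4 matrix comparing four hypothetical health-care quality criteria is illustrative.

```python
# Sketch: derive AHP priority weights from a pairwise comparison matrix via
# the principal eigenvector, and check consistency with Saaty's CR.
import numpy as np

# Pairwise comparisons of four hypothetical quality criteria (Saaty's 1-9 scale).
A = np.array([[1,   3,   5,   7],
              [1/3, 1,   3,   5],
              [1/5, 1/3, 1,   3],
              [1/7, 1/5, 1/3, 1]], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                   # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
weights = w / w.sum()                         # normalised priority weights

n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)               # consistency index
RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}     # Saaty's random index values
cr = ci / RI[n]                               # consistency ratio (< 0.10 acceptable)
print("weights:", np.round(weights, 3), " CR:", round(cr, 3))
```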
