11,465 research outputs found

    Cumulative sum quality control charts design and applications

    Get PDF
    Includes bibliographical references (pages 165-169). Classical statistical process control charts are essential in statistical control exercises and thus constantly attract attention for quality improvement. However, establishing control charts requires large-sample data (say, no fewer than 1,000 data points). On the other hand, the small-sample Grey System Theory approach is well established and applied in many areas: social, economic, industrial, military and scientific research fields. In this research, the short-term trend curve given by the GM(1,1) model is merged into Shewhart and two-sided CUSUM control charts to establish the Grey Predictive Shewhart control chart and the Grey Predictive CUSUM control chart. In addition, the GM(2,1) model is briefly checked for how accurate it can be compared with the GM(1,1) model in control charts. Industrial process data collected from the TBF Packaging Machine Company in Taiwan were analyzed in terms of these new developments as an illustrative example of grey quality control charts
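
    The GM(1,1) forecasting step described above can be sketched as follows. This is a minimal, illustrative implementation of the standard GM(1,1) recipe (accumulated generating operation, least-squares grey parameters, whitening-equation forecast), not the authors' code; the paper combines such forecasts with Shewhart/CUSUM limits.

```python
import math

def gm11_forecast(x0, steps=1):
    """Fit GM(1,1) to a short positive series x0 and forecast `steps` ahead."""
    n = len(x0)
    # 1-AGO: accumulated generating operation
    x1 = [sum(x0[:i + 1]) for i in range(n)]
    # Background values: means of consecutive accumulated points
    z1 = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]
    y = x0[1:]
    m = n - 1
    # Least squares for the grey equation x0[k] = -a*z1[k] + b
    sz, sy = sum(z1), sum(y)
    szz = sum(z * z for z in z1)
    szy = sum(z * v for z, v in zip(z1, y))
    det = szz * m - sz * sz
    a = (sz * sy - m * szy) / det
    b = (szz * sy - sz * szy) / det
    # Whitening-equation solution, then inverse AGO to recover x0 forecasts
    preds = []
    prev = (x0[0] - b / a) * math.exp(-a * (n - 1)) + b / a
    for k in range(n, n + steps):
        cur = (x0[0] - b / a) * math.exp(-a * k) + b / a
        preds.append(cur - prev)
        prev = cur
    return preds
```

    Because GM(1,1) models an exponential trend, it tracks short monotone series well from very few points, which is what makes it attractive when the large Phase I samples needed by classical charts are unavailable.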

    A comparison study of distribution-free multivariate SPC methods for multimode data

    Get PDF
    The data-rich environments of industrial applications lead to large numbers of correlated quality characteristics that are monitored using Multivariate Statistical Process Control (MSPC) tools. These variables usually represent heterogeneous quantities that originate from one or multiple sensors and are acquired with different sampling parameters. In this framework, any assumption about the underlying statistical distribution may not be appropriate, and conventional MSPC methods may deliver unacceptable performance. In addition, in many practical applications, the process switches from one operating mode to another, leading to a stream of multimode data. Various nonparametric approaches have been proposed for the design of multivariate control charts, but the monitoring of multimode processes remains a challenge for most of them. In this study, we investigate the use of distribution-free MSPC methods based on statistical learning tools. We compared the kernel distance-based control chart (K-chart), based on a one-class-classification variant of support vector machines, with a fuzzy neural network method based on adaptive resonance theory. The performance of the two methods was evaluated using both Monte Carlo simulations and real industrial data. The simulated scenarios include different types of out-of-control conditions to highlight the advantages and disadvantages of the two methods. Real data acquired during a roll grinding process provide a framework for assessing the practical applicability of these methods in multimode industrial applications
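
    A distribution-free kernel monitoring statistic of the kind the K-chart builds on can be sketched as follows. Note this is a simplified stand-in: the actual K-chart trains a one-class SVM (SVDD) boundary, whereas this sketch monitors the squared RKHS distance to the kernel mean of an in-control reference sample, with an empirical quantile as the control limit.

```python
import math
import random

def rbf(u, v, gamma=0.5):
    """Gaussian (RBF) kernel between two feature vectors."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(u, v)))

def kernel_distance_chart(reference, gamma=0.5, quantile=0.99):
    """Return a monitoring statistic and an empirical control limit."""
    n = len(reference)
    gram_mean = sum(rbf(p, q, gamma) for p in reference for q in reference) / n ** 2
    def stat(x):
        # Squared RKHS distance from x to the mean of the reference sample
        return rbf(x, x, gamma) - 2 * sum(rbf(x, p, gamma) for p in reference) / n + gram_mean
    # Control limit: empirical quantile of the reference statistics
    limit = sorted(stat(p) for p in reference)[int(quantile * (n - 1))]
    return stat, limit
```

    No distributional assumption enters anywhere: both the statistic and the limit come from the reference data alone, which is the property that makes such charts attractive for multimode streams.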

    Data-driven Soft Sensors in the Process Industry

    Get PDF
    In the last two decades, Soft Sensors have established themselves as a valuable alternative to traditional means for acquiring critical process variables, for process monitoring, and for other tasks related to process control. This paper discusses characteristics of process industry data that are critical for the development of data-driven Soft Sensors. These characteristics are common to a large number of process industry fields, such as the chemical, bioprocess, and steel industries. The focus of this work is on data-driven Soft Sensors because of their growing popularity, demonstrated usefulness, and huge, though not yet completely realised, potential. The main contributions of this work are a comprehensive selection of case studies covering the three most important Soft Sensor application fields, a general introduction to the most popular Soft Sensor modelling techniques, and a discussion of some open issues in Soft Sensor development and maintenance together with their possible solutions
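
    The core idea of a data-driven soft sensor can be sketched in a few lines. The variables below are hypothetical, not from the paper: a hard-to-measure quality variable y is inferred from two easy-to-measure process variables by least squares; real soft sensors typically use PLS, neural networks, or the other techniques the paper surveys.

```python
def fit_soft_sensor(X, y):
    """Least-squares fit of y ~ w1*x1 + w2*x2 (no intercept, for brevity)."""
    # 2x2 normal equations, solved in closed form
    s11 = sum(a * a for a, _ in X)
    s22 = sum(b * b for _, b in X)
    s12 = sum(a * b for a, b in X)
    s1y = sum(a * t for (a, _), t in zip(X, y))
    s2y = sum(b * t for (_, b), t in zip(X, y))
    det = s11 * s22 - s12 ** 2
    w1 = (s22 * s1y - s12 * s2y) / det
    w2 = (s11 * s2y - s12 * s1y) / det
    # The fitted model is the "soft sensor": a cheap online predictor of y
    return lambda a, b: w1 * a + w2 * b
```

    Once fitted offline on historical data, the returned predictor can run online at the sampling rate of the cheap sensors, which is precisely the role a soft sensor plays for infrequently measured quality variables.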

    Cloud cover determination in polar regions from satellite imagery

    Get PDF
    This work defines the spectral and spatial characteristics of clouds and surface conditions in the polar regions and creates calibrated, geometrically correct data sets suitable for quantitative analysis. Ways are explored in which this information can be applied to cloud classification, either as new methods or as extensions to existing classification schemes. A methodology is developed that uses automated techniques to merge Advanced Very High Resolution Radiometer (AVHRR) and Scanning Multichannel Microwave Radiometer (SMMR) data, and to apply first-order calibration and zenith-angle corrections to the AVHRR imagery. Cloud cover and surface types are manually interpreted, and manual methods are used to define relatively pure training areas that describe the textural and multispectral characteristics of clouds over several surface conditions. The effects of viewing angle and bidirectional reflectance differences are studied for several classes, and the effectiveness of some key components of existing classification schemes is tested

    Multivariate Control Chart based on Neutrosophic Hotelling T2 Statistics and Its Application

    Get PDF
    Under classical statistics, the Hotelling T^2 control chart is applied when the observations of quality characteristics are precise, exact, crisp data. In reality, however, under uncertain conditions the observations are not necessarily precise or exact and may be indeterminate. As a consequence, the classical Hotelling T^2 control chart is not appropriate for monitoring the process in this situation. To tackle it, we propose a new Hotelling T^2 monitoring scheme based on the fuzzy neutrosophic concept. Neutrosophy is a generalization of fuzzy logic; it handles uncertainty using indeterminacy. A combination of statistics based on the neutrosophic Hotelling T^2 and the classical Hotelling T^2 control chart is proposed to handle indeterminate observations. The proposed scheme, called the neutrosophic Hotelling T^2 (T_N^2) control chart, involves indeterminate (neutrosophic) observations expressed as indeterminacy intervals. The T_N^2 control chart consists of a T_N^2 lower chart and a T_N^2 upper chart. In this paper, the neutrosophic Hotelling T^2 is applied to individual observations from glass production and compared with the classical Hotelling T^2 control chart. Based on the T_N^2 control charts of the glass production data, nine points fall outside the UCL_N of the lower chart and 24 points outside the UCL_N of the upper chart, whereas with the classical Hotelling T^2 control chart just one point falls outside the UCL. From this comparison, it is concluded that the neutrosophic Hotelling T^2 control chart is more suitable for indeterminate observations
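
    For reference, the classical (crisp-data) Hotelling T^2 statistic for individual bivariate observations can be sketched as below; the paper's neutrosophic version applies this machinery to the endpoints of indeterminacy intervals, yielding the lower and upper charts. This is an illustrative sketch, not the authors' implementation, and it omits the control-limit computation.

```python
def hotelling_t2(data):
    """Return a T^2 statistic fitted to bivariate individual observations."""
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    # Sample covariance matrix S (2x2)
    sxx = sum((x - mx) ** 2 for x, _ in data) / (n - 1)
    syy = sum((y - my) ** 2 for _, y in data) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in data) / (n - 1)
    det = sxx * syy - sxy ** 2
    def t2(point):
        dx, dy = point[0] - mx, point[1] - my
        # d' S^{-1} d with the 2x2 inverse written out explicitly
        return (syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det
    return t2
```

    The statistic is zero at the sample mean and grows with Mahalanobis distance, so points far from the in-control cloud (in the metric of the covariance) are the ones that exceed the UCL.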

    Fuzzy x

    Get PDF

    Digital Image-Based Frameworks for Monitoring and Controlling of Particulate Systems

    Get PDF
    Particulate processes are widely involved in various industries, and most products in the chemical industry today are manufactured as particulates. Previous research and practice illustrate that final product quality can be influenced by particle properties such as size and shape, which are related to operating conditions. Online characterization of these particles is an important step in maintaining the desired product quality in particulate processes. Image-based characterization for monitoring and controlling particulate processes is very promising and attractive. The development of a digital image-based framework, in the context of this research, can be envisioned in two parts. One is performing image analysis and designing advanced algorithms for segmentation and texture analysis. The other is formulating and implementing modern predictive tools to establish correlations between texture features and particle characteristics. According to the extent of touching and overlapping between particles in images, two image analysis methods were developed and tested. For slight touching problems, image segmentation algorithms were developed by introducing Wavelet Transform de-noising and Fuzzy C-means Clustering to detect the touching regions, and by adopting the intensity and geometry characteristics of the touching areas. Since individual particles can be identified through image segmentation, particle number, particle equivalent diameter, and size distribution were used as the features. For severe touching and overlapping problems, texture analysis was carried out through the estimation of the wavelet energy signature and the fractal dimension, based on wavelet decomposition of the objects. Predictive models for monitoring and control of particulate processes were formulated and implemented. 
Building on the feature extraction properties of the wavelet decomposition, a projection technique, principal component analysis (PCA), was used to detect off-specification conditions in which the particle mean size deviates from the target value. Furthermore, linear and nonlinear predictive models based on partial least squares (PLS) and artificial neural networks (ANN) were formulated, implemented and tested on an experimental facility to predict particle characteristics (mean size and standard deviation) from the image texture analysis
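
    The PCA-based detection step can be sketched as follows: project each feature vector onto the leading principal component of in-control data and flag samples whose squared residual (the SPE/Q statistic) is large. This is a hedged, minimal sketch of the generic technique, not the thesis's implementation, which works on wavelet texture features.

```python
import math

def pca_spe_monitor(features):
    """Fit a one-component PCA and return an SPE (squared residual) statistic."""
    n = len(features)
    mean = [sum(col) / n for col in zip(*features)]
    centred = [[x - m for x, m in zip(row, mean)] for row in features]
    d = len(mean)
    cov = [[sum(r[i] * r[j] for r in centred) / (n - 1) for j in range(d)]
           for i in range(d)]
    # Leading eigenvector of the covariance matrix via power iteration
    v = [1.0] + [0.0] * (d - 1)
    for _ in range(100):
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    def spe(sample):
        c = [x - m for x, m in zip(sample, mean)]
        score = sum(ci * vi for ci, vi in zip(c, v))
        # Residual after reconstructing from the one-component model
        return sum((ci - score * vi) ** 2 for ci, vi in zip(c, v))
    return spe
```

    Samples consistent with the in-control correlation structure reconstruct almost perfectly (SPE near zero), while off-specification conditions break that structure and inflate the residual.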

    Control, optimization and monitoring of Portland cement (Pc 42.5) quality at the ball mill

    Get PDF
    Thesis (Master)--Izmir Institute of Technology, Chemical Engineering, Izmir, 2006. Includes bibliographical references (leaves: 77-78). Text in English; Abstract: Turkish and English. xi, 89 leaves. In this study, artificial neural network (ANN) and fuzzy logic models were developed to model the relationship among cement mill operational parameters. The response variable was the weight percentage of product residue on a 32-micrometer sieve (fineness), while the input parameters were revolution percent, falofon percentage, and the elevator amperage (amps), which reflects the elevator charge to the separator. The process data, collected from a local plant, Cimenta Cement Factory, in 2004, were used in model construction and testing. First, an ANN model was constructed: a feed-forward network with one input layer including the 3 input parameters, two hidden layers, and one output layer with the residue percentage on the 32-micrometer sieve as the output parameter. After testing, the model's ability to predict the residue on the 32-micrometer sieve (fineness) was found to be successful (correlation coefficient 0.92). By detailed analysis of the ANN model's contour plots, a Mamdani-type fuzzy rule set was created for the fuzzy model in MatLAB. With three parameters at three levels each, there were three to the third power (27) rules. In this study, we constructed a mix of Z-type, S-type and Gaussian membership functions for the input parameters and the response. With the help of the MatLAB fuzzy toolbox, the residue percentage on the 32-micrometer sieve (fineness) was predicted; this model was found to have a correlation coefficient of 0.76. The utility of the ANN and fuzzy models created in this study lies in the potential ability of process engineers to control processing parameters to accomplish the desired cement fineness levels. 
In the second part of the study, a quantitative procedure for monitoring and evaluating cement milling process performance was described. Control charts such as CUSUM (Cumulative Sum) and EWMA (Exponentially Weighted Moving Average) charts were used to monitor the cement fineness using historical data. As a result, it was found that CUSUM and EWMA control charts can easily be used in cement milling process monitoring to detect small shifts in 32-micrometer fineness (percentage by weight) within shorter sampling intervals
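
    The small-shift detection idea can be sketched with a standard two-sided tabular CUSUM. The target, reference value k and decision limit h below are illustrative choices for a synthetic fineness stream, not the thesis's tuned parameters.

```python
def cusum_alarms(samples, mu0, k, h):
    """Return sample indices where the upper or lower CUSUM exceeds h."""
    cp = cm = 0.0
    alarms = []
    for i, x in enumerate(samples):
        cp = max(0.0, cp + (x - mu0) - k)  # accumulates upward shifts
        cm = max(0.0, cm + (mu0 - x) - k)  # accumulates downward shifts
        if cp > h or cm > h:
            alarms.append(i)
    return alarms
```

    Because the statistics accumulate small deviations instead of judging each sample in isolation, a sustained shift well inside the Shewhart limits still triggers an alarm after a few samples, which is exactly the behaviour exploited for fineness monitoring; the EWMA chart achieves a similar effect by exponentially weighting past observations.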

    Evolution of statistical analysis in empirical software engineering research: Current state and steps forward

    Full text link
    Software engineering research is evolving, and papers are increasingly based on empirical data from a multitude of sources, using statistical tests to determine whether and to what degree empirical evidence supports their hypotheses. To investigate the practices and trends of statistical analysis in empirical software engineering (ESE), this paper presents a review of a large pool of papers from top-ranked software engineering journals. First, we manually reviewed 161 papers; in the second phase of our method, we conducted a more extensive semi-automatic classification of 5,196 papers spanning the years 2001--2015. Results from both review steps were used to: i) identify and analyze the predominant practices in ESE (e.g., using the t-test or ANOVA), as well as relevant trends in the usage of specific statistical methods (e.g., nonparametric tests and effect size measures), and ii) develop a conceptual model for a statistical analysis workflow with suggestions on how to apply different statistical methods as well as guidelines to avoid pitfalls. Lastly, we confirm existing claims that current ESE practices lack a standard for reporting the practical significance of results. We illustrate how practical significance can be discussed in terms of both the statistical analysis and the practitioner's context. Comment: journal submission, 34 pages, 8 figures
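
    One of the effect-size measures such reviews track, Cohen's d, is simple enough to sketch directly; reporting it alongside a p-value is one common way to express the practical significance the paper argues is under-reported. The data below are made up for illustration.

```python
import math
import statistics

def cohens_d(a, b):
    """Cohen's d for two independent samples, pooled-SD formulation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * statistics.variance(a) +
                  (nb - 1) * statistics.variance(b)) / (na + nb - 2)
    return (statistics.mean(a) - statistics.mean(b)) / math.sqrt(pooled_var)
```

    Unlike a p-value, d does not shrink toward significance as the sample grows: it expresses the mean difference in standard-deviation units, so a reader can judge whether an effect matters in practice.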