
    Energy management in the formation of light, starter, and ignition lead-acid batteries

    This paper discusses energy management in the formation process of lead-acid batteries. Battery production and electricity consumption during battery formation in a battery plant were analyzed over a 4-year period. The main parameters affecting the energy performance of battery production were identified, and different actions to improve it were proposed. Furthermore, an Energy Performance Indicator (EnPI), based on the electricity consumption of battery formation (a difficult and rather expensive parameter to measure), is introduced to assess its energy efficiency. To this end, a Soft Sensor that measures the electricity consumption in real time (based on the voltage and current measured during battery formation) and calculates the EnPI is developed. Moreover, Energy Management (EM), aided by the use of energy baselines and control charts, is implemented to assess the energy performance of battery formation, allowing rapid corrective actions towards higher efficiency standards. This resulted, on average, in a 4.3% reduction of the electricity consumption in battery formation.
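    At its core, the soft-sensor idea above reduces to integrating sampled voltage and current into consumed energy and normalising by output. A minimal sketch (the function names and the per-battery normalisation are illustrative assumptions, not the paper's implementation):

```python
def formation_energy_kwh(voltages, currents, dt_s):
    """Approximate electrical energy (kWh) absorbed during formation from
    pack voltage (V) and current (A) sampled at a fixed interval dt_s.
    Trapezoidal integration of the instantaneous power v * i."""
    powers = [v * i for v, i in zip(voltages, currents)]
    joules = sum((powers[k] + powers[k + 1]) / 2 * dt_s
                 for k in range(len(powers) - 1))
    return joules / 3.6e6  # J -> kWh

def enpi_kwh_per_battery(energy_kwh, batteries_formed):
    """Energy Performance Indicator: formation energy per battery produced."""
    return energy_kwh / batteries_formed
```

With an EnPI computed per formation batch, an energy baseline and a control chart can then flag batches whose consumption drifts above the expected level, triggering the rapid corrective actions the paper describes.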

    A study on statistical process control (SPC) in pharmaceutical contract manufacturing : potential determinants of SPC implementation success : a thesis presented in partial fulfilment of the requirements for the degree of Master of Quality Systems at Massey University, Palmerston North, Manawatu, New Zealand

    Figures 2.8 (= Bou-Llusar et al., 2009, Fig. 1) and 2.9 (= Jayamaha et al., 2009, Fig. 1) have been removed for copyright reasons. Many organisations in New Zealand are still unfamiliar with Statistical Process Control (SPC) or how to implement it successfully, even though SPC has been widely used in other countries, such as Japan, with great success. The potential of SPC has been underestimated in most cases, especially in the pharmaceutical industry, which shares some common ground with SPC (e.g. measurement and analysis). It is not just control charts that make SPC successful: a control chart is simply a tool (and a useful one at that) used in the practice of SPC; there are also critical factors, or enablers, of successful SPC implementation. The context of this study is pharmaceutical contract manufacturing (PCM). The aims of this research are to identify a suitable SPC programme for PCM companies, to identify enablers of successful SPC implementation, and to understand how these enablers cause quality improvement. A single case study was designed to verify the applicability of SPC. The research confirmed the suitability of a specific six-step SPC implementation approach mentioned in the literature. SPC was found to be a suitable technique for identifying and understanding process variation in PCM. The case study findings are useful to quality practitioners as well as to PCM companies contemplating implementing SPC. Regarding theoretical contributions, the researcher developed two theoretical models from the extant literature, and both models were empirically tested using survey data collected from 76 respondents and the state of the art in multivariate latent variable path modelling. The model test results showed that top management commitment has both a direct effect and an indirect effect on quality performance through other SPC/TQM enablers. The results also showed that the soft factors of TQM/SPC are significantly more influential than the hard factors in achieving quality performance. While this is nothing new, what is novel in this research is the way the researcher modelled the hard and soft factors of TQM/SPC to estimate how much more important the soft factors are than the hard factors in achieving quality performance. The limitations of the study and suggestions for future research are provided at the end of this thesis (Chapter 6).

    Short Production Run Control Charts to Monitor Process Variances

    The control chart is one of the most commonly used statistical tools for quality control and improvement. If the process mean and standard deviation are not given or are unknown, most Shewhart control charts require sufficient sample data before the control chart can be established. However, in certain industries or processes, it may not be practical to collect an adequate amount of data at the beginning of the manufacturing process to build the trial control chart in Phase I. For quality improvement in such processes, some authors have developed self-starting control charts for short-run production, e.g. the t chart, the Q chart, and the EWMA and CUSUM versions of the t chart and Q chart. This thesis studies the performance of some short-run control charts for monitoring process variances. Numerical simulations are used in this study, and the results of the numerical experiments are examined extensively for different combinations of process lengths and starting points of process shifts.
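    The self-starting idea can be illustrated on the process mean (the variance-monitoring charts studied in the thesis use analogous running statistics): each new observation is standardised against the running mean and standard deviation of all previous observations, so no Phase I calibration sample is needed. A hedged sketch of this Quesenberry-style statistic:

```python
import math

def self_starting_t(observations):
    """Self-starting t statistics: each point is standardized against the
    running mean/std of all *previous* points, so the chart starts without
    a Phase I sample. Statistics are defined from the 3rd observation on,
    once a variance estimate exists."""
    stats = []
    for i in range(2, len(observations)):
        prev = observations[:i]
        n = len(prev)
        mean = sum(prev) / n
        var = sum((x - mean) ** 2 for x in prev) / (n - 1)
        # Standardize the new point; the factor (1 + 1/n) accounts for the
        # estimation error in the running mean.
        t = (observations[i] - mean) / math.sqrt(var * (1 + 1 / n))
        stats.append(t)
    return stats
```

Under in-control normality each statistic follows a t distribution, so control limits can be set from t quantiles rather than from a trial control chart.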

    The use of statistics in understanding pharmaceutical manufacturing processes

    D.Eng. Industrial manufacturing processes for pharmaceutical products require a high level of understanding and control to demonstrate that the final product will be of the required quality to be taken by the patient. A large amount of data is typically collected throughout manufacture from sensors located around reaction vessels. These data have the potential to provide a significant amount of information about the variation inherent in the process and how it impacts product quality. However, to make use of the data, appropriate statistical methods are required to extract the information they contain. Industrial process data present a number of challenges, including large quantities, variable sampling rates, process noise, and non-linear relationships. The aim of this thesis is to investigate, develop, and apply statistical methodologies to data collected from the manufacture of active pharmaceutical ingredients (APIs), to increase the level of process and product understanding and to identify potential areas for improvement. Individual case studies of investigations into API manufacture are presented. The first considers prediction methods to estimate the drying times of a batch process using data collected early in the process. Good predictions were achieved by selecting a small number of variables as inputs, rather than data collected throughout the process. A further study considers the particle size distribution (PSD) of a product. Multivariate analysis techniques proved efficient at summarising the PSD data, providing an understanding of the sources of variation and highlighting the difference between two processing plants. Process capability indices (PCIs) are an informative tool to estimate the risk of a process failing a specification limit. PCIs are assessed and developed for application to data that do not follow a normal distribution. Calculating the capability from the percentiles of the data, or from the proportion of data outside the specification limits, has the potential to generate information about the capability of the process. Finally, the application of Bayesian statistical methods in pharmaceutical process development is investigated, including experimental design, process validation, and process capability. A novel Bayesian method is developed to sequentially calculate the process capability when data are collected in blocks over time, thereby reducing the level of noise caused by small sample sizes.
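    The percentile route to capability for non-normal data can be sketched as follows: the conventional 6-sigma spread is replaced by the empirical 0.135th-99.865th percentile range, the interval that covers the same 99.73% of the data that ±3 sigma covers under normality. This is a generic Clements-style illustration, not necessarily the thesis's exact formulation:

```python
def percentile_cpk(data, lsl, usl):
    """Capability for non-normal data: replace the 3-sigma half-spreads with
    the distances from the median to the empirical 0.135th and 99.865th
    percentiles. Percentiles use linear interpolation between order stats."""
    xs = sorted(data)
    n = len(xs)

    def pct(p):  # p in [0, 100]
        k = (n - 1) * p / 100
        lo, hi = int(k), min(int(k) + 1, n - 1)
        return xs[lo] + (k - lo) * (xs[hi] - xs[lo])

    p_lo, p_med, p_hi = pct(0.135), pct(50), pct(99.865)
    cpu = (usl - p_med) / (p_hi - p_med)  # upper one-sided index
    cpl = (p_med - lsl) / (p_med - p_lo)  # lower one-sided index
    return min(cpu, cpl)
```

For normal data this reduces to the usual Cpk; for skewed data it avoids the bias that a sigma-based index would introduce.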

    Model-Based State Estimation for Fault Detection under Disturbance

    The measurement of process states is critical for process monitoring, advanced process control, and process optimization. For chemical processes where state information cannot be measured directly, techniques such as state estimation need to be developed. Model-based state estimation is one of the most widely applied methods for estimating unmeasured states based on a high-fidelity process model. However, certain disturbances or unknown inputs not considered by process models will generate model-plant mismatch. In this dissertation, different model-based process monitoring techniques are developed and applied for state estimation under uncertainty and disturbance, and case studies are performed to demonstrate the proposed methods. The first case study estimates the location of a leak in a natural gas pipeline. Non-isothermal state equations are derived for natural gas pipeline flow processes, and a dual unscented Kalman filter is used for parameter and flow rate estimation. To deal with sudden process disturbances in the pipeline, an unknown input observer is designed: a linear unknown input observer with time delays that treats changes of temperature and pressure as unknown inputs and includes measurement noise in the process. A simulation of a natural gas pipeline with time-variant consumer usage is performed. A new optimization method for detecting simultaneous multiple leaks from a natural gas pipeline is also demonstrated, in which leak locations are estimated by solving a global optimization problem with constraints of linear and partial differential equations and integer and continuous variables; an adaptive discretization approach is designed to search for the leak locations. In a subsequent case study, a new design of a nonlinear unknown input observer is proposed and applied to estimate states in a bioreactor. The design of the observer is provided, and sufficient and necessary conditions for it are discussed. Experimental studies of batch and fed-batch operation of a bioreactor are performed using the Saccharomyces cerevisiae mutant strain SM14 to produce β-carotene. The state estimation of the process from the designed observer is demonstrated to alleviate the model-plant mismatch and is compared to the experimental measurements.
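    At its simplest, model-based state estimation blends a model prediction with each noisy measurement. The dissertation uses a dual unscented Kalman filter and unknown input observers; the scalar linear Kalman filter below is only a minimal illustration of the underlying predict-update cycle, not the dissertation's estimator:

```python
def kalman_1d(measurements, x0, p0, q, r):
    """Scalar Kalman filter for a constant-state model x_k = x_{k-1} + w_k:
    each step predicts from the model, then corrects with the measurement.
    q and r are the process and measurement noise variances."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                # predict: uncertainty grows by process noise
        k = p / (p + r)          # Kalman gain: trust in the new measurement
        x = x + k * (z - x)      # update with the measurement innovation
        p = (1 - k) * p          # posterior uncertainty shrinks
        estimates.append(x)
    return estimates
```

With repeated measurements of a constant state, the estimate converges toward the true value while the gain decays, which is the behaviour the higher-dimensional (unscented, unknown-input) variants generalise to nonlinear models with disturbances.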

    Plantwide simulation and monitoring of offshore oil and gas production facility

    Monitoring is one of the major concerns in an offshore oil and gas production platform, since access to offshore facilities is difficult. It is also quite challenging to extract oil and gas safely in such a harsh environment, and any abnormality may lead to a catastrophic event. Process data, covering all possible faulty scenarios, are required to build an appropriate monitoring system. Since plant-wide process data are not available in the literature, a dynamic model and simulation of an offshore oil and gas production platform were developed using Aspen HYSYS; modeling and simulation are handy tools for designing a production plant and accurately predicting its behavior. The model was built based on the gas processing plant at the North Sea platform reported in Voldsund et al. (2013). Several common faults from different fault categories were simulated in the dynamic system, and their impacts on the overall hydrocarbon production were analyzed. The simulated data were then used to build a monitoring system for each of the faulty states. A new monitoring method is proposed that combines Principal Component Analysis (PCA) and Dynamic PCA (DPCA) with an Artificial Neural Network (ANN). The direct application of an ANN to process systems is difficult, as it requires a very large number of input neurons to model the system; training such a large-scale network is time-consuming and provides poor accuracy with a high error rate. In the PCA-ANN and DPCA-ANN monitoring systems, PCA and DPCA are used to reduce the dimension of the training data set and extract the main features of the measured variables. Subsequently, the ANN uses these lower-dimensional score vectors to build a training model and classify the abnormalities. It is found that the proposed approach reduces the time needed to train the ANN and successfully detects, diagnoses, and classifies the faults with a high accuracy rate.
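    The dimension-reduction step of such a PCA-ANN scheme can be sketched with a plain SVD-based PCA: the low-dimensional score vectors, rather than the full sensor vector, are what the classifier would be trained on. A generic sketch (not the thesis's implementation; the random data stand in for simulated process measurements):

```python
import numpy as np

def pca_scores(X, n_components):
    """Project data onto its leading principal components via SVD.
    Rows of X are samples, columns are measured process variables; the
    returned score vectors are the reduced inputs for a downstream
    neural-network classifier."""
    Xc = X - X.mean(axis=0)              # center each measured variable
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T      # scores on the leading components

# Stand-in for a (samples x sensors) block of simulated process data.
X = np.random.default_rng(0).normal(size=(100, 20))
scores = pca_scores(X, 3)                # 20 sensors -> 3 score variables
```

Dynamic PCA differs only in that lagged copies of each variable are appended as extra columns of X before the decomposition, so temporal correlations are captured in the same way.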

    Novas Abordagens do Controlo Estatístico do Processo: Carta ln(S2), Capacidade do Processo e Cartas Conjuntas [New Approaches to Statistical Process Control: the ln(S2) Chart, Process Capability, and Joint Charts]

    The evolution of the market, society, and industry means that, now more than ever, companies must use every tool at their disposal to remain competitive. To guarantee competitiveness, organisations must carry out quality control of their products and processes. Quality control has evolved over time: Statistical Process Control (SPC) emerged in the 20th century through the work of W. Shewhart, and aims to control quality characteristics, monitoring the process and ensuring production within the established technical specifications. To implement the traditional Shewhart charts it is necessary to ensure, among other assumptions, that the data follow a Normal distribution. In the control of process dispersion this assumption is violated, which can bias the results and the conclusions drawn from their analysis. Accordingly, this dissertation aims to verify the non-Normality of the traditional dispersion control charts (R and S), and to determine and validate the parameters of the ln(S2) control chart suggested by Pacheco (2019). The study of process capability is carried out at the end of Phase 1 of SPC to ensure that the process can consistently produce according to a pre-defined specification. During Phase 2 of SPC, however, the study of process capability has no well-defined methodology, and the frequency with which it is performed depends on the engineer responsible for statistical control. If the interval between capability studies is too long, the process may lose capability and, as a consequence, become unable to produce consistently within the imposed requirements, incurring unnecessary losses for the organisation. To address this problem, two methodologies for studying process capability in real time during Phase 2 of SPC are proposed in this dissertation. The first is based on the principles used in capability studies for Short Run charts, modifying the traditional indices to create the IU and IL indices, based on the indices () and (). The second is based on a PCIRUN chart, in which confidence intervals of the traditional capability indices are calculated; depending on the location of the limits of these confidence intervals, it is possible to assess whether the process is capable and, if so, to study its performance (whether it is producing in a statistically centred way with respect to the requirements). The last topic addressed in this dissertation is the development of joint control charts that simultaneously perform statistical control of the process location and dispersion parameters. This methodology adds to the tools available for statistical process control and facilitates the implementation of traditional SPC in organisations, since it requires not two control charts but only one, which statistically controls both process parameters.
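    A joint location/dispersion chart of the kind described can be illustrated with a generic max-type statistic, which standardises the sample mean and the sample variance and plots the larger absolute value on a single chart with one upper control limit. This is a common construction from the literature, not necessarily the dissertation's exact chart:

```python
import math

def max_joint_statistic(sample, mu0, sigma0):
    """Max-type joint statistic for one rational subgroup: standardize the
    sample mean, and (via the Wilson-Hilferty normal approximation to the
    chi-square) the sample variance, then plot M = max(|Z_mean|, |Z_var|).
    A single upper limit then signals shifts in either parameter."""
    n = len(sample)
    xbar = sum(sample) / n
    s2 = sum((x - xbar) ** 2 for x in sample) / (n - 1)
    z_mean = (xbar - mu0) / (sigma0 / math.sqrt(n))
    # (n-1)*s2/sigma0^2 ~ chi2(n-1); Wilson-Hilferty maps it to ~N(0,1).
    c = s2 / sigma0 ** 2
    z_var = (c ** (1 / 3) - (1 - 2 / (9 * (n - 1)))) / math.sqrt(2 / (9 * (n - 1)))
    return max(abs(z_mean), abs(z_var))
```

Because both components are (approximately) standard normal in control, one chart with a single limit replaces the usual X-bar/S pair, which is the practical appeal of joint charts noted in the abstract.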

    Develop a Lean project management framework for the construction companies in order to improve the time and cost efficiencies of their construction operations

    This research explores how Lean management principles, tools, and techniques could be used in conjunction with project management theories to develop a Lean project management framework for Saudi construction firms in order to improve the time and cost efficiencies of their operations. It critically evaluates the applicability of the Lean philosophy in the construction industry. It also investigates the management implications of traditional project management practices and the performance-improvement potential of pro-Lean practices. It examines how the adoption of Lean's Kanban can help firms to address the problem of time and cost overruns. It uses the Simio 10 computer simulation software to simulate the impact of Kanban adoption on the time and cost efficiencies of construction activities using 54 real-life scenarios. It identifies the key drivers, enablers, and barriers to the adoption of Kanban in the Saudi construction industry using semi-structured interviews and web-based survey questionnaires. This research has found that construction firms in Saudi Arabia are struggling with the management of delays and that their activities are characterised by cost and schedule overruns. Saudi Arabia has, for many years, invested billions of pounds in infrastructure projects, particularly in transport initiatives such as new roads, ports, and bridges. However, not many of these initiatives have been completed on time or on budget. In fact, delays and cost overruns are so common in the country that they have come to be seen as the norm and as a reality that project managers should just accept. This research has also found that the principles of 'Lean Construction', and Kanban's six rules in particular, offer a viable solution to many of the current problems of Saudi construction firms. Kanban enables construction companies to switch from push-based systems to pull-based systems, which involve real-time monitoring of consumption and demand-triggered replenishment. Kanban adoption also ensures that the capacities of upstream and downstream processes are perfectly aligned, which helps firms to reduce bottlenecks, avoid backlogs, and fine-tune their processes and construction activities. Moreover, this research has found that the success of Lean construction depends heavily on senior management's commitment, and also on staff training and understanding of the technical requirements of Lean systems. In fact, lack of commitment and senior managers' short-sighted investment policies are identified as the two most significant barriers to the adoption and operationalisation of Lean construction principles. Ministry of Education in Saudi Arabia.
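    The pull-based, demand-triggered replenishment that Kanban introduces can be illustrated with a toy simulation loop (the research itself used Simio 10; the sketch below is purely conceptual, with hypothetical parameters):

```python
def simulate_pull_line(demands, n_cards, bin_size, lead_time=1):
    """Toy Kanban pull loop: a downstream station consumes from a stock of
    n_cards bins; each emptied bin frees a card that triggers replenishment
    of one bin after lead_time periods. Replenishment is demand-triggered,
    so work-in-progress never exceeds n_cards * bin_size.
    Returns (units served, units lost for lack of stock)."""
    stock = n_cards * bin_size
    pipeline = [0] * lead_time                 # bins arriving in future periods
    served = lost = 0
    for d in demands:
        stock += pipeline.pop(0) * bin_size    # replenishment arrives
        use = min(d, stock)
        served += use
        lost += d - use
        stock -= use
        # Cards freed = capacity not currently held as stock or in transit.
        freed = (n_cards * bin_size - stock - sum(pipeline) * bin_size) // bin_size
        pipeline.append(freed)
    return served, lost
```

The cap on work-in-progress is the point of contrast with a push system, where upstream stations produce to forecast regardless of downstream consumption, accumulating the backlogs the research associates with schedule overruns.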

    Events Recognition System for Water Treatment Works

    The supply of drinking water in sufficient quantity and of the required quality is a challenging task for water companies, and tackling it successfully depends largely on ensuring a continuously high quality of water treatment at Water Treatment Works (WTWs). Processes at WTWs are therefore highly automated and controlled, and the reliable and rapid detection of faulty sensor data and failure events in WTW processes is of prime importance for efficient and effective operation. The vast majority of WTWs operated in the UK make use of event detection systems that automatically generate alarms after detecting abnormal behaviour in the observed signals, to ensure early detection of WTW process failures. Event detection systems usually deployed at WTWs apply thresholds to the monitored signals to recognise faulty WTW processes. The research work described in this thesis investigates new methods for near real-time event detection at WTWs, implementing statistical process control and machine learning techniques for the automated near real-time recognition of failure events in WTW processes. The resulting novel Hybrid CUSUM Event Recognition System (HC-ERS) makes use of new online sensor data validation and pre-processing techniques and utilises two distinct detection methodologies: the first for fault detection on individual signals, and the second for the recognition of faulty processes and events at WTWs. The fault detection methodology automatically detects abnormal behaviour of observed water quality parameters in near real time, using the online-validated and pre-processed data of the corresponding sensors. It utilises CUSUM control charts to predict the presence of faults, tracking the variation of each signal individually to identify abnormal shifts in its mean. The basic CUSUM methodology was refined by investigating optimised interdependent parameters for each signal individually. The combined predictions of CUSUM fault detection on individual signals serve as the basis for the second, event detection methodology, which automatically identifies faults in WTW processes, i.e. failure events at WTWs, in near real time, utilising the faults previously detected by CUSUM on individual signals. This method applies Random Forest classifiers to predict the presence of an event in WTW processes. All methods have been developed to be generic and to generalise well across different drinking water treatment processes. HC-ERS proved effective in the detection of failure events, as demonstrated by its application to real water quality signals with historical events from a UK WTW. The methodology achieved a peak F1 value of 0.84 and generates 0.3 false alarms per week. These results demonstrate the ability of the method to automatically and reliably detect failure events in WTW processes in near real time, and show promise for the practical application of HC-ERS in industry. The combination of both methodologies presents a unique contribution to the field of near real-time event detection at WTWs.
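    The per-signal fault detection step can be sketched with the standard two-sided tabular CUSUM: deviations beyond a slack k from the in-control mean accumulate, and an alarm is raised when either cumulative sum crosses the decision limit h (the thesis additionally tunes such parameters per signal). A generic sketch, not the HC-ERS implementation:

```python
def cusum_alarms(signal, mu0, k, h):
    """Two-sided tabular CUSUM: accumulate deviations beyond a slack k
    from the in-control mean mu0 and record the sample index whenever
    either one-sided cumulative sum exceeds the decision limit h."""
    c_pos = c_neg = 0.0
    alarms = []
    for i, x in enumerate(signal):
        c_pos = max(0.0, c_pos + (x - mu0 - k))  # upward-shift accumulator
        c_neg = max(0.0, c_neg + (mu0 - x - k))  # downward-shift accumulator
        if c_pos > h or c_neg > h:
            alarms.append(i)
            c_pos = c_neg = 0.0                  # restart after an alarm
    return alarms
```

The per-sample alarm indicators produced this way across all water quality signals are exactly the kind of features a downstream Random Forest classifier can combine into process-level event predictions.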