
    System identification and structural health monitoring of bridge structures

    This research study addresses two issues in the identification of the structural characteristics of civil infrastructure systems. The first concerns dynamic system identification, by means of experimental and operational modal analysis, applied to a wide variety of bridge structures. Using time- and frequency-domain techniques, mainly with output-only acceleration, velocity or strain data, modal parameters have been estimated for suspension bridges, masonry arch bridges, concrete arch and continuous bridges, and reticular and box-girder steel bridges. After an in-depth overview of standard and advanced stochastic methods, the performance differences of the existing approaches are highlighted during system identification on the different kinds of civil infrastructure. The evaluation of their performance covers both straightforward cases and difficult ones, which gave good results only after advanced clustering analysis. Finally, real-time vibration-based structural health monitoring algorithms are presented, and their performance in structural damage detection using statistical models is assessed. The second issue is the noise-free estimation of the high-order displacements occurring on suspension bridges. After providing a comprehensive treatment of displacement and acceleration data fusion for dynamic systems based on the Kalman filter algorithm, the two kinds of measurements are combined, improving the observed deformation estimates. An exhaustive analysis of smoothed displacement data on a suspension bridge is then presented. The successful tests were subsequently used to define the non-collocated sensor monitoring problem, with an application to a simplified model.
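    The displacement/acceleration fusion described above can be illustrated with a minimal Kalman filter sketch. The state-space model, noise covariances and sampling interval below are assumptions chosen for illustration only, not values from the thesis.

```python
import numpy as np

# Minimal sketch of displacement/acceleration fusion with a Kalman filter,
# assuming a discrete constant-acceleration (random-walk) kinematic model.
# All values are illustrative, not taken from the thesis.

dt = 0.01                                   # sampling interval [s] (assumed)
F = np.array([[1, dt, 0.5 * dt**2],
              [0, 1,  dt         ],
              [0, 0,  1          ]])        # state: [displacement, velocity, acceleration]
H = np.array([[1, 0, 0],
              [0, 0, 1]])                   # measured: displacement and acceleration
Q = 1e-4 * np.eye(3)                        # process noise covariance (assumed)
R = np.diag([1e-3, 1e-2])                   # measurement noise covariance (assumed)

x = np.zeros(3)                             # state estimate
P = np.eye(3)                               # estimate covariance

def kf_step(x, P, z):
    """One predict/update cycle fusing a displacement+acceleration sample z."""
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(3) - K @ H) @ P
    return x, P

# Example: fuse one synthetic sample (displacement = 0.02 m, acceleration = 0.5 m/s^2)
x, P = kf_step(x, P, np.array([0.02, 0.5]))
```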

    Statistical Methods for Semiconductor Manufacturing

    In this thesis, techniques for non-parametric modeling, machine learning, filtering and prediction, and run-to-run control for semiconductor manufacturing are described. In particular, algorithms have been developed for two major application areas:
    - Virtual Metrology (VM) systems;
    - Predictive Maintenance (PdM) systems.
    Both technologies have proliferated in recent years in semiconductor fabrication plants (fabs) in order to increase productivity and decrease costs. VM systems aim at predicting quantities on the wafer, the main and basic product of the semiconductor industry, that may or may not be physically measurable. These quantities are usually 'costly' to measure in economic or temporal terms: the prediction is based on process variables and/or logistic information on the production that are instead always available and can be used for modeling without further costs. PdM systems, on the other hand, aim at predicting when a maintenance action has to be performed. This approach to maintenance management, based like VM on statistical methods and on the availability of process/logistic data, contrasts with other classical approaches:
    - Run-to-Failure (R2F), where no interventions are performed on the machine/process until a breakage or specification violation occurs in production;
    - Preventive Maintenance (PvM), where maintenance is scheduled in advance based on time intervals or production iterations.
    Neither approach is optimal, because they do not ensure that breakdowns and wafer scrap will not happen and, in the case of PvM, they may lead to unnecessary maintenance without fully exploiting the lifetime of the machine or of the process. The main goal of this thesis is to prove, through several applications and feasibility studies, that the use of statistical modeling algorithms and control systems can improve the efficiency, yield and profits of a manufacturing environment like the semiconductor one, where large amounts of data are recorded and can be employed to build mathematical models. We present several original contributions, both in the form of applications and of methods. The introduction of this thesis gives an overview of the semiconductor fabrication process: the most common practices in Advanced Process Control (APC) systems and the major issues for engineers and statisticians working in this area are presented. Furthermore, we illustrate the methods and mathematical models used in the applications. We then discuss in detail the following applications:
    - A VM system for the estimation of the thickness deposited on the wafer by the Chemical Vapor Deposition (CVD) process, exploiting Fault Detection and Classification (FDC) data. In this tool a new clustering algorithm based on Information Theory (IT) elements has been proposed. In addition, the Least Angle Regression (LARS) algorithm has been applied for the first time to VM problems.
    - A new VM module for a multi-step (CVD, Etching and Lithography) line, where Multi-Task Learning techniques have been employed.
    - A new machine learning algorithm based on Kernel Methods for the estimation of scalar outputs from time-series inputs.
    - Run-to-Run control algorithms that exploit both physical measurements and statistical ones (coming from a VM system); this tool is based on IT elements.
    - A PdM module based on filtering and prediction techniques (Kalman filter, Monte Carlo methods) for the prediction of maintenance interventions in the Epitaxy process.
    - A PdM system based on Elastic Nets for maintenance prediction in an Ion Implantation tool.
    Several of the aforementioned works have been developed in collaboration with major European semiconductor companies in the framework of the European project UE FP7 IMPROVE (Implementing Manufacturing science solutions to increase equiPment pROductiVity and fab pErformance); these collaborations are specified throughout the thesis, underlining the practical aspects of implementing the proposed technologies in a real industrial environment.
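    As an illustration of the sparse regression methods mentioned above for VM and PdM (LARS and Elastic Nets), the following sketch fits both to synthetic FDC-style data with scikit-learn; all data, dimensions and hyperparameters are invented for the example and do not come from the thesis.

```python
import numpy as np
from sklearn.linear_model import Lars, ElasticNet
from sklearn.model_selection import train_test_split

# Illustrative only: synthetic FDC-style features and a "thickness" target.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 40))                                   # process/FDC variables per wafer (synthetic)
y = X[:, :5] @ rng.normal(size=5) + 0.1 * rng.normal(size=500)   # synthetic target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

vm_lars = Lars(n_nonzero_coefs=10).fit(X_tr, y_tr)   # sparse VM-style model (LARS)
enet = ElasticNet(alpha=0.05).fit(X_tr, y_tr)        # Elastic Net, the family used for the PdM module

print("LARS R^2:      ", vm_lars.score(X_te, y_te))
print("ElasticNet R^2:", enet.score(X_te, y_te))
```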

    Data-driven Soft Sensors in the Process Industry

    In the last two decades, Soft Sensors have established themselves as a valuable alternative to traditional means for the acquisition of critical process variables, process monitoring, and other tasks related to process control. This paper discusses characteristics of process industry data which are critical for the development of data-driven Soft Sensors. These characteristics are common to a large number of process industry fields, such as the chemical, bioprocess and steel industries. The focus of this work is on data-driven Soft Sensors because of their growing popularity, demonstrated usefulness and large, though not yet fully realised, potential. The main contributions of this work are a comprehensive selection of case studies covering the three most important Soft Sensor application fields, a general introduction to the most popular Soft Sensor modelling techniques, and a discussion of some open issues in Soft Sensor development and maintenance together with their possible solutions.

    Supervisory Wireless Control for Critical Industrial Applications


    State estimators in soft sensing and sensor fusion for sustainable manufacturing

    State estimators, including observers and Bayesian filters, are a class of model-based algorithms for estimating variables in a dynamical system given sensor measurements of related system states. They can be used to derive fast and accurate estimates of system variables which cannot be measured directly ('soft sensing') or for which only noisy, intermittent, delayed, indirect or unreliable measurements are available, perhaps from multiple sources ('sensor fusion'). In this paper we introduce the concepts and main methods of state estimation and review recent applications in improving the sustainability of manufacturing processes. It is shown that state estimation algorithms can play a key role in manufacturing systems to accurately monitor and control processes to improve efficiencies, lower environmental impact, enhance product quality, improve the feasibility of processing more sustainable raw materials, and ensure safer working environments for humans. We discuss current and emerging trends in using state estimation as a framework for combining physical knowledge with other sources of data for monitoring and control of distributed manufacturing systems.
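    A minimal sketch of one of the estimator classes discussed, a Luenberger observer used as a soft sensor, is given below; the plant model, observer poles and signals are assumptions chosen for illustration, not taken from the paper.

```python
import numpy as np
from scipy.signal import place_poles

# Minimal sketch of a Luenberger observer used as a soft sensor:
# estimate an unmeasured state from a measured output.
# The model below is illustrative, not from the paper.

A = np.array([[0.0, 1.0],
              [-2.0, -0.5]])     # assumed process dynamics
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])       # only the first state is measured

# Place observer poles faster than the plant poles (design choice).
L = place_poles(A.T, C.T, [-4.0, -5.0]).gain_matrix.T

def observer_step(x_hat, u, y, dt=0.01):
    """Euler-integrate the observer: x_hat' = A x_hat + B u + L (y - C x_hat)."""
    dx = A @ x_hat + B.flatten() * u + (L @ (y - C @ x_hat)).flatten()
    return x_hat + dt * dx

x_hat = np.zeros(2)
x_hat = observer_step(x_hat, u=1.0, y=np.array([0.2]))
print(x_hat)
```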

    Transmission Rate Compression Based on Kalman Filter Using Spatio-temporal Correlation for Wireless Sensor Networks

    Wireless sensor networks (WSNs) composed of spatially distributed autonomous sensor nodes have been applied to a wide variety of applications. Due to the limited energy budget of the sensor nodes and the long-term operation requirement of the network, energy efficiency is a primary concern in almost any application. Radio communication, known as one of the most expensive processes, can be suppressed thanks to temporal and spatial correlations. However, it is a challenge to compress the communication as much as possible while reconstructing the system state with the highest quality. This work proposes the PKF method to compress the transmission rate for cluster-based WSNs, combining a k-step-ahead Kalman predictor with a Kalman filter (KF). It provides the optimal reconstruction solution based on the compressed information of a single node for a linear system. Instead of approximating the noisy raw data, PKF aims to reconstruct the internal state of the system. It achieves data filtering, state estimation, data compression and reconstruction within one KF framework and allows the signal reconstructed from the compressed transmissions to be even more precise than transmitting all of the raw measurements without processing. The second contribution is a detailed analysis of PKF. It not only characterizes the effect of the system parameters on the performance of PKF but also supplies a common framework to analyze the underlying process of prediction-based schemes. The transmission rate and reconstruction quality are expressed as functions of the system parameters, calculated with the aid of the (truncated) multivariate normal (MVN) distribution. A transmission from a node using PKF not only provides the current optimal estimate of the system state, but also indicates the range and the transmission probability of the k-step-ahead prediction at the cluster head. Moreover, one of the prominent results is an explicit expression for the covariance of the doubly truncated MVN distribution. This is the first work that calculates it using the Hessian matrix of the probability density function of an MVN distribution, which improves on the traditional methods based on the moment-generating function and is more general. This contribution is important not only for WSNs but also for other domains, e.g., statistics and economics.
    Based on the above analysis, the PKF method is extended to exploit spatial correlation in multi-node systems without any intra-communication or coordinator. Each leaf node executes a PKF independently. The reconstruction quality is further improved by the cluster head using the received information, which is equivalent to further reducing the transmission rate of each node under a guaranteed reconstruction quality. The optimal reconstruction solution, called Rand-ST, is obtained when the cluster head uses the incomplete information by treating the transmission of each node as random. Rand-ST actually solves the KF fusion problem with colored and randomly transmitted observations, which, to the best of our knowledge, is the first work addressing this problem. It proves that the KF with state augmentation is more accurate than the measurement-differencing approach in this scenario. The suboptimality of Rand-ST, which neglects useful information when the transmission of each node is controlled by PKF, is then analyzed. The heuristic EPKF methods are thereupon proposed to utilize the complete information while solving the nonlinear problem through linear approximations. Compared with the available techniques, EPKF methods not only ensure an error bound on the reconstruction for each node, but also allow nodes to report emergency events in time, which avoids the loss of potentially important information.
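    The prediction-based transmission suppression underlying PKF can be sketched with a scalar example: the node runs a Kalman filter and transmits its estimate only when the sink's model-based prediction would drift beyond a tolerance. The model, noise levels and threshold below are assumed, and the sketch omits the k-step predictor analysis and the truncated-MVN machinery of the actual method.

```python
import numpy as np

# Illustrative sketch of prediction-based transmission suppression in the
# spirit of PKF: the node transmits its KF estimate only when the sink's
# prediction would exceed a tolerance. Model and threshold are assumed.

F, H = 0.98, 1.0                 # scalar state transition / observation (assumed)
Q, R = 0.01, 0.1                 # process / measurement noise variances (assumed)
eps = 0.5                        # reconstruction tolerance at the sink (assumed)

def kf_update(x, P, z):
    # standard scalar Kalman predict + update
    x, P = F * x, F * P * F + Q
    K = P * H / (H * P * H + R)
    return x + K * (z - H * x), (1 - K * H) * P

rng = np.random.default_rng(1)
x_node, P_node = 0.0, 1.0        # node-side filter
x_sink = 0.0                     # sink-side predictor state
transmissions = 0

for t in range(200):
    z = rng.normal(loc=np.sin(0.05 * t), scale=np.sqrt(R))   # synthetic measurement
    x_node, P_node = kf_update(x_node, P_node, z)
    x_sink = F * x_sink                                       # sink predicts ahead
    if abs(x_node - x_sink) > eps:                            # transmit only when needed
        x_sink = x_node
        transmissions += 1

print(f"transmitted {transmissions}/200 samples")
```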

    Improving Satellite Leaf Area Index Estimation Based On Various Integration Methods

    Leaf Area Index (LAI) is an important land surface biophysical variable that is used to characterize vegetation amount and activity. Current satellite LAI products, however, do not satisfy the requirements of the modeling community due to their large uncertainties and frequent missing values. Each LAI product is currently generated from data of only one satellite sensor. There is an urgent need for advanced methods to integrate multiple LAI products to improve their accuracy and completeness for various applications. To meet this need, this study proposes four methods, namely Optimal Interpolation (OI), Bayesian Maximum Entropy (BME), Multi-Resolution Tree (MRT) and Empirical Orthogonal Function (EOF), to integrate multiple LAI products. Three LAI products have been considered in this study: Moderate Resolution Imaging Spectroradiometer (MODIS), Multi-angle Imaging SpectroRadiometer (MISR) and Carbon cYcle and Change in Land Observational Products from an Ensemble of Satellites (CYCLOPES) LAI. As the basis of data integration, this dissertation first validates and intercompares the MODIS and CYCLOPES LAI products and also evaluates their geometric accuracies. The CYCLOPES LAI product has smoother temporal profiles and fewer spatial variations, but tends to produce spurious large errors in winter. The Locally Adjusted Cubic-spline Capping algorithm is revised to smooth multiple years' averages and variances. Although OI-, BME- and MRT-based methods have been used in other fields, this is the first research to employ them in integrating multiple LAI products. This dissertation also presents a new integration method based on EOF to address the large data volume and the inconsistent temporal resolution of the different datasets. High-resolution LAI reference maps generated from ground measurements are used to validate these algorithms. Validation results show that all four methods can fill data gaps and reduce the errors of the existing LAI products. The data gaps are filled with information from adjacent pixels and from the background. These algorithms remove the spurious large temporal and spatial variations of the original LAI products. The combination of multiple satellite products significantly reduces bias: OI and BME reduce the RMSE from 1.0 (MODIS) to 0.7 and reduce the bias from +0.3 (MODIS) and -0.2 (CYCLOPES) to -0.1. MRT produces results similar to OI but with significantly improved efficiency. EOF also yields an RMSE of 0.7 but with zero bias. The limited ground measurement data can hardly prove which method outperforms the others. OI and BME theoretically produce statistically optimal results. BME relaxes OI's linear and Gaussian assumptions and explicitly considers data error, but bears a much higher computational burden. MRT has improved efficiency but needs strict assumptions on the scale transfer function. EOF requires simpler model identification, although it is more "empirical" than "statistical".
    The original contributions of this study mainly include: 1) a new application of several different integration methods to incorporate multiple satellite LAI products in order to reduce uncertainties and improve completeness, 2) an enhancement of the Locally Adjusted Cubic-spline Capping algorithm by revising its end condition, 3) a novel comprehensive comparison of the MODIS C5 LAI product with other satellite products, 4) the development of a new LAI normalization scheme, assuming a linear relationship between measurement error and LAI natural variance, to account for the inconsistency between products, and finally, 5) the creation of a new data integration method based on EOF.
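    As a hedged illustration of the Optimal Interpolation (OI) update named above, the sketch below merges a background LAI field with two satellite observations; the covariances, observation operator and values are invented for the example and are not taken from the dissertation.

```python
import numpy as np

# Minimal sketch of an Optimal Interpolation (OI) analysis step, the kind of
# update used to merge a background LAI field with satellite retrievals.
# All numbers below are illustrative.

x_b = np.array([2.0, 2.5, 3.0])            # background LAI at three pixels
B = 0.3 * np.exp(-np.abs(np.subtract.outer(np.arange(3), np.arange(3))))  # background error covariance
y = np.array([2.8, 2.6])                   # two satellite observations
H = np.array([[1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0]])            # observation operator: pixels 0 and 2 observed
R = 0.2 * np.eye(2)                        # observation error covariance

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # OI gain
x_a = x_b + K @ (y - H @ x_b)                  # analysis (merged LAI field)
print(x_a)
```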

    Calibration and characterization of a low-cost wireless sensor for applications in CNC end milling

    Central to creating a smart machining system is the challenge of collecting detailed information about the milling process at the tool tip. This work discusses the design, static calibration, dynamic characterization, and implementation of a low-cost wireless sensor for end milling. Our novel strain-based sensor, called the Smart Tool, is shown to perform well in a laboratory setting, with accuracy and dynamic behavior comparable to those of a Kistler 3-axis force dynamometer. The Smart Tool is capable of measuring static loads with a total measurement uncertainty of less than 3 percent of full scale, but has a natural frequency of approximately 630 Hz. For this reason, signal conditioning of the strain signal is required when vibrations are large. Several signal processing techniques are investigated to show that the sensor is useful for force estimation, chatter prediction, force model calibration, and dynamic parameter identification. The presented techniques include a discussion of the Kalman filter and the Wiener filter for signal enhancement, Linear Predictive Coding for system identification, model-based filtering for force estimation, and sub-optimal linear filters for removing forced vibrations.
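    A small sketch of the kind of signal enhancement mentioned above (here a Wiener filter applied to a noisy strain/force signal) is given below; the synthetic signal, sampling rate and window size are assumptions, not Smart Tool data.

```python
import numpy as np
from scipy.signal import wiener

# Illustrative sketch: Wiener filtering a noisy strain/force signal to suppress
# measurement noise before force estimation. Signal and noise levels are assumed.

fs = 5000                                    # sampling rate [Hz] (assumed)
t = np.arange(0, 0.2, 1 / fs)
force = 100 * np.maximum(np.sin(2 * np.pi * 50 * t), 0)   # synthetic milling-like force
noisy = force + np.random.default_rng(2).normal(scale=10, size=t.size)

enhanced = wiener(noisy, mysize=31)          # local adaptive Wiener filter
print("noise std before/after:",
      np.std(noisy - force).round(2), np.std(enhanced - force).round(2))
```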

    Development of monitoring and control systems for biotechnological processes

    The field of biotechnology represents an important research area that has gained increasing success in recent times. Characterized by the involvement of biological organisms in manufacturing processes, its areas of application are broad and include pharmaceuticals, agri-food, energy, and even waste treatment. The involvement of living microorganisms is the common element in all bioprocesses. Cell cultivation is undoubtedly the key step: it requires maintaining environmental conditions within precise, well-defined ranges, which has a significant impact on the process yield and thus on the desired product quality. The apparatus in which this process occurs is the bioreactor. Unfortunately, monitoring and controlling these processes can be a challenging task because of the complexity of the cell growth phenomenon and the limited number of variables that can be monitored in real time. The thesis presented here focuses on the monitoring and control of biotechnological processes, more specifically the production of bioethanol by fermentation of sugars using yeasts. The study addresses several issues related to the monitoring and control of the bioreactor in which the fermentation takes place. First, the lack of proper sensors capable of providing online measurements of key variables (biomass, substrate, product) is investigated. For this purpose, nonlinear estimation techniques are analyzed to reconstruct unmeasurable states. In particular, the geometric observer approach is applied to select the best estimation structure, and a comparison with the extended Kalman filter is then reported. Both proposed estimators demonstrate good estimation capabilities as input model parameters vary. Guaranteeing the achievement of the desired ethanol composition is the main goal of bioreactor control. To this end, different control strategies, evaluated for three different scenarios, are analyzed. The results show that the MIMO system, together with an estimator for ethanol composition, ensures compliance with product quality. After analyzing these difficulties through numerical simulations, this research work shifts to testing a specific biotechnological process: manufacturing bioethanol from brewery's spent grain (BSG) as a renewable waste biomass. Both the acid pre-treatment, which is necessary to release sugars, and the fermentation are optimized. Results show that a glucose yield of 18.12 per 100 g of dried biomass is obtained when the pre-treatment step is performed under optimized conditions (0.37 M H2SO4, 10% S-L ratio). Regarding the fermentation, T=25°C, pH=4.5, and an inoculum volume of 12.25% v/v are selected as the best conditions, at which an ethanol yield of 82.67% with respect to the theoretical one is obtained. As a final step, the use of Raman spectroscopy combined with chemometric techniques such as Partial Least Squares (PLS) analysis is evaluated to develop an online sensor for fermentation process monitoring. The results show that the type of biomass involved significantly affects the acquired spectra, making them noisy and difficult to interpret. This represents a nontrivial limitation of the applied methodology, for which more experimental data and more robust statistical techniques could be helpful.
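    As an illustration of the PLS-based chemometric calibration mentioned above for Raman monitoring, the following sketch cross-validates a PLS model on synthetic spectra; the data shapes, peak model and concentrations are invented and do not reflect the experimental spectra in the thesis.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

# Illustrative sketch of a PLS calibration of the kind described for Raman
# monitoring: map synthetic "spectra" to an ethanol concentration.
rng = np.random.default_rng(3)
n_samples, n_wavenumbers = 60, 800
ethanol = rng.uniform(0, 80, n_samples)                     # g/L (synthetic)
peaks = np.exp(-0.5 * ((np.arange(n_wavenumbers) - 400) / 20) ** 2)
spectra = np.outer(ethanol, peaks) + rng.normal(scale=0.5, size=(n_samples, n_wavenumbers))

pls = PLSRegression(n_components=3)
scores = cross_val_score(pls, spectra, ethanol, cv=5, scoring="r2")
print("cross-validated R^2:", scores.mean().round(3))
```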