Gaussian process regression for virtual metrology of microchip quality and the resulting strategic sampling scheme
Manufacturing of integrated circuits involves many sequential processes, often executed to nanoscale tolerances, and the yield depends on the often unmeasured quality of intermediate steps. In the high-throughput industry of fabricating microelectronics on semi-conducting wafers, scheduling measurements of product quality before the electrical test of the complete IC can be expensive. We therefore seek to predict metrics of product quality based on sensor readings describing the environment within the relevant tool during the processing of each wafer, or to apply the concept of virtual metrology (VM) to monitor these intermediate steps. We model the data using Gaussian process regression (GPR), adapted to simultaneously learn the nonlinear dynamics that govern the quality characteristic, as well as their operating space, expressed by a linear embedding of the sensor traces’ features. Such Bayesian models predict a distribution for the target metric, such as a critical dimension, so one may assess the model’s credibility through its predictive uncertainty. Assuming measurements of the quality characteristic of interest are budgeted, we seek to hasten convergence of the GPR model to a credible form through an active sampling scheme, whereby the predictive uncertainty informs which wafer’s quality to measure next. We evaluate this convergence when predicting and updating online, as if in a factory, using a large dataset for plasma-enhanced chemical vapor deposition (PECVD), with measured thicknesses for ~32,000 wafers. By approximately optimizing the information extracted from this seemingly repetitive data describing a tightly controlled process, GPR achieves ~10% greater accuracy on average than a baseline linear model based on partial least squares (PLS). In a derivative study, we seek to discern the degree of drift in the process over the several months the data spans.
We express this drift by how unusual the relevant features, as embedded by the GPR model, appear as the inputs compensate for degrading conditions. This method detects the onset of consistently unusual behavior that extends to a bimodal thickness fault, anticipating its flagging by as much as two days.
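As an illustration of the active sampling scheme this abstract describes, the following is a minimal sketch of an uncertainty-driven measurement loop using scikit-learn's Gaussian process regression. The data, features, and budget are synthetic stand-ins, not the paper's PECVD dataset or its learned embedding.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                       # stand-in sensor-trace features per wafer
y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=200)   # stand-in quality metric (e.g. thickness)

measured = list(range(10))                          # wafers whose quality has been measured
candidates = set(range(200)) - set(measured)

kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01)
for _ in range(20):                                 # measurement budget
    gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gpr.fit(X[measured], y[measured])
    idx = np.array(sorted(candidates))
    _, std = gpr.predict(X[idx], return_std=True)
    pick = int(idx[np.argmax(std)])                 # most uncertain wafer: measure it next
    measured.append(pick)
    candidates.remove(pick)
```

Each iteration retrains the GPR on the measured wafers and spends the next measurement where the predictive standard deviation is largest, which is the mechanism the abstract credits for hastening convergence.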
Survey of Distributed Decision
We survey the recent distributed computing literature on checking whether a
given distributed system configuration satisfies a given boolean predicate,
i.e., whether the configuration is legal or illegal w.r.t. that predicate. We
consider classical distributed computing environments, including mostly
synchronous fault-free network computing (LOCAL and CONGEST models), but also
asynchronous crash-prone shared-memory computing (WAIT-FREE model), and mobile
computing (FSYNC model).
A Survey on Security and Privacy of 5G Technologies: Potential Solutions, Recent Advancements, and Future Directions
Security has become a primary concern in the telecommunications industry today, as attacks can have severe consequences. In particular, as core and enabling technologies are incorporated into the 5G network, confidential information will move across all layers of future wireless systems. Several incidents have revealed that the hazard posed by a compromised wireless network not only raises security and privacy concerns but also disrupts the complex dynamics of the communications ecosystem. Consequently, the complexity and strength of security attacks have increased in the recent past, making the detection and prevention of sabotage a global challenge. From the security and privacy perspectives, this paper presents a comprehensive account of the core and enabling technologies used to build the 5G security model, including network softwarization security, physical (PHY) layer security, and 5G privacy concerns, among others. Additionally, the paper discusses security monitoring and management of 5G networks. It also evaluates the related security measures and standards of core 5G technologies by resorting to different standardization bodies, and provides a brief overview of 5G security standardization efforts. Furthermore, key projects of international significance, in line with the security concerns of 5G and beyond, are presented. Finally, a section on future directions and open challenges is included to encourage future research.
Method of Lines and Runge-Kutta method in solving the partial differential equation for the heat equation
Solving the differential equation for Newton’s cooling law has mostly relied on a number of methods developed over a long period. However, stiff problems cannot be solved efficiently by some of these methods. This research attempts to overcome such problems and compares results from two classes of numerical methods for heat equation problems. The heat (diffusion) equation, an example of a parabolic equation, is classified as a partial differential equation. Two classes of numerical methods, the Method of Lines and Runge-Kutta, are applied and discussed. The development, analysis, and implementation were carried out using the Matlab language, with graphs exhibited to highlight the accuracy and efficiency of the numerical methods. The solutions show that better accuracy is achieved by the combined Method of Lines and Runge-Kutta method.
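As a concrete illustration of the combination this abstract describes, here is a minimal sketch (in Python rather than the paper's Matlab) of the Method of Lines with a classical fourth-order Runge-Kutta integrator for the 1D heat equation. Grid size, diffusivity, and time step are illustrative choices, not the paper's settings.

```python
import numpy as np

# Method of Lines for u_t = alpha * u_xx on [0, 1] with u(0)=u(1)=0:
# discretise space with central differences, then integrate the resulting
# ODE system with the classical fourth-order Runge-Kutta method.
alpha, n = 0.01, 51
dx = 1.0 / (n - 1)
x = np.linspace(0.0, 1.0, n)
u = np.sin(np.pi * x)                  # initial temperature profile

def rhs(u):
    du = np.zeros_like(u)
    du[1:-1] = alpha * (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2
    return du                          # boundary entries stay zero (Dirichlet)

dt, t_end = 1e-3, 1.0
for _ in range(int(t_end / dt)):       # classical RK4 step
    k1 = rhs(u)
    k2 = rhs(u + 0.5 * dt * k1)
    k3 = rhs(u + 0.5 * dt * k2)
    k4 = rhs(u + dt * k3)
    u = u + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# exact solution for this initial condition: sin(pi x) * exp(-alpha * pi^2 * t)
exact = np.sin(np.pi * x) * np.exp(-alpha * np.pi**2 * t_end)
err = np.max(np.abs(u - exact))
```

Comparing against the known exact solution makes the accuracy of the semi-discretisation plus RK4 combination directly measurable.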
The design of micro-processors for digital protection of power systems
ADVANCES ON BILINEAR MODELING OF BIOCHEMICAL BATCH PROCESSES
This thesis aims to study the implications of the statistical modeling approaches proposed for the bilinear modeling of batch processes, to develop new techniques to overcome some of the problems that have not yet been solved, and to apply them to data from biochemical processes. The study, discussion, and development of the new methods revolve around the four steps of the modeling cycle, from the alignment, preprocessing, and calibration of batch data to the monitoring of batch trajectories. Special attention is given to the problem of batch synchronization and its effect on modeling from different angles.
The manuscript has been divided into four blocks. First, a state-of-the-art review of latent-structure-based models in continuous and batch processes and of traditional univariate and multivariate statistical process control systems is carried out.
The second block of the thesis is devoted to the preprocessing of batch data, in particular to the equalization and synchronization of batch trajectories. The first section addresses the problem of the lack of equalization in the variable trajectories. The different types of unequalization scenarios that practitioners might find in batch processes are discussed, and solutions to equalize batch data are introduced. In the second section, a theoretical study of the nature of batch processes and of the synchronization of batch trajectories as a prior step to bilinear modeling is carried out. The topics under discussion are (i) whether the same synchronization approach must be applied to batch data in the presence of different types of asynchronisms, and (ii) whether synchronization is always required even when the length of the variable trajectories is constant across batches. To answer these questions, a thorough study of the most common types of asynchronisms that may be found in batch data is performed. Furthermore, two new synchronization techniques are proposed to solve the current problems in post-batch and real-time synchronization. To improve fault detection and classification, new unsupervised control charts and supervised fault classifiers based on the information generated by batch synchronization are also proposed.
In the third block of the manuscript, the parameter stability associated with the most widely used synchronization methods and with principal component analysis (PCA)-based batch multivariate statistical process control methods is investigated. The results of this study reveal that the accuracy of batch synchronization has a profound impact on the stability of the PCA model parameters. The parameter stability is also closely related to the type of preprocessing performed on the batch data, and to the type of model and unfolding used to transform the three-way data structure into two-way. Parameter stability, the source of variability remaining after preprocessing, and the process dynamics should be balanced in such a way that the multivariate statistical models are accurate in fault detection and diagnosis and/or in online prediction.
Finally, the fourth block introduces a user-friendly graphical interface, developed in Matlab, for batch process understanding and monitoring. To perform multivariate analysis, the latest developments in process chemometrics, including the methods proposed in this thesis, are implemented.
González Martínez, JM. (2015). ADVANCES ON BILINEAR MODELING OF BIOCHEMICAL BATCH PROCESSES [Doctoral thesis]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/55684
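As background to the PCA-based batch monitoring this thesis builds on, the following is an illustrative sketch of Hotelling's T² and SPE (Q) statistics computed from a PCA model of batch-wise-unfolded data. The data, dimensions, and number of components are synthetic assumptions, not the thesis' datasets.

```python
import numpy as np

# PCA-based monitoring sketch: fit PCA on unfolded batch data, then score
# a new observation with Hotelling's T2 and the squared prediction error.
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 40))          # 60 batches x (variables * time points), unfolded
X = (X - X.mean(axis=0)) / X.std(axis=0)   # mean-centre and scale

U, s, Vt = np.linalg.svd(X, full_matrices=False)
a = 3                                  # retained principal components
P = Vt[:a].T                           # loadings
lam = (s[:a] ** 2) / (X.shape[0] - 1)  # score variances per component

def monitor(x_new):
    t = x_new @ P                      # project onto the model plane
    t2 = np.sum(t**2 / lam)            # Hotelling's T2 (variation within the model)
    resid = x_new - t @ P.T
    spe = np.sum(resid**2)             # SPE / Q (variation off the model plane)
    return t2, spe

t2, spe = monitor(X[0])
```

In practice, control limits for T² and SPE are derived from their reference distributions, and exceedances flag abnormal batches.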
Improved Quantification of Important Beer Quality Parameters based on Non-linear Calibration Methods applied to FT-MIR Spectra
During the production process of beer, it is of the utmost importance to guarantee high consistency of the beer quality. For instance, bitterness is an essential quality parameter which has to be controlled within specifications from the beginning of the production process, in the unfermented beer (wort) as well as in final products such as beer and beer mix beverages. Nowadays, analytical techniques for quality control in beer production are mainly based on manual supervision, i.e., samples are taken from the process and analyzed in the laboratory. This typically requires significant lab technician effort for only a small fraction of samples to be analyzed, which leads to significant costs for breweries and beer companies. Fourier transform mid-infrared (FT-MIR) spectroscopy was used in combination with non-linear multivariate calibration techniques to overcome (i) the time-consuming off-line analyses in beer production and (ii) the known limitations of standard linear chemometric methods, like partial least squares (PLS), for important quality parameters [1][2] such as bitterness, citric acid, total acids, free amino nitrogen, final attenuation, and foam stability. The calibration models are established with enhanced non-linear techniques based (i) on a new piecewise-linear version of PLS, which employs fuzzy rules for locally partitioning the latent variable space, and (ii) on extensions of support vector regression variants (ε-PLSSVR and ν-PLSSVR) that overcome high computation times in high-dimensional problems and the time-intensive, often inappropriate setting of the kernel parameters. Furthermore, we introduce a new model selection scheme based on bagged ensembles in order to improve robustness and thus the predictive quality of the final models.
The approaches are tested on real-world calibration data sets for wort and beer mix beverages and compared successfully to linear methods, showing a clear outperformance in most cases and meeting the model quality requirements defined by the experts at the beer company.
Low complexity physical layer security approach for 5G internet of things
Fifth-generation (5G) massive machine-type communication (mMTC) is expected to support the cellular adaptation of internet of things (IoT) applications for massive connectivity. Due to its massive access nature, IoT is prone to a high interception probability, and the use of conventional cryptographic techniques in these scenarios is not practical considering the limited computational capabilities of IoT devices and their power budget. This calls for a lightweight physical layer security scheme which provides security without much computational overhead and/or strengthens the existing security measures. Here, a shift-based physical layer security approach is proposed which provides low-complexity security, without many changes to the baseline orthogonal frequency division multiple access (OFDMA) architecture, as per the low-power requirements of IoT, by systematically rearranging the subcarriers. While the scheme is compatible with most fast Fourier transform (FFT) based waveform contenders being proposed for 5G, especially in mMTC and ultra-reliable low latency communication (URLLC), it can also add an additional layer of security at the physical layer to enhanced mobile broadband (eMBB).
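A toy model of the key idea, a secret-seeded rearrangement of subcarriers before the IFFT that a legitimate receiver can undo, can be sketched as follows. The shift rule, seed, and subcarrier count are hypothetical simplifications, not the paper's exact scheme.

```python
import numpy as np

# Shift-based subcarrier rearrangement: a shared secret seed derives a
# cyclic shift applied to the subcarrier mapping before OFDM modulation.
# An eavesdropper without the seed demaps the wrong subcarriers.
n_sc = 64
rng = np.random.default_rng(12345)          # shared secret seed (illustrative)
shift = int(rng.integers(1, n_sc))          # key-derived cyclic shift

# QPSK symbols on the subcarriers
symbols = (rng.choice([-1, 1], n_sc) + 1j * rng.choice([-1, 1], n_sc)) / np.sqrt(2)

tx_freq = np.roll(symbols, shift)           # rearrange subcarriers (the "shift")
tx_time = np.fft.ifft(tx_freq)              # OFDM modulation

rx_freq = np.fft.fft(tx_time)               # legitimate receiver's FFT
recovered = np.roll(rx_freq, -shift)        # undo the secret shift
```

Because the operation is a single index permutation, it adds essentially no computational overhead to the FFT-based transceiver chain, which is the low-complexity property the abstract emphasises.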
Virtual metrology for plasma etch processes.
Plasma processes can present difficult control challenges due to time-varying dynamics
and a lack of relevant and/or regular measurements. Virtual metrology (VM) is the
use of mathematical models with accessible measurements from an operating process to
estimate variables of interest. This thesis addresses the challenge of virtual metrology
for plasma processes, with a particular focus on semiconductor plasma etch.
Introductory material covering the essentials of plasma physics, plasma etching, plasma
measurement techniques, and black-box modelling techniques is first presented for readers
not familiar with these subjects. A comprehensive literature review is then completed
to detail the state of the art in modelling and VM research for plasma etch processes.
To demonstrate the versatility of VM, a temperature monitoring system utilising a
state-space model and Luenberger observer is designed for the variable specific impulse
magnetoplasma rocket (VASIMR) engine, a plasma-based space propulsion system. The
temperature monitoring system uses optical emission spectroscopy (OES) measurements
from the VASIMR engine plasma to correct temperature estimates in the presence of
modelling error and inaccurate initial conditions. Temperature estimates within 2% of
the real values are achieved using this scheme.
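The observer-based correction described here can be sketched with a minimal discrete-time example: a Luenberger observer drives a badly initialised temperature estimate toward the true state using the measurement residual. The first-order thermal model and gains below are illustrative, not the thesis' VASIMR model.

```python
import numpy as np

# Scalar discrete-time Luenberger observer:
#   x_hat[k+1] = A*x_hat + B*u + L*(y - C*x_hat)
# The estimation error decays as (A - L*C)^k, so it converges despite a
# wrong initial condition, as long as |A - L*C| < 1.
A, B, C = 0.95, 0.5, 1.0               # illustrative first-order thermal model
L = 0.4                                # observer gain: |A - L*C| = 0.55 < 1

x_true, x_hat = 100.0, 0.0             # true temperature vs. bad initial estimate
for _ in range(100):
    u = 1.0                            # heater/input signal
    y = C * x_true                     # measurement (e.g. OES-derived temperature)
    x_hat = A * x_hat + B * u + L * (y - C * x_hat)
    x_true = A * x_true + B * u

err = abs(x_true - x_hat)
```

The same structure extends to the vector case, where L is chosen to place the eigenvalues of (A - LC) inside the unit circle.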
An extensive examination of the implementation of a wafer-to-wafer VM scheme to estimate
plasma etch rate for an industrial plasma etch process is presented. The VM
models estimate etch rate using measurements from the processing tool and a plasma
impedance monitor (PIM). A selection of modelling techniques are considered for VM
modelling, and Gaussian process regression (GPR) is applied for the first time for VM
of plasma etch rate. Models with global and local scope are compared, and modelling
schemes that attempt to cater for the etch process dynamics are proposed. GPR-based
windowed models produce the most accurate estimates, achieving mean absolute percentage
errors (MAPEs) of approximately 1.15%. The consistency of the results presented
suggests that this level of accuracy represents the best accuracy achievable for
the plasma etch system at the current frequency of metrology.
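A windowed GPR model of the kind described above can be sketched as follows: retrain on the most recent w measured wafers, then estimate the next wafer's etch rate. The features, etch-rate relation, and window size are synthetic stand-ins for the tool and plasma impedance monitor (PIM) data.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(3)
X = rng.normal(size=(150, 4))              # stand-in tool/PIM measurements per wafer
rate = 2.0 + 0.3 * X[:, 0] + 0.1 * np.sin(X[:, 1]) + 0.01 * rng.normal(size=150)

w = 30                                     # sliding window of recent measured wafers
kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-3)
preds, actuals = [], []
for k in range(w, 150):
    gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gpr.fit(X[k - w:k], rate[k - w:k])     # retrain on the window only
    preds.append(gpr.predict(X[k:k + 1])[0])
    actuals.append(rate[k])

preds, actuals = np.array(preds), np.array(actuals)
mape = 100 * np.mean(np.abs((actuals - preds) / actuals))
```

Restricting training to a recent window is one simple way to track slowly drifting process dynamics, which is the motivation for the windowed schemes compared in the thesis.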
Finally, a real-time VM and model predictive control (MPC) scheme for control of
plasma electron density in an industrial etch chamber is designed and tested. The VM
scheme uses PIM measurements to estimate electron density in real time. A predictive
functional control (PFC) scheme is implemented to cater for a time delay in the VM
system. The controller achieves time constants of less than one second, no overshoot,
and excellent disturbance rejection properties. The PFC scheme is further expanded by
adapting the internal model in the controller in real time in response to changes in the
process operating point.
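The predictive functional control idea can be illustrated with a minimal first-order example: the controller uses an internal model to compute the input that closes a chosen fraction of the gap to the set-point over a coincidence horizon. The plant, model, and tuning values below are illustrative, not the thesis' electron-density loop.

```python
# First-order PFC sketch: internal model y(k+1) = a*y + K*(1-a)*u.
# The control law makes the model's predicted increment over horizon h
# match the reference-trajectory increment (1 - lam**h) * (sp - y_p).
a, K = 0.9, 1.0                        # internal model parameters (illustrative)
lam, h = 0.8, 5                        # reference-trajectory decay and horizon
sp = 1.0                               # set-point (e.g. target electron density)

y_p, y_m = 0.0, 0.0                    # plant output and internal-model output
for _ in range(100):
    num = (sp - y_p) * (1 - lam**h) + (1 - a**h) * y_m
    u = num / (K * (1 - a**h))         # PFC control law for a first-order model
    y_p = a * y_p + K * (1 - a) * u    # plant assumed equal to the model here
    y_m = a * y_m + K * (1 - a) * u    # internal model runs in parallel

err = abs(sp - y_p)
```

Adapting the internal model parameters (a, K) online, as the thesis describes for changing operating points, keeps this control law consistent with the current plant behaviour.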