402 research outputs found

    Discharge Moisture Prediction of the Corn Gluten Feed Drying Process Using Machine Learning Algorithms

    Modern manufacturing processes have multiple sensors (or instruments) installed that provide constant data stream outputs; however, for some critical performance and quality variables, installing physical sensors is impractical, expensive, insufficiently robust for hostile environments, or the sensor technology is not yet sufficiently advanced. An example of such a problem is measuring the moisture of solid products in real time. In this scenario, Machine Learning (ML) approaches are a suitable solution, as they are capable of learning and representing complex relationships. ML algorithms establish a mathematical relationship between the quantity of interest and other measurable quantities, such as readings from already available sensors (e.g., SCADA systems, historian software, SQL databases). This study details how ML algorithms (such as Multiple Linear Regression, Support Vector Machine Regression and Regression Trees) are used to predict the critical moisture variable in gluten feed (a by-product of the wet-milling of maize grain for starch or ethanol production), providing a simple, robust and fast solution to the lack of real-time information on this variable for a corn products manufacturer. The resulting model performance demonstrates the feasibility of the ML approach to predicting moisture behaviour
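
As a flavour of the simplest of the three algorithms mentioned, the sketch below fits a multiple linear regression (via ordinary least squares) mapping synthetic dryer-sensor readings to a moisture target. The sensor names, value ranges and coefficients are invented for illustration and are not the manufacturer's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical historian readings: inlet temperature, air flow, feed rate
X = rng.uniform([80.0, 1.0, 5.0], [120.0, 3.0, 15.0], size=(200, 3))

# Synthetic ground-truth moisture: linear in the sensors plus measurement noise
true_w = np.array([-0.05, -1.2, 0.8])
y = 30.0 + X @ true_w + rng.normal(0.0, 0.2, size=200)

# Multiple linear regression via ordinary least squares
A = np.column_stack([np.ones(len(X)), X])     # prepend an intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

pred = A @ coef
rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
```

In practice the same soft-sensor interface can swap in support vector regression or a regression tree; only the fitted model changes, not the surrounding data pipeline.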

    Computational Intelligence Techniques for Control and Optimization of Wastewater Treatment Plants

    The development of novel, practice-oriented and reliable instrumentation and control strategies for wastewater treatment plants (WWTPs), in order to improve energy efficiency while guaranteeing process stability and the maintenance of high cleaning capacity, has become a priority for WWTP operators due to increasing treatment costs. To achieve these ambitious and even contradictory objectives, this thesis investigates a combination of online measurement systems, computational intelligence and machine learning methods, as well as dynamic simulation models. After introducing the state of the art in the fields of WWTP operation, process monitoring and control, three novel computational-intelligence-enabled instrumentation, control and automation (ICA) methods are developed and presented, and their potential for practical implementation is assessed. The methods are, on the one hand, the automated calibration of a simulation model for the Rospe WWTP, which provides a basis for the development and evaluation of the subsequent methods, and, on the other hand, the development of soft sensors for the WWTP inflow that estimate the crucial process variables COD and NH4-N, and the estimation of WWTP operating states using Self-Organising Maps (SOM), which are used to determine the optimal control parameters for each state. Collectively, these provide the basis for comprehensive WWTP optimization. Results show that energy consumption and cleaning capacity can be improved by more than 50%
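
The SOM-based state estimation described above can be sketched in a few lines: a map of units is trained on process data, and each new sample is assigned to its best matching unit, which acts as a discrete operating state. The following is a minimal pure-numpy sketch on synthetic two-regime data; the data, map size and decay schedules are illustrative assumptions, not the thesis configuration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic two-state operating data standing in for e.g. dry/wet-weather regimes;
# a real application would use measured process variables, not simulated clusters
states = np.vstack([
    rng.normal([0.2, 0.3], 0.05, size=(100, 2)),
    rng.normal([0.8, 0.7], 0.05, size=(100, 2)),
])

# A tiny 1-D SOM with 4 units; published studies typically use larger 2-D maps
n_units, n_iter = 4, 2000
weights = rng.uniform(0.0, 1.0, size=(n_units, 2))
tau = n_iter / 4.0

for t in range(n_iter):
    x = states[rng.integers(len(states))]
    bmu = int(np.argmin(np.linalg.norm(weights - x, axis=1)))  # best matching unit
    lr = 0.5 * np.exp(-t / tau)            # decaying learning rate
    sigma = 2.0 * np.exp(-t / tau)         # shrinking neighbourhood radius
    grid_dist = np.abs(np.arange(n_units) - bmu)
    h = np.exp(-grid_dist**2 / (2.0 * sigma**2))
    weights += lr * h[:, None] * (x - weights)  # pull BMU and neighbours to sample

# Each sample is assigned the unit it activates -> a discrete operating state
labels = np.array([int(np.argmin(np.linalg.norm(weights - x, axis=1)))
                   for x in states])
```

Per-state control parameters can then be stored in a simple lookup keyed by the winning unit.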

    A review of the state-of-the-art wastewater quality characterization and measurement technologies. Is the shift to real-time monitoring nowadays feasible?

    Efficient characterization of wastewater stream quality is vital to ensure the safe discharge or reuse of treated wastewater (WW). There are numerous parameters employed to characterize water quality, some required by directives (e.g. biological oxygen demand (BOD), total nitrogen (TN), total phosphates (TP)), while others are used for process control (e.g. flow, temperature, pH). Well-accepted methods to assess these parameters have traditionally been laboratory-based, taking place either off-line or at-line and presenting a significant delay between sampling and result. Alternative characterization methods can run in-line or on-line and are generally more cost-effective. Unfortunately, these methods are often not accepted when providing information to regulatory bodies. The current review aims to describe the available laboratory-based approaches and compare them with innovative real-time (RT) solutions. Transitioning from laboratory-based to RT measurements means obtaining valuable process data, avoiding time delays, and gaining the possibility to optimize WW treatment management. A variety of sensor categories are examined to illustrate a general framework in which RT applications can replace longer conventional processes, with an eye toward potential drawbacks. A significant enhancement of RT measurements can be achieved through the employment of advanced soft-sensing techniques and the Internet of Things (IoT), coupled with machine learning (ML) and artificial intelligence (AI)

    A Review of Kernel Methods for Feature Extraction in Nonlinear Process Monitoring

    Kernel methods are a class of learning machines for the fast recognition of nonlinear patterns in any data set. In this paper, the applications of kernel methods for feature extraction in industrial process monitoring are systematically reviewed. First, we describe the reasons for using kernel methods and contextualize them among other machine learning tools. Second, by reviewing a total of 230 papers, this work identifies 12 major issues surrounding the use of kernel methods for nonlinear feature extraction. Each issue is discussed as to why it is important and how it has been addressed over the years by many researchers. We also present a breakdown of the commonly used kernel functions, parameter selection routes, and case studies. Lastly, this review provides an outlook into the future of kernel-based process monitoring, which can hopefully instigate more advanced yet practical solutions in the process industries
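
Kernel PCA is the canonical example of such nonlinear feature extraction: a kernel matrix is built, centred in feature space, and eigendecomposed to obtain nonlinear component scores. The review does not provide code, so the following is a minimal numpy sketch of RBF kernel PCA on synthetic ring-shaped data (the data, `gamma` value and component count are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)

# Nonlinear process data: a noisy ring, which linear PCA cannot unfold
theta = rng.uniform(0.0, 2.0 * np.pi, 150)
X = np.column_stack([np.cos(theta), np.sin(theta)]) + rng.normal(0.0, 0.05, (150, 2))

# RBF (Gaussian) kernel matrix; the width gamma is a tuning choice
gamma = 2.0
sq = np.sum(X**2, axis=1)
K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))

# Centre the kernel matrix in feature space
n = len(X)
one = np.full((n, n), 1.0 / n)
Kc = K - one @ K - K @ one + one @ K @ one

# Eigendecomposition of the centred kernel yields the nonlinear components
vals, vecs = np.linalg.eigh(Kc)
vals, vecs = vals[::-1], vecs[:, ::-1]                     # sort descending
scores = vecs[:, :2] * np.sqrt(np.maximum(vals[:2], 0.0))  # first two scores
```

Monitoring statistics (e.g. T² or SPE) are then computed in this kernel feature space exactly as in linear PCA monitoring.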

    Assessment of rainfall influence over water quality at the effluent of an urban catchment by high Temporal resolution measurements

    The present study aimed to establish the influence of rainfall on water quality at the effluent of an urban catchment through high-temporal-resolution measurements. The water quantity and quality parameters at the effluent of the studied urban catchment (Gibraltar: Bogotá, Colombia) were measured by means of infra-red, ion-sensitive and UV-VIS spectrometry sensors (water quality) and an ultrasonic sensor (water quantity: flow rate). Rainfall pulses were registered at three rain gauge stations (Fontibon, Bosa and Casablanca) in the vicinity of the studied catchment. Using signal processing and time-series techniques (median filter, step detection and autocorrelation functions, ACF), a methodology was designed to identify periods in which a given water quality or quantity parameter fluctuates outside its typical dry-weather behaviour. A Principal Components Analysis (PCA) provided a preliminary insight into which water quality and quantity parameters best explain the differences between the variability of dry- and wet-weather data. The most explicative parameters measured with uni-parametric and ultrasonic sensors were: (i) dissolved oxygen (DO), (ii) flow rate, (iii) oxidation-reduction potential (ORP), (iv) turbidity and (v) total suspended solids (TSS). pH and temperature did not appear to differ between dry and rainy weather.
For the measurements obtained by UV-VIS spectrometry, the absorbance values registered at the boundary wavelengths of the UV-VIS range (200 nm to 750 nm) explained the differences between the variability of dry- and wet-weather data better than the wavelengths closer to the centre of the range (around 400 nm). Considering the most explicative parameters from the PCA, the methodology was then applied to the time series registered during rainy days, yielding a temporal delimitation of rainfall events of appropriate consistency. For the identified events, association metrics (Pearson's correlation test, average mutual information, linear models and partial least squares models) were computed between statistical characteristics of the rainfall and of the water quality and quantity per event (e.g. mean, standard deviation, entropy and duration), with the purpose of detecting recurrent relations between rainfall (registered at the different rain gauge stations) and water quality and quantity. The results showed that: (i) turbidity appears to be related to the duration of the rainfall pulses and the mean rainfall height, (ii) DO appears to be associated with the total rainfall volume and the entropy of the rainfall pulses, (iii) flow rate was found to be related to the mean rainfall height, the standard deviation of rainfall heights and the mean rainfall intensity, (iv) TSS behaviour was found to be linked to rainfall characteristics such as the duration, the mean rainfall height, the standard deviation of rainfall heights and the memory of the process (evaluated through the ACF of each rainfall pulse), and (v) ORP appears to be connected with event characteristics such as the duration, the antecedent dry weather period (ADWP), the total rainfall volume, the entropy and the standard deviation of the rainfall pulses.
Rainfall registered at the Bosa rain gauge station appears to be more correlated with the quality and quantity characteristics of the effluent than that registered at the Fontibon and Casablanca stations. Preliminary insights showed that the principal sources of variability in the spatial influence of a given rain gauge station on the different sub-basins of the urban catchment are: (i) the distance to the rain gauge station most influencing a specific sub-basin, (ii) the variability of the pipe slopes in a given sub-basin, (iii) the industrial land use of the sub-basin, (iv) the mean pipe slope of the sub-basin and (v) the variability of pipe lengths in the sub-basin
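
Two of the signal-processing tools named in the methodology, median filtering and autocorrelation, can be sketched compactly: the median filter suppresses isolated sensor spikes while preserving sustained rain-event excursions, and the ACF quantifies the memory of a series. The signal, thresholds and event shape below are illustrative assumptions, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic turbidity-like signal: a stable dry-weather level, a sustained
# rain-event excursion, and a few isolated sensor spikes
signal = 10.0 + rng.normal(0.0, 0.3, 300)
signal[150:180] += 10.0                 # sustained rain-event excursion
signal[[40, 90, 220]] += 15.0           # isolated spikes (sensor artefacts)

def median_filter(x, window=5):
    """Rolling median: removes isolated spikes but keeps sustained steps."""
    half = window // 2
    padded = np.pad(x, half, mode="edge")
    return np.array([np.median(padded[i:i + window]) for i in range(len(x))])

def acf(x, lag):
    """Biased sample autocorrelation at a given lag (always in [-1, 1])."""
    x = x - x.mean()
    return float(np.dot(x[:-lag], x[lag:]) / np.dot(x, x))

smooth = median_filter(signal)
r1 = acf(smooth, 1)                     # short-term memory of the series

# Flag samples far outside the typical dry-weather band
threshold = 3.0 * np.std(signal[:100])
atypical = np.abs(smooth - np.median(smooth)) > threshold
```

The flagged runs of `atypical` samples give a first-pass temporal delimitation of candidate rain events, which step detection would then refine.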

    Monitoring the waste to energy plant using the latest AI methods and tools

    Solid wastes, such as municipal and industrial wastes, present great environmental concerns and challenges all over the world. This has led to the development of innovative waste-to-energy process technologies capable of handling different waste materials in a more sustainable and energy-efficient manner. However, as in many other complex industrial process operations, waste-to-energy plants require sophisticated process monitoring systems in order to realize very high overall plant efficiencies. Conventional data-driven statistical methods, which include principal component analysis, partial least squares, multivariable linear regression and so forth, are normally applied in process monitoring. Recently, however, the latest artificial intelligence (AI) methods, in particular deep learning algorithms, have demonstrated remarkable performance in several important areas such as machine vision, natural language processing and pattern recognition. These new AI algorithms have gained increasing attention in industrial process applications, for instance in areas such as predictive product quality control and machine health monitoring. Moreover, the availability of big-data processing tools and cloud computing technologies further supports the use of deep-learning-based algorithms for process monitoring. In this work, a process monitoring scheme based on state-of-the-art artificial intelligence methods and cloud computing platforms is proposed for a waste-to-energy industrial use case. The monitoring scheme supports the use of the latest AI methods, leveraging big-data processing tools and taking advantage of available cloud computing platforms. Deep learning algorithms are able to describe nonlinear, dynamic and high-dimensional systems better than most conventional data-based process monitoring methods. Moreover, deep-learning-based methods are better suited to big-data analytics than traditional statistical machine learning methods, which are less efficient. 
Furthermore, the proposed monitoring scheme emphasizes real-time process monitoring in addition to offline data analysis. To achieve this, the scheme proposes the use of big-data analytics software frameworks and tools such as Microsoft Azure Stream Analytics, Apache Storm, Apache Spark, Hadoop and many others. The availability of open-source as well as proprietary cloud computing platforms, AI tools and big-data software all supports the realization of the proposed monitoring scheme
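
As a concrete reference point, the conventional PCA monitoring that the text contrasts deep learning against can be sketched in a few lines: fit a PCA model of normal operation, then flag new samples whose squared prediction error (the SPE or Q statistic) exceeds a control limit, i.e. samples that break the learned sensor correlations. Everything below (sensor count, loadings, noise level, control limit) is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(4)

# Two latent factors drive four correlated sensor readings (hypothetical plant data)
T = rng.normal(0.0, 1.0, (500, 2))
P = np.array([[1.0, 0.8, 0.3, 0.1],
              [0.1, 0.2, 0.9, 1.0]])
Xtrain = T @ P + rng.normal(0.0, 0.1, (500, 4))

mu, sd = Xtrain.mean(axis=0), Xtrain.std(axis=0)
Z = (Xtrain - mu) / sd

# PCA model of normal operation: keep the 2 dominant components
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
Pk = Vt[:2].T

def spe(x):
    """Squared prediction error (Q statistic) of one sample vs the PCA model."""
    z = (x - mu) / sd
    resid = z - Pk @ (Pk.T @ z)
    return float(resid @ resid)

# Empirical control limit, e.g. the 99th percentile of the training SPE
limit = np.percentile([spe(x) for x in Xtrain], 99)

normal_sample = T[0] @ P                                          # respects correlations
faulty_sample = normal_sample + np.array([0.0, 0.0, 3.0, -3.0])   # breaks them
```

A deep-learning variant replaces the linear projection with an autoencoder and uses reconstruction error the same way, which is what lets it capture the nonlinear and dynamic behaviour the abstract highlights.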

    Multivariate Analysis in Management, Engineering and the Sciences

    Recently, statistical knowledge has become an important requirement and occupies a prominent position in the exercise of various professions. In the real world, processes generate large volumes of data and are naturally multivariate and, as such, require proper treatment. Under these conditions it is difficult or practically impossible to use methods of univariate statistics. The wide application of multivariate techniques, and the need to spread them more fully in academia and business, justify the creation of this book. The objective is to demonstrate interdisciplinary applications that identify patterns, trends, associations and dependencies in the areas of Management, Engineering and the Sciences. The book is addressed to both practising professionals and researchers in the field

    Business analytics in industry 4.0: a systematic review

    Recently, the term “Industry 4.0” has emerged to characterize several Information and Communication Technology (ICT) adoptions in production processes (e.g., the Internet of Things, implementation of digital production support information technologies). Business Analytics is often used within Industry 4.0, thus incorporating its data intelligence (e.g., statistical analysis, predictive modelling, optimization) expert system component. In this paper, we perform a Systematic Literature Review (SLR) on the usage of Business Analytics within the Industry 4.0 concept, covering a selection of 169 papers obtained from six major scientific publication sources from 2010 to March 2020. The selected papers were first classified into three major types, namely Practical Application, Review and Framework Proposal. Then, we analysed in more detail the practical application studies, which were further divided into the three main categories of the Gartner analytical maturity model: Descriptive Analytics, Predictive Analytics and Prescriptive Analytics. In particular, we characterized the distinct analytics studies in terms of the industry application and data context used, impact (in terms of Technology Readiness Level) and selected data modelling method. Our SLR analysis provides a mapping of how data-based Industry 4.0 expert systems are currently used, disclosing research gaps and future research opportunities. The work of P. Cortez was supported by FCT - Fundação para a Ciência e Tecnologia within the R&D Units Project Scope: UIDB/00319/2020. We would like to thank the three anonymous reviewers for their helpful suggestions