57 research outputs found

    Algorithms for enhancing pattern separability, feature selection and incremental learning with applications to gas sensing electronic nose systems

    Three major issues in pattern recognition and data analysis are addressed in this study and applied to the identification of volatile organic compounds (VOCs) for gas sensing applications. The proposed approaches apply not only to VOC identification but to a wide range of pattern recognition and data analysis problems. In particular, the study investigates (1) enhancing pattern separability for challenging classification problems, (2) optimum feature selection, and (3) incremental learning for neural networks.
    Three approaches are proposed for enhancing pattern separability when classifying closely spaced or possibly overlapping clusters. In the neurofuzzy approach, a fuzzy inference system is developed that considers the dynamic ranges of individual features. Feature range stretching (FRS) is introduced as an alternative that increases intercluster distances by mapping the tight dynamic range of each feature to a wider range through a nonlinear function. Finally, a third approach, nonlinear cluster transformation (NCT), is proposed, which increases intercluster distances while preserving intracluster distances. NCT is shown to achieve comparable or better performance than the other two methods at a fraction of the computational burden. Implementation issues and the relative advantages and disadvantages of the three approaches are systematically investigated.
    Selection of optimum features is addressed using both a decision-tree-based approach and a wrapper approach; a hill-climb search based wrapper is applied to select the optimum features for gas sensing problems.
    Finally, a new method, Learn++, is proposed that gives classification algorithms the capability to learn incrementally from new data, even when the original database is no longer available. The Learn++ algorithm strategically combines an ensemble of classifiers, each trained to learn only a small portion of the pattern space. Learn++ can learn new data even when new classes are introduced, and it features a built-in mechanism for estimating the reliability of its classification decisions. All proposed methods are explained in detail, and simulation results are discussed along with directions for future work.
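The ensemble idea behind Learn++ can be illustrated with a minimal sketch: each new batch of data trains its own weak classifier, and the ensemble combines all members by vote. This is a simplified illustration only; the actual Learn++ algorithm additionally weights training instances and classifier votes by error, which is omitted here, and the threshold classifier and data below are invented for the example.

```python
# Minimal sketch of ensemble-based incremental learning in the spirit of
# Learn++: each data batch trains a fresh weak classifier, and the ensemble
# predicts by plain majority vote. (The real Learn++ uses error-weighted
# instance distributions and weighted voting; data here is illustrative.)

def train_stump(batch):
    """Train a 1-D threshold classifier (decision stump) on (x, label) pairs."""
    best = None
    xs = sorted(x for x, _ in batch)
    for i in range(len(xs) - 1):
        thr = (xs[i] + xs[i + 1]) / 2
        for polarity in (1, -1):
            errors = sum(
                1 for x, y in batch
                if (1 if polarity * (x - thr) > 0 else 0) != y
            )
            if best is None or errors < best[0]:
                best = (errors, thr, polarity)
    _, thr, polarity = best
    return lambda x: 1 if polarity * (x - thr) > 0 else 0

class IncrementalEnsemble:
    """Learns each new batch with a fresh classifier; old data is never revisited."""
    def __init__(self):
        self.members = []

    def learn_batch(self, batch):
        self.members.append(train_stump(batch))

    def predict(self, x):
        votes = sum(clf(x) for clf in self.members)
        return 1 if votes * 2 >= len(self.members) else 0

ensemble = IncrementalEnsemble()
ensemble.learn_batch([(0.1, 0), (0.2, 0), (0.8, 1), (0.9, 1)])
ensemble.learn_batch([(0.3, 0), (0.4, 0), (0.7, 1), (0.95, 1)])  # new data only
print(ensemble.predict(0.15), ensemble.predict(0.85))  # -> 0 1
```

The key property shown is that the second batch is learned without ever touching the first batch again, which is what the thesis means by incremental learning without the original database.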

    Data mining using intelligent systems: an optimized weighted fuzzy decision tree approach

    Data mining aims to analyze observational datasets to find relationships and to present the data in ways that are both understandable and useful. In this thesis, existing intelligent systems techniques such as the Self-Organizing Map, Fuzzy C-Means and decision trees are used to analyze several datasets, providing flexible information processing for real-life situations. The thesis is concerned with the design, implementation, testing and application of these techniques to those datasets. It also introduces a hybrid intelligent systems technique, the Optimized Weighted Fuzzy Decision Tree (OWFDT), with the aim of improving Fuzzy Decision Trees (FDT) and solving practical problems. The proposed OWFDT uses Fuzzy C-Means to fuzzify the input instances while keeping the expected labels crisp. This leads to a different output-layer activation function and weight connections in the neural network (NN) structure obtained by mapping the FDT to an NN. A momentum term is introduced into the learning process to train the weight connections and avoid oscillation or divergence, and a new reasoning mechanism combines the constructed tree with the weights optimized during learning. The OWFDT is compared against two benchmark algorithms, Fuzzy ID3 and the weighted FDT. Six datasets ranging from material science to medical and civil engineering are introduced as case studies, involving classification of composite material failure mechanisms, classification of electrocorticography (ECoG)/electroencephalogram (EEG) signals, eye bacteria prediction and wave overtopping prediction.
    Different intelligent systems techniques were used to cluster the patterns and predict the classes, while OWFDT was used to design classifiers for all the datasets. For the material dataset, the Self-Organizing Map and Fuzzy C-Means were used to cluster the acoustic event signals and assign them to different failure mechanisms; OWFDT was then used to design a classifier for those signals. For the eye bacteria dataset, bootstrap aggregating (bagging) was used to improve the classification accuracy of Multilayer Perceptrons and Decision Trees; applied to Decision Trees, bagging also helped to select the most important sensors (features), reducing the dimensionality of the data. The most important features were used to grow the OWFDT, mitigating the curse of dimensionality. The last dataset, concerned with wave overtopping, was used to benchmark OWFDT against other intelligent systems techniques such as the Adaptive Neuro-Fuzzy Inference System (ANFIS), Evolving Fuzzy Neural Network (EFuNN), Genetic Neural Mathematical Method (GNMM) and Fuzzy ARTMAP. The analysis shows that patterns and classes can be found and classified by combining these techniques, and that OWFDT is efficient and effective compared with conventional and weighted fuzzy decision trees.
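The bootstrap aggregating (bagging) step described above can be sketched in a few lines: each base learner is trained on a bootstrap resample of the data and predictions are combined by majority vote. The base learner here (a nearest-class-mean classifier on 1-D data) and the synthetic data are assumptions for the example; the thesis itself bags Multilayer Perceptrons and Decision Trees.

```python
import random

# Minimal sketch of bootstrap aggregating (bagging): each member is trained
# on a bootstrap resample of the training set, and the ensemble predicts by
# majority vote. Base learner and data are illustrative stand-ins.

def nearest_mean_classifier(sample):
    """Fit class means on (x, label) pairs; predict the label of the closer mean."""
    means = {}
    for label in {y for _, y in sample}:
        xs = [x for x, y in sample if y == label]
        means[label] = sum(xs) / len(xs)
    return lambda x: min(means, key=lambda lab: abs(x - means[lab]))

def bagging_fit(data, n_members=15, seed=0):
    rng = random.Random(seed)
    members = []
    for _ in range(n_members):
        boot = [rng.choice(data) for _ in data]  # bootstrap resample, with replacement
        members.append(nearest_mean_classifier(boot))
    return members

def bagging_predict(members, x):
    votes = [clf(x) for clf in members]
    return max(set(votes), key=votes.count)  # majority vote

data = [(0.1, "A"), (0.2, "A"), (0.3, "A"), (0.7, "B"), (0.8, "B"), (0.9, "B")]
members = bagging_fit(data)
print(bagging_predict(members, 0.25), bagging_predict(members, 0.75))
```

Feature selection with bagged decision trees, as used for the eye bacteria dataset, would additionally count how often each feature is chosen for splits across the resampled trees; that bookkeeping is omitted here.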

    Environmental engineering applications of electronic nose systems based on MOX gas sensors

    The electronic nose (e-nose) has attracted considerable attention owing to its ability to detect and differentiate mixtures of various gases and odors using a limited number of sensors. Its applications in the environmental field include analysis of parameters for environmental control, process control, and verification of the efficiency of odor-control systems. The e-nose was developed by mimicking the olfactory system of mammals. This paper investigates e-noses and their sensors for the detection of environmental contaminants. Among the different types of chemical gas sensors, metal oxide semiconductor (MOX) sensors can detect volatile compounds in air at ppm and sub-ppm levels. The advantages and disadvantages of MOX sensors are addressed, together with solutions to the problems that arise in their application, and research in the field of environmental contamination monitoring is reviewed. These studies have shown e-noses to be suitable for most of the reported applications, especially when the tools were developed specifically for the application, e.g., in water and wastewater management facilities. The literature review also discusses aspects of the various applications and the development of effective solutions. The main limitation to wider use of e-noses as environmental monitoring tools, however, is their complexity and the lack of specific standards, which can be mitigated through appropriate data-processing methods.

    Improving building occupant comfort through a digital twin approach: A Bayesian network model and predictive maintenance method

    This study introduces a Bayesian network model to evaluate the comfort levels of occupants of two non-residential Norwegian buildings, based on data collected from satisfaction surveys and building performance parameters. A Digital Twin approach is proposed that integrates building information modeling (BIM) with real-time sensor data, occupant feedback, and a probabilistic model of occupant comfort to detect and predict HVAC issues that may impact comfort. The study also uses 200,000 historical data points from various sensors to characterize the past behavior of the building systems, and presents new methods for using BIM as a visualization platform and for predictive maintenance to identify and address problems in the HVAC system. For predictive maintenance, nine machine learning algorithms were evaluated using metrics such as ROC, accuracy, F1-score, precision, and recall; Extreme Gradient Boosting (XGB) was the best algorithm for prediction, being on average 2.5% more accurate than the Multi-Layer Perceptron (MLP) and up to 5% more accurate than the other models, while Random Forest is around 96% faster than XGBoost and relatively easier to implement. The paper introduces a novel method that draws on several standards to determine the remaining useful life of HVAC equipment, leading to a potential lifetime increase of at least 10% and significant cost savings. The results show that the most important factors affecting occupant comfort are poor air quality, lack of natural light, and uncomfortable temperature. To address the challenge of applying these methods to a wide range of buildings, the study proposes a framework using ontology graphs to integrate data from different systems, including FM, CMMS, BMS, and BIM.
    This study's results provide insight into the factors that influence occupant comfort, help expedite the identification of equipment malfunctions, and point towards potential solutions, leading to more sustainable and energy-efficient buildings.
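A discrete Bayesian-network comfort query of the kind described above can be sketched as follows. All variables, conditional probabilities, and priors below are hypothetical placeholders invented for illustration, not values from the study; the sketch only shows the mechanics of conditioning on evidence and enumerating unobserved parents.

```python
from itertools import product

# Minimal sketch of discrete Bayesian-network inference over occupant comfort.
# Priors and the conditional probability table (CPT) are hypothetical.
priors = {"poor_air": 0.3, "low_light": 0.4, "cold": 0.2}

def p_comfort(poor_air, low_light, cold):
    """CPT: probability the occupant is comfortable given the three parents."""
    p = 0.95
    if poor_air:  p -= 0.40   # air quality assumed to have the largest effect
    if low_light: p -= 0.25
    if cold:      p -= 0.20
    return max(p, 0.05)

def query_comfort(evidence):
    """P(comfortable | evidence), enumerating the unobserved parent variables."""
    total = 0.0
    prob_mass = 0.0
    free = [v for v in priors if v not in evidence]
    for values in product([False, True], repeat=len(free)):
        state = dict(evidence)
        weight = 1.0
        for var, val in zip(free, values):
            state[var] = val
            weight *= priors[var] if val else 1 - priors[var]
        prob_mass += weight
        total += weight * p_comfort(state["poor_air"], state["low_light"], state["cold"])
    return total / prob_mass

print(round(query_comfort({"poor_air": True}), 3))   # comfort drops with poor air
print(round(query_comfort({"poor_air": False}), 3))
```

With these made-up numbers, observing poor air quality alone drops the expected comfort probability from 0.81 to 0.41, mirroring the study's finding that air quality is the dominant factor.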

    Methods for increasing the sensing performance of metal oxide semiconductor gas sensors at ppb concentration levels

    Metal oxide semiconductor gas sensors are in general well suited for high-volume gas sensing applications, e.g. air quality monitoring, due to their low cost and high sensitivity. However, in many applications the gases to be detected occur in very low concentrations, which complicates the selective measurement of specific components. In this thesis, several methods are presented that improve the performance of such sensors for the detection of gases at trace concentrations. First, the design and characterization of a gas mixing system is described, which allows generation of test gases in the ppb (parts per billion) concentration range. Several well-established techniques are then tested for their applicability at these low concentration levels. Key elements are cyclic modulation of the sensor temperature and signal processing based on pattern recognition methods, for both single sensors and combined sensor signals. A novel development is an integrated micro-system in which gas pre-concentration is realized in combination with the sensor. In addition to the technical developments, the reproducibility of the results has been investigated in an inter-laboratory comparison, in which a measurement system for benzene was characterized in two different setups for test gas generation. The presented methods provide a basis for using low-cost metal oxide semiconductor gas sensors in potential applications in the field of trace gas analysis.
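Temperature-cycled operation as described above turns each modulation cycle into a feature vector for pattern recognition. A minimal sketch of one common scheme, splitting the cycle into segments and taking per-segment mean and slope, is shown below; the segment count, feature choice, and the synthetic resistance trace are assumptions for illustration, not the thesis's actual signal processing.

```python
# Minimal sketch of feature extraction from a temperature-cycled MOX sensor
# response: the resistance trace over one modulation cycle is split into
# equal segments, each contributing its mean value and average slope as
# features. Trace values and segment count are illustrative.

def cycle_features(trace, n_segments=4):
    """Return [mean, slope] features for each equal-length segment of the trace."""
    seg_len = len(trace) // n_segments
    features = []
    for s in range(n_segments):
        seg = trace[s * seg_len:(s + 1) * seg_len]
        mean = sum(seg) / len(seg)
        slope = (seg[-1] - seg[0]) / (len(seg) - 1)  # average change per sample
        features.extend([mean, slope])
    return features

# Synthetic resistance trace (arbitrary units) over one heating/cooling cycle.
trace = [10, 12, 14, 16, 15, 13, 11, 9, 8, 9, 11, 13]
print(cycle_features(trace))
```

Feature vectors of this kind, computed per cycle and per sensor, are what the pattern recognition stage (for single or combined sensor signals) would consume.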

    Improvement of MS-based e-nose performance by incorporation of chromatographic retention time as a new data dimension

    The importance of the sense of smell in nature and in human society is evident from the great interest shown in the analysis of odor and taste in the food industry. Although the food and beverage sectors are the most interested, the need for this technology has also been shown in other fields, such as cosmetics. Unfortunately, human sensory panels and canine panels are costly, prone to fatigue, subjective, unreliable and unsuitable for quantification, while laboratory analysis, despite its precision, impartiality and quantitative capability, is labor-intensive, requires specialized personnel and is time-consuming. These drawbacks generated great interest in the concept of artificial olfaction in industrial settings. The term "electronic nose" refers to an array of chemical gas sensors with broadly overlapping selectivities for the measurement of volatile compounds, combined with computerized data-analysis tools. The electronic nose is used to provide comparative rather than qualitative information in an analysis, and because interpretation can be automated, the device is suitable for quality control and analysis. Despite some promising achievements, solid-state gas sensors have not lived up to expectations: low sensitivity and selectivity, short sensor lifetime, difficult calibration and drift problems have proved to be serious limitations. In an effort to overcome these drawbacks, new approaches using different sensors for the electronic nose have been adopted; optical sensor systems, ion mobility spectrometry and infrared spectrometry are examples of techniques that have been tried.
    Electronic noses based on mass spectrometry (MS) first appeared in 1998 [B. Dittmann, S. Nitz and G. Horner. Adv. Food Sci. 20 (1998), p. 115] and represent a major leap in sensitivity, challenging the electronic nose based on chemical sensors. This new take on the electronic-nose concept uses virtual sensors in the form of m/z ratios. A complex and highly reproducible fingerprint is obtained in the form of a mass spectrum, which is processed by pattern recognition algorithms for classification and quantification. Although the MS-based electronic nose outperforms the classical solid-state-sensor electronic nose in many respects, its use is currently limited to benchtop laboratory instrumentation. The lack of portability will not necessarily remain a problem, given that miniature mass spectrometers have already been built at the prototype stage. A more critical drawback of the MS-based electronic nose lies in the way samples are analyzed: the simultaneous fragmentation of complex mixtures of isomers can produce very similar results under this approach. A better electronic nose would combine the sensitivity and identification power of the mass detector with the separation capability of gas chromatography. The main drawback of this approach is, again, the cost and lack of portability of the equipment.
    In addition to the above problems with mass spectrometry, gas chromatographic analysis requires long measurement times. To address these issues, miniaturizations of capillary gas chromatography (GC) have been reported that enable GC-on-a-chip, fast GC and flash GC, which use short columns to reduce analysis times to elution times of seconds and have, in some cases, been commercialized. The miniaturization of mass spectrometry and gas chromatography has great potential to improve the performance, utility and accessibility of the next generation of electronic noses. This thesis is devoted to the study and evaluation of the GC-MS approach for the electronic nose as a step prior to the development of the technologies mentioned above. The main objective is to study whether the retention time of a chromatographic separation can improve the performance of the MS-based electronic nose, showing that the addition of a third dimension brings more information that aids classification. This can be done in two ways: by comparing two-way analysis of the mass spectrometry data with two-way analysis of unfolded and concatenated matrices built from the three-way data, and by comparing two-way analysis of the mass spectrometry data with three-way analysis of the three-dimensional dataset. From the chromatographic point of view, the goal is to optimize the chromatographic method so as to reduce the analysis time to a minimum while still obtaining acceptable results. An important step in multi-way multivariate data analysis is data preprocessing; accordingly, a final objective is to determine which preprocessing techniques work best for two-way and three-way data analysis.
    To achieve these objectives, two datasets were created. The first consists of mixtures of nine dimethylphenol and ethylphenol isomers, chosen because their mass spectra are very similar to one another, so that the dataset would challenge an MS-based electronic nose. The solutions were also prepared taking into account the retention times of the nine isomers alone, so that the dataset would be equally challenging if retention time were used on its own. This "artificial" dataset therefore supports our hope of showing the improvement gained by using both dimensions: MS (mass spectra) and GC (retention time). Twenty classes, representing solutions of the nine isomers, were each measured in ten replicates with three chromatographic methods, for a total of 600 measurements. The chromatographic methods were designed to give a fully resolved chromatogram, a coeluted peak, and an intermediate situation with a partially resolved chromatogram. The data were recorded in a three-way array with the directions (measured samples) x (m/z ratio) x (retention time). By "collapsing" the chromatographic retention-time axis and the m/z-fragment axis, respectively, two matrices were obtained representing the regular mass spectrum and the total ion chromatogram.
    These approaches discard the information carried by the third dimension, so unfolding of the original 3D matrix and concatenation of the TIC with the mean mass spectrum were considered as ways of preserving the additional third-dimension information in a two-dimensional matrix. The data were pretreated by peak alignment, mean centering, and normalization by maximum peak height and by peak area; these preprocessing tools were also evaluated for their performance. For two-way data analysis, PCA, PLS-DA and fuzzy ARTMAP were used. Clustering by PCA and PARAFAC was evaluated by the ratio of between-class to within-class variance, while fuzzy ARTMAP results were reported as classification success rates in percent. When PCA and PARAFAC were used, as expected, the resolved chromatographic method (method 1) gave the best overall results, with the 2D algorithms performing best, whereas in a more complicated case (the more heavily coeluted peaks of method 3) they lost effectiveness against the 3D methods. For PLS-DA and n-PLS, although the results are not as conclusive as those of PCA and PARAFAC, the differences being minimal, the multi-way PLS-DA model offers a better prediction success rate on both datasets. n-PLS is also recommended over the unfolded and concatenated data, since it builds a more parsimonious model. For the fuzzy ARTMAP analysis, the voting strategy employed showed that using the mean mass spectra together with the total-ion-chromatogram information gives more consistent results. The second dataset addresses the problem of adulteration of extra-virgin olive oil with hazelnut oil, which, owing to the similarity between the two oils, is one of the hardest adulterations to detect.
    Four extra-virgin olive oils and two hazelnut oils were measured pure and in mixtures at 30%, 10%, 5% and 2%, with the same objectives, showing that the added dimension improves the results. Five replicates were made of each preparation, giving 190 samples in total: 4 pure olive oils, 2 pure hazelnut oils and 32 adulterations of hazelnut oil in olive oil, for a total of 38 classes. Two chromatographic methods were used: the first aimed at complete separation of the olive oil components and employed a temperature-programmed separation, while the second aimed at a coeluted peak and therefore used a constant-temperature separation. The data were analyzed by PCA, PARAFAC, PLS-DA and n-PLS. As with the "artificial" dataset, PCA and PARAFAC were assessed by clustering ability, which showed that the best results were obtained with the unfolded data, followed by the 3D data treated with PARAFAC. From the point of view of column optimization, the performance obtained with the short column falls below that of the long-column approach, but this case demonstrates once again that adding the third dimension improves the MS-based electronic nose. For PLS-DA and n-PLS, success rates were compared for both the long and the short chromatographic runs. While for the long column the best performance is obtained with the total-ion-chromatogram (TIC) data, the short column performs best with the concatenated mean-mass-spectrum and TIC data. Moreover, the prediction success rates for the long-column TIC data are the same as those for the short-column concatenated data.
    This case is particularly interesting because it shows that the PLS approach to the third dimension improves the results and, furthermore, that by using the short column the analysis time is shortened considerably. Much is expected of the electronic nose, but so far none of these approaches has come close enough to produce a positive response in the markets. Solid-state sensors have drawbacks that are almost impossible to overcome; the MS-based electronic nose lacks portability and its performance is sometimes insufficient; and the GC-MS instrument suffers from the same portability problems as the mass spectrometer and is time-consuming. The development of powerful mathematical algorithms in recent years, together with advances in miniaturization for both MS and GC and in fast chromatography, offers some hope of a much better electronic nose. From this work we can state that adding chromatographic retention time as an extra dimension provides an advantage over current electronic-nose technologies. While for fully resolved chromatograms no improvement, or only a minimal gain, is achieved, especially in prediction, for a short column the additional information improves the results, in some cases making them as good as those obtained with a long column. This is very important, because GC-MS measurements can then be optimized for very short runs, a key feature for an electronic nose, enabling the design of a higher-throughput instrument suitable for quality control on product lines.
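The three ways of presenting the (samples x m/z x retention-time) data cube to two-way algorithms can be sketched concretely: collapsing retention time gives the mean mass spectrum, collapsing m/z gives the total ion chromatogram (TIC), and unfolding or concatenation preserves third-dimension information in a single vector. The array sizes and intensity values below are illustrative, not from the thesis data.

```python
# Minimal sketch of collapsing, unfolding, and concatenating one sample of a
# (m/z x retention-time) data matrix, as used to feed two-way algorithms.

def mean_mass_spectrum(sample):
    """Collapse retention time: one averaged intensity per m/z channel."""
    return [sum(rt_trace) / len(rt_trace) for rt_trace in sample]

def total_ion_chromatogram(sample):
    """Collapse m/z: total intensity at each retention-time point."""
    n_rt = len(sample[0])
    return [sum(sample[mz][t] for mz in range(len(sample))) for t in range(n_rt)]

def unfold(sample):
    """Unfold the (m/z x retention-time) matrix row-wise into one long vector."""
    return [v for rt_trace in sample for v in rt_trace]

# One measured sample: 3 m/z channels x 4 retention-time points (illustrative).
sample = [
    [0, 2, 4, 0],   # m/z channel 1
    [1, 3, 1, 0],   # m/z channel 2
    [0, 0, 2, 6],   # m/z channel 3
]
concatenated = mean_mass_spectrum(sample) + total_ion_chromatogram(sample)
print(mean_mass_spectrum(sample))      # [1.5, 1.25, 2.0]
print(total_ion_chromatogram(sample))  # [1, 5, 7, 6]
print(len(unfold(sample)), len(concatenated))  # 12 7
```

The unfolded vector keeps every (m/z, retention-time) cell, while the concatenated vector is much shorter but, as the thesis shows, still retains useful third-dimension information for the two-way models.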

    Advanced solutions for the abatement of VOCs and odours

    In recent decades, atmospheric pollution has become an increasingly alarming problem due to its adverse effects at the global, regional and local scales. In this context, the emissions of greenhouse gases (GHGs), volatile organic compounds (VOCs) and odours from chemical manufacturing plants, the petrochemical sector and other hazardous sources pose a major challenge. Global warming, driven by increased GHG levels in the atmosphere, has been identified as one of the key challenges of this century, and its impacts have caused severe damage to human and environmental ecosystems. VOCs are among the priority gaseous organic contaminants, with BTEX identified as among the most dangerous for human health. They are also considered responsible for photochemical pollution, as a result of their reaction in the atmosphere with nitrogen oxides in the presence of solar radiation, and their tendency to volatilize readily leads to problems of odour annoyance. These aspects have triggered the enforcement of stricter regulations and, consequently, increased the need to manage atmospheric emissions properly. The conventional chemical-physical processes mainly used to treat these emissions transfer the contaminants to other phases, which then require further treatment. Biological processes and Advanced Oxidation Processes (AOPs), by contrast, can support the degradation and mineralization of organic compounds, resulting in more effective solutions. Furthermore, AOPs applied as pretreatments to biological processes may improve VOC biotreatability and control the accumulation of biomass.
    Moreover, since the biological treatment of high VOC concentrations may limit the oxygen available for aerobic degradation, owing to oxygen's low water solubility, the synergistic activity of microalgae and bacteria represents an efficient alternative to support the simultaneous abatement of CO2 and VOCs. In algal-bacterial photobioreactors, microalgae produce oxygen photosynthetically in the presence of light and CO2, while heterotrophic bacteria use the additional O2 supply to accelerate the oxidation of organic compounds; in turn, the CO2 released by mineralization is fixed by the microalgae. The mechanisms underlying microalgal activity may not only prevent oxygen limitation but also enhance the biodegradability of the target VOC. The research activity discussed in the present work is framed in this context and aims at: the comparative evaluation of UV-assisted ozonation and its combination with conventional processes under different operating conditions; the comparative evaluation of two different biological reactors and the assessment of their continuous toluene degradation performance under different operating conditions; and the scale-up of the proposed systems and the assessment of their technical feasibility. To this end, the experimental activity was structured in two main steps: the first focused on the effectiveness of ozone and photolysis in promoting toluene degradation; the second on enhanced biological processes for the continuous removal of gaseous toluene. The first part of the research, a comparative assessment of different AOP system configurations, was performed at the Sanitary Environmental Engineering Division (SEED) of Salerno University, with toluene as the target compound.
    A lab-scale UV/O3 reactor was investigated for the degradation of VOC emissions under different operating conditions, in order to highlight the influence of the inlet concentrations and the ozone dosages. A novel configuration with an additional scrubbing stage is proposed and assessed to improve the removal efficiency and to prevent the release of polluting intermediates from the single-step process. The combined system delivered higher performance and stability than the stand-alone UV/O3 process, along with greater economic and environmental sustainability. In the second phase, the experimental activity was performed at the Department of Chemical Engineering and Environmental Technology of Valladolid University, with the aim of evaluating and systematically comparing the continuous toluene degradation performance of two biological reactors: a conventional bacterial Biotrickling Filter (BTF) and an innovative Tubular Photo-BioReactor (TPBR). Different operating conditions were investigated, varying the Empty Bed Residence Time (EBRT) and the toluene inlet concentration to gradually increase the Inlet Load (IL) entering the systems. Toluene mass-transfer tests were carried out to determine the limiting stage, and a final robustness test was performed to assess the capacity of the systems to cope with inlet load fluctuations. The results demonstrated the potential of the synergistic effects between bacteria and microalgae: the higher dissolved oxygen concentrations ensured oxygen availability for the microbial community and improved process performance, while the carbon dioxide released by mineralization was used for valuable biomass production. Conventional processes with AOP pretreatment and inoculation of a microalgae-bacteria consortium thus represent innovative and promising methods for increasing treatment efficiency, valorizing biomass and reducing GHGs. The combination of conventional and advanced processes represents a sustainable platform to reduce the emission of undesirable byproducts while treating high concentrations of VOCs.
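The operating parameters named above (EBRT, Inlet Load) and the usual companion metrics (removal efficiency, elimination capacity) follow standard biofiltration definitions. A small sketch computing them is given below; the definitions are the conventional ones, but the operating point (bed volume, flow, concentrations) is invented for illustration and does not come from the thesis.

```python
# Standard bioreactor performance metrics from gas flow rate, bed volume, and
# inlet/outlet pollutant concentrations. Numbers are illustrative.

def ebrt_seconds(bed_volume_m3, flow_m3_per_h):
    """Empty Bed Residence Time: bed volume divided by gas flow rate."""
    return bed_volume_m3 / flow_m3_per_h * 3600

def inlet_load(flow_m3_per_h, c_in_g_per_m3, bed_volume_m3):
    """Inlet Load in g of pollutant per m^3 of bed per hour."""
    return flow_m3_per_h * c_in_g_per_m3 / bed_volume_m3

def removal_efficiency(c_in, c_out):
    """Removal Efficiency in percent."""
    return (c_in - c_out) / c_in * 100

def elimination_capacity(flow_m3_per_h, c_in, c_out, bed_volume_m3):
    """Elimination Capacity in g per m^3 of bed per hour."""
    return flow_m3_per_h * (c_in - c_out) / bed_volume_m3

# Illustrative operating point: 4 L bed, 12 L/min gas flow, toluene in/out.
V = 0.004                # m^3
Q = 0.72                 # m^3/h (12 L/min)
c_in, c_out = 1.5, 0.3   # g/m^3
print(round(ebrt_seconds(V, Q), 1))            # 20.0 s
print(round(inlet_load(Q, c_in, V), 1))        # 270.0 g m^-3 h^-1
print(round(removal_efficiency(c_in, c_out)))  # 80 %
print(round(elimination_capacity(Q, c_in, c_out, V), 1))  # 216.0 g m^-3 h^-1
```

Varying Q (and hence EBRT) and c_in while tracking removal efficiency and elimination capacity is exactly how inlet load studies like the one described are reported.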

    Preprints / 2nd IFAC Workshop on Computer Software Structures Integrating AI/KBS Systems in Process Control, August 10-12, 1994, Lund, Sweden
