7,824 research outputs found

    ADAPTS: An Intelligent Sustainable Conceptual Framework for Engineering Projects

    This paper presents a conceptual framework for optimizing the environmental sustainability of engineering projects, covering both products and industrial facilities or processes. The main objective of this work is to propose a conceptual framework that helps researchers approach the optimization of engineering projects under sustainability criteria, making use of current Machine Learning techniques. To develop the framework, a bibliographic search was carried out on the Web of Science; the selected documents were then analyzed through a hermeneutic procedure, from which the conceptual framework was constructed. A pyramid-shaped graphic representation is shown to clearly define the variables of the proposed conceptual framework and their relationships. The framework consists of five dimensions, giving it the acronym ADAPTS. At the base are: (1) the Application for which it is intended, (2) the available DAta, (3) the APproach under which it is operated, and (4) the machine learning Tool used. At the top of the pyramid is (5) the necessary Sensing. A case study is proposed to show its applicability. This work is part of a broader line of research on optimization under sustainability criteria. Supported by the Telefónica Chair “Intelligence in Networks” of the University of Seville (Spain).

    Ensemble model-based method for time series sensors’ data validation and imputation applied to a real waste water treatment plant

    Intelligent Decision Support Systems (IDSSs) integrate different Artificial Intelligence (AI) techniques with the aim of taking or supporting human-like decisions. To this end, these techniques rely on the data available from the target process, which implies that invalid or missing data could trigger incorrect decisions and, therefore, undesirable situations in the supervised process. This is even more important in environmental systems, where a malfunction could jeopardise related ecosystems. In data-driven applications such as IDSSs, data quality is a fundamental problem that must be addressed for the sake of overall system performance. In this paper, a data validation and imputation methodology for time series is presented. This methodology is integrated in an IDSS software tool that generates suitable set-points to control the process. The data validation and imputation approach presented here focuses on the imputation step and is based on an ensemble of different prediction models obtained for the sensors involved in the process. A Case-Based Reasoning (CBR) approach is used for data imputation, i.e., past situations similar to the current one propose new values for the missing ones. The CBR model is complemented with other prediction models, such as Auto-Regressive (AR) models and Artificial Neural Network (ANN) models. The different predictions obtained are then ensembled to achieve better prediction performance than any individual prediction model achieves separately. Furthermore, the use of a meta-prediction model, trained using the predictions of all individual models as inputs, is proposed and compared with other ensemble methods to validate its performance.
    Finally, this approach is illustrated in a real Waste Water Treatment Plant (WWTP) case study using one of the most relevant measurements for the correct operation of the WWTP's IDSS, the ammonia sensor, and considering real faults. The results are promising, with improved performance of the ensemble approach presented here compared with the predictions obtained by each individual model separately. The authors acknowledge the partial support of this work by the Industrial Doctorate Programme (2017DI-006) and the Research Consolidated Groups/Centres Grant (2017 SGR 574) from the Catalan Agency of University and Research Grants Management (AGAUR) of the Catalan Government. Peer reviewed. Postprint (published version).
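    The stacked-ensemble idea in this abstract can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: it uses a synthetic ammonia-like signal (the WWTP data is not included here), a least-squares autoregression as a stand-in for the AR model, a nearest-neighbour lookup as a stand-in for CBR, and a linear meta-model fitted on the base predictions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ammonia-like sensor signal (stand-in for the real WWTP data).
t = np.arange(500)
signal = 20 + 5 * np.sin(2 * np.pi * t / 96) + rng.normal(0, 0.5, t.size)

def ar_predict(history, order=3):
    """Stand-in for an AR model: least-squares fit on lagged values."""
    X = np.array([history[i:i + order] for i in range(len(history) - order)])
    y = history[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(history[-order:] @ coef)

def cbr_predict(history, window=5):
    """Stand-in for CBR: find the most similar past window, reuse its successor."""
    query = history[-window:]
    best_i, best_d = 0, np.inf
    for i in range(len(history) - window - 1):
        d = np.linalg.norm(history[i:i + window] - query)
        if d < best_d:
            best_i, best_d = i, d
    return float(history[best_i + window])

# Build a meta-training set: base-model predictions paired with true values.
meta_X, meta_y = [], []
for k in range(200, 400):
    hist = signal[:k]
    meta_X.append([ar_predict(hist), cbr_predict(hist)])
    meta_y.append(signal[k])
meta_X, meta_y = np.array(meta_X), np.array(meta_y)

# Meta-model: linear combination of base predictions (plus intercept).
A = np.column_stack([meta_X, np.ones(len(meta_X))])
w, *_ = np.linalg.lstsq(A, meta_y, rcond=None)

def ensemble_predict(history):
    """Combine the base predictions through the learned meta-model."""
    p = np.array([ar_predict(history), cbr_predict(history), 1.0])
    return float(p @ w)
```

    In the paper's setting the meta-model is itself a trained predictor compared against other ensembling schemes; a plain linear stacker is used here only to keep the sketch self-contained.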

    On the role of pre and post-processing in environmental data mining

    The quality of discovered knowledge depends heavily on data quality. Unfortunately, real data tend to contain noise, uncertainty, errors, redundancies, or even irrelevant information. The more complex the reality to be analyzed, the higher the risk of obtaining low-quality data. Knowledge Discovery from Databases (KDD) offers a global framework for preparing data in the right form to perform correct analyses. On the other hand, the quality of decisions taken upon KDD results depends not only on the quality of the results themselves, but also on the capacity of the system to communicate those results in an understandable form. Environmental systems are particularly complex, and environmental users particularly require clarity in their results. This paper provides some details about how this can be achieved, and discusses the role of pre- and post-processing in the whole process of Knowledge Discovery in environmental systems.
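    A typical pre-processing step of the kind this abstract refers to is validating sensor readings against a plausible range and imputing the invalid ones. The sketch below is a generic illustration (the thresholds and pH-like values are invented, not from the paper), using linear interpolation between the nearest valid neighbours.

```python
import numpy as np

def preprocess(series, lo, hi):
    """Flag out-of-range or missing readings as invalid, then impute them
    by linear interpolation between the nearest valid neighbours."""
    x = np.asarray(series, dtype=float)
    valid = ~np.isnan(x)
    valid[valid] = (x[valid] >= lo) & (x[valid] <= hi)
    idx = np.arange(x.size)
    x[~valid] = np.interp(idx[~valid], idx[valid], x[valid])
    return x

# Hypothetical pH series with a sentinel value (-999) and a missing reading.
clean = preprocess([7.1, 7.3, -999.0, 7.2, float("nan"), 7.4], lo=0.0, hi=14.0)
# → [7.1, 7.3, 7.25, 7.2, 7.3, 7.4]
```

    Real pipelines would also handle edge cases such as invalid values at the start or end of the series, where interpolation has no neighbour on one side.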

    Predicting water quality and ecological responses

    Changes to climate are predicted to affect freshwater streams. Stream flows are likely to change, with implications for freshwater ecosystems and water quality. Other stressors such as population growth, community preferences and management policies can be expected to interact in various ways with climate change and stream flows, and outcomes for freshwater ecosystems and water quality are uncertain. Managers of freshwater ecosystems and water supplies could benefit from being able to predict the scale of likely changes. This project has developed and applied a linked modelling framework to assess climate change impacts on water quality regimes and ecological responses. The framework is designed to inform water planning and climate adaptation activities. It integrates quantitative tools and predicts relationships between future climate, human activities, water quality and ecology, thereby filling a gap left by the considerable research effort so far invested in predicting stream flows. The modelling framework allows managers to explore potential changes in the water quality and ecology of freshwater systems in response to plausible scenarios for climate change and management adaptations. Although set up for the Upper Murrumbidgee River catchment in southern NSW and the ACT, the framework was planned to be transferable to other regions where suitable data are available, and the approach and learning from the project appear to have the potential to be broadly applicable. We selected six climate scenarios representing minor, moderate and major changes in flow characteristics for 1°C and 2°C temperature increases. These were combined with four plausible alternative management adaptations that might be used to modify water supply, urban water demand and stream flow regimes in the Upper Murrumbidgee catchment. The Bayesian Network (BN) model structure we used was developed using both a ‘top down’ and a ‘bottom up’ approach.
    From analyses combined with expert advice, we identified the causal structure linking climate variables to stream flow, water quality attributes, land management and ecological responses (top down). The ‘bottom up’ approach focused on key ecological outcomes and key drivers, and helped produce efficient models. The result was six models for macroinvertebrates and one for fish. In the macroinvertebrate BN models, nodes were discretised using statistically and empirically derived thresholds based on new techniques. The framework made it possible to explore how ecological communities respond to changes in climate and management activities; in particular, we focused on the effects of water quality and quantity on ecological responses. The models showed a strong regional response reflecting differences across 18 regions in the catchment. In two regions the management alternatives were predicted to have stronger effects than climate change; in three other regions the predicted response to climate change was stronger. Analyses of water quality suggested minor changes in the probability of water quality exceeding thresholds designed to protect aquatic ecosystems. The ‘bottom up’ approach limited the framework’s transferability by being specific to the Upper Murrumbidgee catchment data; indeed, to meet stakeholder questions, models need to be specifically tailored. The report therefore proposes a general model-building framework for transferring the approach, rather than the models, to other regions. Please cite this report as: Dyer, F, El Sawah, S, Lucena-Moya, P, Harrison, E, Croke, B, Tschierschke, A, Griffiths, R, Brawata, R, Kath, J, Reynoldson, T, Jakeman, T 2013, Predicting water quality and ecological responses, National Climate Change Adaptation Research Facility, Gold Coast, pp. 110.
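    Discretising continuous variables into node states by thresholds, as done for the macroinvertebrate BN models, amounts to binning each reading against a set of cut points. The sketch below is illustrative only: the variable names, thresholds and state labels are invented, since the report derives its thresholds statistically per node.

```python
import numpy as np

# Hypothetical cut points; the report derives these statistically per node.
thresholds = {
    "dissolved_oxygen_mgL": [4.0, 8.0],   # poor / fair / good
    "turbidity_NTU": [10.0, 50.0],        # low / moderate / high
}

def discretise(values, cuts, labels):
    """Map continuous readings to discrete BN node states via thresholds."""
    idx = np.digitize(values, cuts)  # bin index 0..len(cuts)
    return [labels[i] for i in idx]

states = discretise([3.1, 6.5, 9.2],
                    thresholds["dissolved_oxygen_mgL"],
                    ["poor", "fair", "good"])
# → ['poor', 'fair', 'good']
```

    Once every node is discrete, each BN node carries a conditional probability table over these states, which is what makes the scenario queries in the report tractable.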

    Operational Risk Assessment of Routing Flare Gas to Boiler for Cogeneration

    Flaring is a controlled combustion process in which unwanted or excess hydrocarbon gases are released to a flare stack for disposal. Flaring has a significant impact on the environment, energy and the economy. Integrating flare gas into a cogeneration plant is an alternative that mitigates flaring by utilizing waste flare gas as a supplemental fuel for boilers and/or gas turbines. Earlier studies have shown the energy and economic sustainability of such integration; however, the impact of flare gas quality on cogeneration plants is yet to be identified. This paper studies the effect of the composition and temperature of flare gas from an ethylene plant on an existing boiler during abnormal flaring. The study proposes a unique framework that identifies the process hazards associated with variation in fuel conditions through process simulation and sensitivity analysis. A systematic approach is then used to evaluate critical operational event occurrences and their impacts through scenario development and quantitative risk assessment, comparing a base-case natural gas fuel with a variable flare gas fuel. An important outcome of this study is the identification, through process simulation, of the critical fuel stream parameters affecting fired-boiler operation. Flare stream temperature and the presence of higher-molecular-weight hydrocarbons in flare streams showed minimal effect on boiler condition. However, hydrogen content and a rich fuel-air ratio in the boiler can affect the boiler operating conditions. An increase in the hydrogen content of the flare-to-fuel system can enlarge the risk contour of the cogeneration plant, affecting the boiler gas temperature, combustion mixture and flame stability inside the firebox. Quantitative risk analysis through a Bayesian Network showed a significant risk escalation.
    With 12 hours of flaring per year, there is a substantial rise in the probability of the boiler gas temperature exceeding its design limit and of a rich fuel mixture in the firebox for medium- and high-hydrogen-content flare gas. The influence of these events on flame impingement and tube rupture incidents is noteworthy for high-hydrogen-content gas. The study also observed a reduction in operational time as the hydrogen content in the flare gas increased from low to high. Finally, to operate a fire-tube steam boiler with flare gas containing a higher amount of hydrogen, the existing cogeneration system needs to update its preventive safeguards to reduce the probability of a loss-of-control event.
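    The way a Bayesian Network propagates risk from a fuel-quality node to an incident node can be illustrated with a minimal two-node sketch. Every probability below is invented purely for illustration; none comes from the study.

```python
# Minimal two-node Bayesian Network sketch: hydrogen content -> tube rupture.
# All numbers are hypothetical, chosen only to show how the network works.

# Prior over flare gas hydrogen content.
p_h2 = {"low": 0.6, "medium": 0.3, "high": 0.1}

# Conditional probability of a tube rupture event given hydrogen content.
p_rupture_given_h2 = {"low": 0.001, "medium": 0.005, "high": 0.02}

# Marginal probability of rupture: sum over the parent node's states.
p_rupture = sum(p_h2[s] * p_rupture_given_h2[s] for s in p_h2)
# 0.6*0.001 + 0.3*0.005 + 0.1*0.02 = 0.0041

# Posterior over hydrogen content given that a rupture occurred (Bayes' rule),
# i.e. the diagnostic direction of inference a BN also supports.
posterior = {s: p_h2[s] * p_rupture_given_h2[s] / p_rupture for s in p_h2}
```

    The study's network is of course far larger, with nodes for boiler gas temperature, combustion mixture and flame stability, but the propagation arithmetic at each node is the same.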


    Biological investigation and predictive modelling of foaming in anaerobic digester

    Anaerobic digestion (AD) of waste has been identified as a leading technology for greener renewable energy generation as an alternative to fossil fuels. AD reduces waste through biochemical processes, converting it to biogas, which can be used as a source of renewable energy, while the residual bio-solids can be utilised to enrich soil. A problem with AD, however, is foaming and the associated biogas loss. Tackling this problem effectively requires identifying and effectively controlling the factors that trigger and promote foaming. In this research, laboratory experiments were first carried out to differentiate the causal and exacerbating factors of foaming. The impact of the identified causal factors (organic loading rate, OLR, and volatile fatty acids, VFA) on foaming occurrence was then monitored and recorded. Further analysis of foaming and non-foaming sludge samples by metabolomics techniques confirmed that OLR and VFA are the prime causes of foaming in AD. In addition, metagenomics analysis showed that the phyla Bacteroidetes and Proteobacteria were predominant, with relative abundances of 30% and 29% respectively, while the phylum Actinobacteria, which includes the most prominent filamentous foam-causing bacteria such as Nocardia amarae and Microthrix parvicella, had a very low and consistent relative abundance of 0.9%, indicating that foaming in the AD studied was not triggered by the presence of filamentous bacteria. Consequently, data-driven models to predict foam formation were developed from the experimental data, with OLR and VFA in the feed as inputs and foaming occurrence as output. The models were extensively validated and assessed using the mean squared error (MSE), root mean squared error (RMSE), R² and mean absolute error (MAE). A Levenberg-Marquardt neural network model proved to be the best model for foaming prediction in AD, with RMSE = 5.49, MSE = 30.19 and R² = 0.9435.
    The significance of this study is the development of a parsimonious and effective modelling tool that enables AD operators to proactively avert foaming, since the two model input variables (OLR and VFA) can be easily adjusted through a simple programmable logic controller.
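    The validation metrics quoted above (MSE, RMSE, MAE and R²) have standard definitions, sketched below on invented toy values; the numbers are not the study's data.

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    """Compute MSE, RMSE, MAE and R^2, the metrics used for model validation."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_true - y_pred
    mse = float(np.mean(err ** 2))
    rmse = float(np.sqrt(mse))
    mae = float(np.mean(np.abs(err)))
    ss_res = float(np.sum(err ** 2))                      # residual sum of squares
    ss_tot = float(np.sum((y_true - y_true.mean()) ** 2))  # total sum of squares
    r2 = 1.0 - ss_res / ss_tot
    return {"MSE": mse, "RMSE": rmse, "MAE": mae, "R2": r2}

# Toy targets/predictions, purely illustrative.
metrics = regression_metrics([10, 20, 30, 40], [12, 18, 33, 39])
# → {'MSE': 4.5, 'RMSE': 2.121..., 'MAE': 2.0, 'R2': 0.964}
```

    Note that RMSE is simply the square root of MSE, so reporting both (as the abstract does) is redundant but common practice.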

    Quantified Risk and Uncertainty Analysis

    The legal requirement in the UK for the duty holder of a chemical process plant to demonstrate that risk is as low as reasonably practicable (ALARP) means that quantified risk assessments (QRAs) must be accurate and robust and that identified risks are adequately mitigated. Bayesian belief networks (BBNs) are an emerging technique which can be used to determine the likelihood of an event in support of the QRA process. It is a statistical method involving estimating the probability distribution for a given hypothesis. The most interesting features which distinguish this QRA technique from all the others are:
    • it can analyse complex systems with any given number of variables and their dependencies within a single analysis;
    • it can analyse parameters over a range of probability values for any given set of conditions, providing a better understanding in terms of sensitivity analysis;
    • it engages expert judgement and learning from previous events to update the probability distribution, thus improving QRA accuracy; and
    • it is not restricted to fault analysis and can be used to support plant operational decision making using a quantified approach.
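    The third feature listed, combining expert judgement with learning from previous events, can be sketched with a conjugate Beta-Binomial update: the expert's estimate sets the prior, and observed plant history updates it. All numbers below are hypothetical.

```python
# Beta-Binomial updating sketch: an expert prior on an event probability is
# updated with observed plant history (hypothetical numbers throughout).

# Expert judgement encoded as a Beta(2, 198) prior: mean event probability
# 2 / (2 + 198) = 0.01 per demand.
alpha, beta = 2.0, 198.0

# Observed operating history: 3 events in 400 demands.
events, demands = 3, 400

# Conjugate update: add events to alpha, non-events to beta.
alpha_post = alpha + events
beta_post = beta + (demands - events)

posterior_mean = alpha_post / (alpha_post + beta_post)
# Prior mean was 0.01; posterior mean is 5/600 ≈ 0.0083.
```

    A full BBN generalises this idea: each node's conditional probability table can be elicited from experts and then revised as evidence from the plant accumulates.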
