
    Environmental Research Newsletter June 1991 No. 7


    Comprehensive quantitative dynamic accident modelling framework for chemical plants

    This thesis introduces a comprehensive accident modelling approach that considers the hazards associated with process plants, including those originating from the process itself, human factors (including management and organizational errors), natural-event hazards, and intentional and security hazards, within a risk assessment framework. The model is based on a series of plant protection systems: the release, dispersion, ignition, toxicity, escalation, and damage control and emergency management prevention barriers. These six prevention barriers are arranged according to a typical accident propagation path. Based on the successes and failures of these barriers, a spectrum of consequences is generated, each carrying a unique probability of occurrence determined using event tree analysis. To support this computation, the probability of failure of each prevention barrier is computed using fault tree analysis, drawing on reliability data from established databases. Where reliability data are lacking, expert judgment is used, and evidence theory is applied to aggregate the experts' opinions, which may be conflicting. The modelling framework also provides two important features: (i) the capability to dynamically update the failure probabilities of the prevention barriers based on precursor data, and (ii) the prediction of future events. The first is achieved using Bayesian theory, while for the second, a Bayesian-grey model emerged as the most promising strategy, with an overall mean absolute percentage error of 18.07% across three case studies, compared to 31.4% for the Poisson model, 22.37% for the first-order grey model, and 22.4% for the second-order grey model. The results illustrate the potential of the proposed modelling strategy for anticipating failures, identifying the location of failures, and predicting future events. These insights are important in planning targeted plant maintenance and management of change, in addition to facilitating the implementation of standard operating procedures in a process plant.
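
    The barrier-updating and event-tree steps described above can be illustrated with a minimal sketch, assuming a conjugate Beta-Binomial form for each barrier's failure probability and a simple event-tree combination of barrier outcomes; the barrier subset, prior parameters, and precursor counts below are hypothetical placeholders, not values from the thesis.

    ```python
    from itertools import product

    # Illustrative priors Beta(alpha, beta) and precursor observations
    # (failures, demands) for a few prevention barriers -- assumed values only.
    barriers = {
        "release":    {"prior": (1.0, 19.0), "precursor": (2, 50)},
        "dispersion": {"prior": (1.0, 9.0),  "precursor": (1, 40)},
        "ignition":   {"prior": (1.0, 4.0),  "precursor": (3, 30)},
    }

    def posterior_failure_prob(prior, precursor):
        """Beta-Binomial update: posterior mean of the barrier failure probability."""
        a, b = prior
        failures, demands = precursor
        return (a + failures) / (a + b + demands)

    p_fail = {name: posterior_failure_prob(d["prior"], d["precursor"])
              for name, d in barriers.items()}

    # Simple event tree: each barrier either works or fails; every branch
    # through the barrier sequence is one consequence with its own probability.
    names = list(p_fail)
    for outcome in product([True, False], repeat=len(names)):
        prob = 1.0
        for name, failed in zip(names, outcome):
            prob *= p_fail[name] if failed else (1.0 - p_fail[name])
        label = ", ".join(f"{n}:{'fail' if f else 'ok'}" for n, f in zip(names, outcome))
        print(f"{label}  ->  P = {prob:.4f}")
    ```

    Re-running the posterior update as new precursor counts arrive, and feeding the updated probabilities back into the event tree, is the dynamic-updating behaviour the abstract highlights.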

    A Modeling, Optimization, and Analysis Framework for Designing Multi-Product Lignocellulosic Biorefineries

    The objective of this research is to propose a methodology for developing modular decision analysis frameworks to design value chains for enterprises in the renewable fuels and chemicals sector. The framework focuses on providing strategic decision support to startups and new product ventures. The tasks embedded in the framework include process and systems design, technology and product selection, forecasting of cost and market variables, design of network capacities, and analysis of risks. The proposed decision support system (DSS) is based on optimization modeling; systems design is carried out using integer programming, with multiple sets of process and network configurations used as inputs. Uncertainty is incorporated using real options, which are used to design network processing capacity for the conversion of biomass resources. Risk analysis is carried out using Monte Carlo methods. The DSS framework is exemplified using a lignocellulosic biorefinery case study assumed to be located in Louisiana. The biorefinery uses energy crops as feedstocks and processes them into cellulosic biofuels and biobased chemicals. Optimization modeling is used to select an optimal network, a fractionation technology, a fermentation configuration, and optimal product recovery and purification unit operations. A decision tree is then used to design incremental capacity under uncertain market parameters. The valuation methodology stresses flexibility in decision making in the face of market uncertainties, as is the case with renewable fuels and chemicals. The value of flexibility, termed the "option value", is shown to significantly improve the net present value of the proposed biorefinery. Monte Carlo simulations are used to develop risk curves for alternative capacity design plans; these curves show a favorable risk-reward ratio for the case of incremental capacity design with embedded decision options. The framework can be used by enterprises, government entities, and decision makers in general to test, validate, and design technological superstructures and network processing capacities, conduct scenario analyses, and quantify the financial impacts and risks of their representative designs. We plan to add further functionality to the DSS framework and to make the tools developed available to a wide audience through an open-source software distribution model.
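
    The Monte Carlo risk-analysis step can be sketched roughly as follows, assuming lognormally distributed fuel prices and placeholder capital cost, operating cost, output, and discount-rate figures that are not taken from the case study; the sketch generates a risk curve (probability of the net present value falling below a target) for one candidate capacity plan.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_scenarios = 10_000
    years = 10
    discount_rate = 0.10          # assumed
    capital_cost = 150e6          # assumed capital cost of the capacity plan, $
    annual_output = 40e6          # assumed gallons of biofuel per year
    operating_cost = 1.8          # assumed operating cost, $/gal

    # Uncertain market variable: fuel price ($/gal) drawn from an assumed lognormal.
    price = rng.lognormal(mean=np.log(2.5), sigma=0.25, size=(n_scenarios, years))

    cash_flows = (price - operating_cost) * annual_output
    discount = (1 + discount_rate) ** -np.arange(1, years + 1)
    npv = cash_flows @ discount - capital_cost

    # Risk curve: probability that the NPV falls below a given target.
    targets = np.linspace(npv.min(), npv.max(), 50)
    risk = [(npv < t).mean() for t in targets]
    for t, r in zip(targets[::10], risk[::10]):
        print(f"P(NPV < {t/1e6:8.1f} M$) = {r:.2f}")
    ```

    Comparing such curves for a fixed-capacity plan and for an incremental plan with embedded expansion options is, in spirit, how the option value discussed above would be exhibited.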

    Process Resilience Analysis Framework for Design and Operations

    Process plants are complex socio-technical systems that degrade gradually and change with advancing technology. This research explores and answers questions related to the uncertainties involved in process systems and their complexity. It aims to systematically integrate resilience into process design and operations across three phases, prediction, survival, and recovery, using a novel framework called the Process Resilience Analysis Framework (PRAF). The analysis relies on simulation, data-driven models, and optimization, employing the resilience metrics developed in this research. In particular, an integrated method incorporating aspects of process operations, equipment maintenance, and process safety is developed for the following three phases:
    • Prediction: finding the feasible operating region under changing conditions using a Bayesian approach, global sensitivity analysis, and robust simulation methods;
    • Survival: determining optimal operations and maintenance strategies using simulation, Bayesian regression analysis, and optimization; and
    • Recovery: developing a strategy for emergency barriers in abnormal situations using dynamic simulation, Bayesian analysis, and optimization.
    Examples of a batch reactor and a cooling tower operation are used to illustrate the application of PRAF. The results demonstrate that PRAF successfully captures the interactions between process operability characteristics, maintenance, and safety policy. The prediction phase analysis leads to good dynamic response and stability of operations; the survival phase helps reduce unplanned shutdowns and downtime; and the recovery phase results in reduced severity of consequences, shorter response times, and overall enhanced recovery. Overall, PRAF achieves flexibility, controllability, and reliability of the system, and supports more informed decision-making and more profitable process systems.
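
    As a rough illustration of the prediction-phase idea of a robustly feasible operating region, the sketch below samples uncertain kinetic parameters and checks, at each candidate operating point, whether a conversion target and a peak heat-release limit are met for most samples; the first-order batch kinetics, parameter distributions, and limits are illustrative assumptions rather than the models used in PRAF.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_samples = 2000
    R = 8.314            # gas constant, J/mol/K
    Ea = 75e3            # assumed activation energy, J/mol
    batch_time = 3600.0  # batch duration, s
    V = 5.0              # reactor volume, m^3

    # Uncertain parameters (assumed distributions): pre-exponential factor and
    # heat of reaction.
    A = rng.lognormal(mean=np.log(4e8), sigma=0.2, size=n_samples)   # 1/s
    dH = rng.normal(loc=-90e3, scale=5e3, size=n_samples)            # J/mol

    def feasible_fraction(T, C0, q_limit=250e3, X_target=0.90):
        """Fraction of parameter samples for which the operating point (T, C0)
        meets the conversion target and the peak heat-release limit."""
        k = A * np.exp(-Ea / (R * T))             # first-order rate constant, 1/s
        X_final = 1.0 - np.exp(-k * batch_time)   # conversion at end of batch
        q_peak = k * C0 * V * np.abs(dH)          # heat release rate at t = 0, W
        ok = (X_final >= X_target) & (q_peak <= q_limit)
        return ok.mean()

    # Scan candidate operating points and flag those that are robustly feasible.
    for T in np.arange(320.0, 361.0, 10.0):       # reactor temperature, K
        for C0 in (200.0, 500.0, 800.0):          # initial concentration, mol/m^3
            frac = feasible_fraction(T, C0)
            tag = "feasible" if frac >= 0.95 else "infeasible"
            print(f"T = {T:5.1f} K, C0 = {C0:5.0f} mol/m^3 -> {frac:.2f} ({tag})")
    ```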

    Critical Services continuity, Resilience and Security: Proceedings of the 56th ESReDA Seminar

    Critical Infrastructures (CIs) remain among the most important and vital service providers to modern societies. Severe CI disruptions may endanger the security of citizens, the availability of strategic assets, and even the stability of governance. Not surprisingly, CIs are often the targets of intentional attacks, whether physical or cyber in nature, and newly emerging hybrid threats primarily target CIs as part of warfare. ESReDA, as one of the most active EU networks in the field, has initiated a project group (CI-PR/MS&A-Data) on "Critical Infrastructure/Modelling, Simulation and Analysis – Data". The main focus of the project group is to report on the state of progress in MS&A of CI preparedness and resilience, with a specific focus on the availability and relevance of the corresponding data. To report on the most recent developments in CI preparedness and resilience MS&A and on the availability of the relevant data, ESReDA held its 48th, 52nd and 56th Seminars. The 56th ESReDA Seminar on "Critical Services continuity, Resilience and Security" attracted about 30 participants from industry, authorities, operators, research centres and academia. The seminar programme consisted of 18 technical papers, two plenary speeches, and an interactive session on Climate & CI protection. JRC.G.10 - Knowledge for Nuclear Security and Safety

    NASA Lewis Research Center Futuring Workshop

    On October 21 and 22, 1986, the Futures Group ran a two-day Futuring Workshop on the premises of the NASA Lewis Research Center. The workshop had four main goals: to acquaint participants with the general history of technology forecasting; to familiarize participants with the range of forecasting methodologies; to acquaint participants with the range of applicability, strengths, and limitations of each method; and to offer participants some hands-on experience by working through both judgmental and quantitative case studies. Among the topics addressed during the workshop were information sources, judgmental techniques, quantitative techniques, the merger of judgment with quantitative measurement, data collection methods, and dealing with uncertainty.

    State Estimation, Covariance Estimation, and Economic Optimization of Semi-Batch Bioprocesses

    One of the most critical abilities of any chemical process engineer is to gather, analyze, and trust incoming process data, as these data are often required in control and process monitoring applications. In real processes, online data can be unreliable due to factors such as poor tuning, calibration drift, or mechanical drift. Beyond these sources of noise, it may not be economically viable to directly measure all process states of interest (e.g., component concentrations). While process models can help validate incoming process data, models are often subject to plant-model mismatch and unmodeled disturbances, or lack enough detail to track all process states (e.g., dissolved oxygen in a bioprocess). As a result, relying exclusively on either the process data or the process model in these applications is often not possible, or simply results in suboptimal performance. To address these challenges and achieve a higher level of confidence in the process states, estimation theory is used to blend online measurements and process models into a series of state estimates. By using both sources, it is possible to filter out the noise and derive a state estimate close to the true process conditions. This work deviates from the traditional state estimation literature, which mostly addresses continuous processes, and examines how techniques such as the extended Kalman filter (EKF) and moving horizon estimation (MHE) can be applied to semi-batch processes. Additionally, this work considers how plant-model mismatch can be overcome through parameter-based estimation algorithms such as the dual EKF and a novel parameter-MHE (P-MHE) algorithm. A galacto-oligosaccharide (GOS) process is selected as the motivating example because some process states cannot be independently measured online and therefore require state estimation. Moreover, this process is representative of the broader bioprocess field, as it is subject to high levels of noise and less rigorous models and is traditionally operated in batch or semi-batch reactors. In conjunction with these estimation approaches, this work also explores how to tune the algorithms effectively. The estimation algorithms selected here require careful tuning of the model and measurement covariance matrices to balance the uncertainties between the process models and the incoming measurements; traditionally, this is done via ad hoc manual tuning by process control engineers. This work modifies and employs techniques such as direct optimization (DO) and autocovariance least-squares (ALS) to estimate the covariance values accurately, since poor approximation of the covariances often results in poor state estimates or drives the estimation algorithm to failure. Finally, this work develops a semi-batch-specific dynamic real-time optimization (DRTO) algorithm and poses a novel costing methodology for this type of problem. As part of this costing methodology, an enzyme-specific cost-scaling correlation is proposed to provide a realistic approximation of these costs in industrial contexts. The semi-batch DRTO is combined with the GOS process to provide an economic analysis using the Kluyveromyces lactis (K. lactis) β-galactosidase enzyme. An extensive literature review is carried out to support the conclusions of the economic analysis and to motivate application to other bioprocesses.
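
    The EKF recursion mentioned above can be sketched as follows for a generic fed-batch substrate/product model; the kinetics, noise covariances, initial guess, and single product measurement are illustrative assumptions, not the GOS model or the tuning from this work.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    dt = 1.0                      # sampling interval, min
    k, Y = 0.05, 0.8              # assumed rate constant and yield
    Q = np.diag([1e-4, 1e-4])     # assumed process-noise covariance
    R = np.array([[1e-2]])        # assumed measurement-noise covariance
    H = np.array([[0.0, 1.0]])    # only the product concentration is measured

    def f(x):
        """Assumed fed-batch kinetics, discretized with explicit Euler:
        substrate S is consumed first order, product P forms with yield Y."""
        S, P = x
        return np.array([S - dt * k * S,
                         P + dt * Y * k * S])

    def F_jac(x):
        """Jacobian of f with respect to the state."""
        return np.array([[1.0 - dt * k, 0.0],
                         [dt * Y * k,   1.0]])

    # Simulate a "true" batch with noisy product measurements, then filter.
    x_true = np.array([10.0, 0.0])
    x_hat, P_cov = np.array([8.0, 0.0]), np.eye(2)   # deliberately biased initial guess

    for t in range(60):
        x_true = f(x_true) + rng.multivariate_normal(np.zeros(2), Q)
        y = H @ x_true + rng.multivariate_normal(np.zeros(1), R)

        # EKF prediction step
        x_pred = f(x_hat)
        F = F_jac(x_hat)
        P_pred = F @ P_cov @ F.T + Q

        # EKF measurement-update step
        S_innov = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S_innov)
        x_hat = x_pred + (K @ (y - H @ x_pred)).ravel()
        P_cov = (np.eye(2) - K @ H) @ P_pred

    print("true state:     ", np.round(x_true, 3))
    print("estimated state:", np.round(x_hat, 3))
    ```

    In the covariance-estimation part of the work, the fixed Q and R used in this sketch would instead be inferred from innovation data (for example by ALS or direct optimization) rather than assumed.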