From discretization to regularization of composite discontinuous functions
Discontinuities between distinct regions, described by different equation sets, cause difficulties for PDE/ODE solvers. We present a new algorithm that eliminates integrator discontinuities by regularizing them. First, the algorithm determines the optimum switch point between two functions spanning adjacent or overlapping domains. The optimum switch point is found by searching for a "jump point" that minimizes the discontinuity between the adjacent/overlapping functions. The discontinuity is then resolved using an interpolating polynomial that joins the two discontinuous functions.
This approach eliminates the need for conventional integrators either to discretize and then link discontinuities by generating interpolating polynomials based on state variables, or to reinitialize state variables when discontinuities are detected in an ODE/DAE system. In contrast to conventional approaches, which handle discontinuities at the state variable level only, the new approach tackles discontinuity at both the state variable and the constitutive equation levels. It thereby eliminates the errors associated with interpolating polynomials generated at the state variable level for discontinuities occurring in the constitutive equations.
Computer memory requirements for this approach increase exponentially with the dimension of the discontinuous function, so there will be limitations for functions of relatively high dimension. However, memory availability continues to grow while prices fall, so this is not expected to be a major limitation.
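To make the idea concrete, the following is a minimal sketch of the general approach (not the authors' implementation): locate a switch point that minimizes the jump between two one-dimensional branches, then bridge them with a low-order interpolating polynomial over a small transition interval. The functions f1 and f2, the overlap bounds and the transition half-width eps are illustrative assumptions.

```python
# Illustrative sketch only: regularizing a jump between two one-dimensional
# branches f1 (left) and f2 (right). Function names, the overlap interval and
# the transition half-width `eps` are assumptions made for this example.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.interpolate import CubicHermiteSpline

def f1(x):          # left branch (example constitutive relation)
    return np.sin(x)

def f2(x):          # right branch (example constitutive relation)
    return 0.5 * x - 0.2

# 1. Find the "jump point": the x in the overlap that minimizes the mismatch.
res = minimize_scalar(lambda x: abs(f1(x) - f2(x)), bounds=(0.5, 2.5),
                      method="bounded")
x_star = res.x

# 2. Build a smooth bridge over [x_star - eps, x_star + eps] that matches
#    values and first derivatives of the two branches at the interval ends.
eps, h = 0.05, 1e-6
xl, xr = x_star - eps, x_star + eps
d1 = (f1(xl + h) - f1(xl - h)) / (2 * h)      # finite-difference slopes
d2 = (f2(xr + h) - f2(xr - h)) / (2 * h)
bridge = CubicHermiteSpline([xl, xr], [f1(xl), f2(xr)], [d1, d2])

def f_regularized(x):
    """Composite function that is C^1 across the former discontinuity."""
    x = np.asarray(x, dtype=float)
    return np.where(x < xl, f1(x), np.where(x > xr, f2(x), bridge(x)))
```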
A Perspective on Smart Process Manufacturing Research Challenges for Process Systems Engineers
The challenges posed by smart manufacturing for the process industries and for process systems engineering (PSE) researchers are discussed in this article. Much progress has been made in achieving plant- and site-wide optimization, but benchmarking would give greater confidence. Technical challenges confronting process systems engineers in developing enabling tools and techniques are discussed regarding flexibility and uncertainty, responsiveness and agility, robustness and security, the prediction of mixture properties and function, and new modeling and mathematics paradigms. Exploiting intelligence from big data to drive agility will require tackling new challenges, such as how to ensure the consistency and confidentiality of data through long and complex supply chains. Modeling challenges also exist, and involve ensuring that all key aspects are properly modeled, particularly where health, safety, and environmental concerns require accurate predictions of small but critical amounts at specific locations. Environmental concerns will require us to keep a closer track on all molecular species so that they are optimally used to create sustainable solutions. Disruptive business models may result, particularly from new personalized products, but that is difficult to predict.
Evaluation of the effects and mechanisms of bioactive components present in hypoglycemic plants
Diabetes mellitus is a disease that is becoming increasingly prevalent worldwide. In many cases, people do not have access to synthetic drugs and instead use teas made from plants found in different countries in order to reduce their symptoms. These plant extracts may contain bioactive compounds, but they may also contain toxic substances harmful to the human body. Much has been published about plants with antidiabetic activity and their bioactive compounds, but no work in the literature identifies the mechanisms of action of the extracts, or of compounds isolated from them, well enough to understand the chemical reactions that occur in patients with diabetes. Therefore, this study aims to review published work that has proposed mechanisms of action for the different compounds (flavonoids, saponins, polyphenols, vitamins, etc.) and to explore these mechanisms through mathematical models that can predict the benefits of the extracts, with the future aim of facilitating the development of these natural products into less expensive drugs. It can be concluded that many of the extracts and isolated compounds from different hypoglycemic plants act mainly by inducing insulin secretion, increasing the number of beta cells in the pancreatic islets, and exerting antioxidant effects.
Assessing plant design with regard to MPC performance
Model Predictive Control (MPC) is ubiquitous in the chemical industry and offers great advantages over traditional controllers. Nevertheless, new plants are being designed without taking into account how design choices affect the MPC's ability to deliver better control and optimization. A methodology to determine whether a given design option favours or hinders MPC performance would therefore be desirable. This paper presents the economic MPC optimization index, whose intended use is to provide a procedure for comparing different designs for a given process, assessing how well they can be controlled and optimised by a zone-constrained MPC. The index quantifies the economic benefits available and how well the plant performs under MPC control given its controllability properties, requirements and restrictions. The index provides a monetary measure of expected control performance.
This approach assumes the availability of a linear state-space model valid within the control zone defined by the upper and lower bounds of each controlled and manipulated variable. We have used a model derived from simulated step tests as a practical way to apply the method. The impact of model uncertainty on the methodology is discussed. An analysis of the effects of disturbances on the index illustrates how they may reduce profitability by restricting the ability of an MPC to reach dynamic equilibrium near process constraints, which in turn increases product quality giveaway and costs. A case study consisting of four alternative designs for a realistically sized crude oil atmospheric distillation plant is provided in order to demonstrate the applicability of the index.
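As a rough illustration of the kind of calculation such an index rests on (not the paper's exact formulation), the sketch below solves a steady-state economic optimization for a zone-constrained plant described by a linear gain model; the gain matrix, zone limits and economic weights are invented placeholders.

```python
# Hedged sketch of the *kind* of calculation behind an economic index:
# the steady-state economic optimum achievable by a zone-constrained MPC,
# computed from a linear gain model. The gain matrix, zone limits and
# economic weights below are invented placeholders, not the paper's data.
import numpy as np
from scipy.optimize import linprog

G = np.array([[ 1.2, -0.4],        # steady-state gains: y = G @ u (deviation vars)
              [ 0.3,  0.9]])
c_u = np.array([ 2.0, 1.5])        # cost per unit move of each manipulated variable
c_y = np.array([-5.0, -3.0])       # negative cost = value of each controlled variable

u_lo, u_hi = np.array([-1.0, -1.0]), np.array([1.0, 1.0])   # MV zone
y_lo, y_hi = np.array([-0.5, -0.8]), np.array([0.6, 0.7])   # CV zone

# minimize  c_u @ u + c_y @ (G @ u)  subject to the MV and CV zones
cost = c_u + c_y @ G
A_ub = np.vstack([G, -G])                       # y_lo <= G @ u <= y_hi
b_ub = np.concatenate([y_hi, -y_lo])
res = linprog(cost, A_ub=A_ub, b_ub=b_ub,
              bounds=list(zip(u_lo, u_hi)), method="highs")

print("achievable steady-state economic benefit:", -res.fun)
```

Comparing the optimum of this kind of problem across candidate designs, and discounting it for expected disturbance effects, gives a monetary basis for ranking them.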
A process systems engineering approach to analysis of fructose consumption in the liver system and consequences for non-alcoholic fatty liver disease
Metabolic disturbances to the liver system can induce lipid deposition and subsequently cause non-alcoholic fatty liver disease (NAFLD). Increasing consumption of fructose has been proposed as a crucial risk factor in the development of NAFLD. Three potential therapeutic targets in the metabolic network were explored using a composite model of liver function. A fructose-enriched diet under insulin resistance conditions was simulated to evaluate the effectiveness of the model in novel therapy design. In vitro experiments were conducted on rat liver samples to assess the robustness of the model predictions. Synergistic application of all three interventional points in silico was predicted to be the most effective treatment to reduce lipid production under both moderate and severe insulin resistance conditions. This study demonstrates how system models can be used together with in vitro experiments to explore the behaviour of the liver system in response to fructose variation and to help identify possible drug targets.
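Purely to illustrate the in-silico workflow (this is not the composite liver model of the study), the toy sketch below simulates a drastically simplified fructose-to-lipid pathway and compares single and combined interventions, represented as fractional reductions of selected rate constants; all species and parameters are hypothetical.

```python
# Toy illustration of the in-silico workflow only: a drastically simplified
# fructose -> lipid pathway, with interventions represented as fractional
# reductions of selected rate constants. All species, rate constants and
# intervention points are hypothetical; this is not the paper's liver model.
import numpy as np
from scipy.integrate import solve_ivp

def liver_toy(t, x, k, intervention):
    fructose, intermediate, lipid = x
    k1, k2, k3 = k * intervention          # element-wise scaling of targeted steps
    uptake = 1.0                           # constant fructose-enriched intake
    return [uptake - k1 * fructose,
            k1 * fructose - k2 * intermediate,
            k2 * intermediate - k3 * lipid]

k = np.array([0.8, 0.5, 0.1])
scenarios = {
    "no treatment":        np.array([1.0, 1.0, 1.0]),
    "single intervention": np.array([0.5, 1.0, 1.0]),
    "combined":            np.array([0.5, 0.5, 1.0]),
}
for name, scale in scenarios.items():
    sol = solve_ivp(liver_toy, (0, 100), [0.0, 0.0, 0.0], args=(k, scale))
    print(f"{name:>20s}: final lipid level = {sol.y[2, -1]:.2f}")
```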
Global Optimisation for Dynamic Systems using Interval Analysis
Engineers seek optimal solutions when designing dynamic systems, but a crucial element is to ensure bounded performance over time. Finding a globally optimal bounded trajectory requires solving the ordinary differential equation (ODE) system in a verified way. To date, such methods have only been able to address low-dimensional problems, and for larger systems they cannot prevent gross overestimation of the bounds. In this paper we show how interval contractors can be used to obtain tightly bounded optima. A verified solver constructs tight upper and lower bounds on the dynamic variables using contractors for initial value problems (IVPs) for ODEs within a global optimisation method. The solver provides guaranteed bounds on the objective function and on the first-order sensitivity equations in a branch and bound framework. The method is compared with three previously published methods on three examples from process engineering.
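For readers unfamiliar with interval-based global optimisation, the following minimal sketch shows plain interval branch and bound on a static objective, illustrating how guaranteed lower and upper bounds drive the search; it does not reproduce the verified IVP contractors of the paper, and the objective and search box are made up.

```python
# Minimal sketch of plain interval branch-and-bound minimisation on a static
# function, to show how guaranteed bounds drive the search. The verified IVP
# contractors of the paper are not reproduced here; the objective and box
# below are made up for illustration.
import heapq

def interval_obj(lo, hi):
    """Inclusion function for f(x) = x**2 - 2*x on the interval [lo, hi]."""
    sq_lo, sq_hi = min(lo*lo, hi*hi), max(lo*lo, hi*hi)
    if lo <= 0.0 <= hi:
        sq_lo = 0.0
    return sq_lo - 2.0*hi, sq_hi - 2.0*lo        # [f] = [x^2] - 2*[x]

def branch_and_bound(lo, hi, tol=1e-6):
    best_upper = float("inf")
    heap = [(interval_obj(lo, hi)[0], lo, hi)]   # boxes ordered by lower bound
    while heap:
        f_lo, a, b = heapq.heappop(heap)
        if f_lo > best_upper - tol:              # box cannot contain the optimum
            continue
        mid = 0.5 * (a + b)
        best_upper = min(best_upper, interval_obj(mid, mid)[1])  # point evaluation
        if b - a > tol:
            for c, d in ((a, mid), (mid, b)):    # bisect and re-bound
                lb = interval_obj(c, d)[0]
                if lb <= best_upper - tol:
                    heapq.heappush(heap, (lb, c, d))
    return best_upper

print(branch_and_bound(-2.0, 3.0))   # tight enclosure of min f = -1 at x = 1
```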
A simulation tool for analysis and design of reverse electrodialysis using concentrated brines
Reverse Electrodialysis (SGP-RE or RED) is a viable technology for the conversion of salinity gradient power into electric power.
A comprehensive model is proposed for the RED process using sea or brackish water and concentrated brine as feed solutions. The goals were (i) reliably describing the physical phenomena involved in the process and (ii) providing information for optimal equipment design. For such purposes, the model has been developed at two different scales of description: a lower scale for the repeating unit of the system (cell pair), and a higher scale for the entire equipment (stack).
The model was implemented in a process simulator, validated against original experimental data and then used to investigate the influence of the main operating factors on power output. Feed solutions of different salinities were also tested. Good agreement was found between predictions and experiments over a wide range of inlet concentrations, flow rates and feed temperatures. For the adopted system geometry and membranes, the optimal feed conditions were found to be brackish water (0.08-0.1 M NaCl) as the dilute and brine (4.5-5 M NaCl) as the concentrate, generating the highest power density at a temperature of 40 °C.
The model can be used to explore the full potential of the RED technology, especially in investigations regarding the future scale-up of the process.
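As an indication of the lower-scale relations typically used in such models (with placeholder parameter values, not the validated model of the paper), the sketch below estimates the open-circuit voltage of a cell pair from the Nernst potential and the maximum power density for a matched external load.

```python
# Hedged sketch of the cell-pair-scale relations typically used in RED models:
# open-circuit voltage from the Nernst potential across the ion-exchange
# membranes and maximum power density for a matched external load. The
# permselectivity, activity coefficients and area resistance below are
# placeholder values, not the validated parameters of the paper's model.
import math

R, F = 8.314, 96485.0                  # gas constant [J/mol/K], Faraday [C/mol]

def cell_pair_power_density(C_conc, C_dil, T=313.15,
                            alpha=0.90,            # average membrane permselectivity
                            gamma_conc=0.60, gamma_dil=0.95,   # activity coefficients
                            r_cell=2.5e-4):        # cell-pair area resistance [ohm*m2]
    """Open-circuit voltage [V] and max power density [W/m2] of one cell pair."""
    e_ocv = 2.0 * alpha * (R * T / F) * math.log(
        (gamma_conc * C_conc) / (gamma_dil * C_dil))
    p_max = e_ocv**2 / (4.0 * r_cell)              # matched-load approximation
    return e_ocv, p_max

# Brine (5 M) against brackish water (0.1 M) at 40 °C:
e, p = cell_pair_power_density(C_conc=5.0, C_dil=0.1)
print(f"OCV per cell pair: {e:.3f} V, max power density: {p:.1f} W/m2")
```

A full stack model would sum such cell-pair contributions and account for pumping losses, concentration changes along the channels and electrode behaviour, as described in the abstract's two-scale structure.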
Analysis and simulation of scale-up potentials in reverse electrodialysis
The Reverse Electrodialysis (RED) process has been widely accepted as a viable and promising technology to produce electric energy from a salinity difference (salinity gradient power, e.g. using river water/seawater, or seawater and concentrated brines). Recent R&D efforts demonstrated how an appropriate design of the RED unit and a suitable selection of process conditions can crucially enhance the process performance. In this regard, a process simulator was developed and validated with experimental data collected on a lab-scale unit, providing a new modelling tool for process optimisation.
In this work, performed within the REAPower project (www.reapower.eu), a process simulator previously proposed by the same authors has been modified to predict the behaviour of a cross-flow RED unit. The model was then used to investigate the influence of the most important variables (i.e. solution properties and stack geometry) on the overall process performance. In particular, the use of different concentrations and flow rates for the feed streams has been considered, as well as different aspect ratios in asymmetric stacks. Moreover, the influence of scaling up a RED unit was investigated, starting from a 22×22 cm², 100 cell pair lab stack and simulating the performance of larger stacks up to a 44×88 cm², 500 cell pair unit.
Finally, different scenarios are proposed for a prototype-scale RED plant, providing useful indications for scaling the technology up towards 1 kW of power production; the installation of a real prototype plant in Trapani (Italy) is the final objective of the R&D activities of the REAPower project.
A simple multi-model prediction method
The present work introduces a new multi-model state-space formulation called simultaneous multi-linear prediction (SMLP), which is suitable for systems with significant gain variation due to nonlinearity.
Standard multi-model formulations usually make use of a partitioned state-space, i.e. a state-space that is divided into regions so that the parameters of the state update equation shift according to the current location of the state, with a view to better approximating a nonlinear plant in each region. This multi-model framework, also known as the linear hybrid systems framework, makes use of different boundary or partition rule concepts, ranging from systems of linear inequalities to propositional logic rules, or a combination of these. This standard approach inevitably introduces discontinuities in the output prediction as the parameters of the state update equation shift abruptly.
Instead, the SMLP is built by defining and updating multiple states simultaneously, thus eliminating the need to partition the state-input space into regions and associate a different state update equation with each region. Each state's contribution to the overall output is obtained according to the relative distance between its identification (or linearisation) point and the current operating point, in addition to a set of parameters obtained through regression analysis. Unlike methods belonging to the hybrid systems framework, no discontinuities are introduced in the output prediction by an SMLP system, as the weighting function is continuous and the transition between sub-models is smooth.
This method produces more accurate results than single linear models while retaining much of their numerical advantage and relative ease of development. Additionally, the SMLP draws its data from step response models that can be provided by commercial, black-box dynamic simulators, enabling it to be applied to large-scale systems.
In order to assess this methodology, an SMLP system is built for an activated sludge process (ASP) of a wastewater treatment plant, alongside a standard multi-model Piecewise Affine system generated by the same sub-models, and their output predictions are compared.
The controllability analysis and the case study presented in Strutzel and Bogle (2016) are extended and updated to this multi-model approach, yielding SMLP systems describing four alternative designs for a realistically sized crude oil atmospheric distillation plant.
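A hedged sketch of the weighted multi-model idea, rather than the exact SMLP formulation, is given below: several local linear state-space models are updated in parallel and their outputs blended with a smooth weighting based on the distance of the current operating point from each linearisation point; the model matrices, linearisation points and weighting width are invented for illustration.

```python
# Hedged sketch of the weighted multi-model idea (not the exact SMLP
# formulation): several local linear state-space models are updated in
# parallel and their outputs blended with a smooth weighting based on the
# distance of the current operating point from each linearisation point.
# Model matrices, linearisation points and the Gaussian width are invented.
import numpy as np

class WeightedMultiLinearPredictor:
    def __init__(self, models, centres, width=1.0):
        self.models = models                 # list of (A, B, C) tuples
        self.centres = np.asarray(centres)   # linearisation/identification points
        self.width = width
        self.states = [np.zeros(A.shape[0]) for A, _, _ in models]

    def _weights(self, operating_point):
        d2 = np.sum((self.centres - operating_point) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * self.width ** 2))    # smooth, continuous weights
        return w / w.sum()

    def step(self, u, operating_point):
        """Advance every sub-model one step and blend their outputs."""
        w = self._weights(np.asarray(operating_point))
        y = 0.0
        for i, (A, B, C) in enumerate(self.models):
            self.states[i] = A @ self.states[i] + B @ np.atleast_1d(u)
            y += w[i] * (C @ self.states[i])
        return y

# Two hypothetical local models identified at low and high throughput:
m1 = (np.array([[0.9]]), np.array([[0.10]]), np.array([[1.0]]))
m2 = (np.array([[0.8]]), np.array([[0.25]]), np.array([[1.0]]))
smlp = WeightedMultiLinearPredictor([m1, m2], centres=[[0.0], [1.0]], width=0.5)
for k in range(5):
    print(smlp.step(u=1.0, operating_point=[k / 4]))
```

Because the weights vary continuously with the operating point, the blended prediction has no jumps of the kind produced by hard partition rules.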
A quantitative risk analysis approach to a process sequence under uncertainty - A case study
Process plants for the manufacture of pharmaceutical products often need to be designed and built quickly to make the most of the available patent life, which necessitates working with uncertain or missing data. It is common that pilot plant equipment and data are available and that new data can be generated if they are important. We present a model-based approach to risk analysis to aid the design of pharmaceutical processes, combining systematic modelling procedures with Hammersley sampling based uncertainty analysis and sensitivity analysis, used to quantify the uncertainty in predicted performance and to identify the key contributions to it. The main contribution of the paper is the demonstration of the methodology on an industrial case study in which the process flowsheet was fixed and some pilot data were available. Expected performance was improved by considering the propagation of uncertainty over the whole process. The case study results indicate the importance of considering uncertainty systematically and quantitatively, and the methodology revealed opportunities to improve process performance by doing so.
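As an illustration of Hammersley sampling based uncertainty propagation (with a placeholder process model and invented parameter distributions, not the industrial case study data), the sketch below maps a low-discrepancy Hammersley sequence through the inverse CDFs of two uncertain inputs and summarises the spread of a predicted yield.

```python
# Hedged sketch of Hammersley-sequence sampling for uncertainty propagation:
# low-discrepancy points are mapped through the inverse CDFs of the uncertain
# parameters and pushed through a placeholder process model. The model,
# parameter distributions and yield expression are illustrative assumptions,
# not the industrial case study data.
import numpy as np
from scipy import stats

def radical_inverse(i, base):
    """Van der Corput radical inverse of integer i in the given base."""
    result, f = 0.0, 1.0 / base
    while i > 0:
        result += f * (i % base)
        i //= base
        f /= base
    return result

def hammersley(n_points, dim):
    """n_points x dim Hammersley points in the unit hypercube (endpoint-shifted)."""
    primes = [2, 3, 5, 7, 11, 13][:dim - 1]
    pts = np.empty((n_points, dim))
    for i in range(n_points):
        pts[i, 0] = (i + 0.5) / n_points     # shift avoids 0/1, keeping ppf finite
        pts[i, 1:] = [radical_inverse(i, b) for b in primes]
    return pts

def process_yield(k_rxn, feed_purity):
    """Placeholder process model: yield as a function of two uncertain inputs."""
    return feed_purity * (1.0 - np.exp(-k_rxn * 2.0))

u = hammersley(512, dim=2)
k_rxn = stats.norm(loc=1.2, scale=0.15).ppf(u[:, 0])        # uncertain rate constant
purity = stats.uniform(loc=0.90, scale=0.08).ppf(u[:, 1])   # uncertain feed purity
y = process_yield(k_rxn, purity)
print(f"mean yield {y.mean():.3f}, 5th-95th percentile "
      f"[{np.percentile(y, 5):.3f}, {np.percentile(y, 95):.3f}]")
```

Ranking the output variance contributions of each uncertain input, for example by varying one input at a time or by regression on the samples, then identifies where additional pilot data would most reduce risk.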