Algorithm development and analysis for on-line optimising control of large scale industrial processes
The work presented in the thesis is concerned with on-line optimising control of large scale industrial processes. Theoretical analysis has been carried out to investigate optimality and convergence features of various optimising control algorithms in both centralised and hierarchical forms, providing a basis for algorithm design and assessment. Important issues concerning the improvement of algorithm efficiency, such as iterative strategies, coordination methods and feedback structures, are explored. An improved price updating formula is proposed and implemented in the single iterative loop Integrated System Optimisation and Parameter Estimation (ISOPE) structure with global feedback to further improve the convergence features of the algorithm. A new coordination technique, the Modifier Coordination (MC) method, is proposed and implemented in both single and double iterative ISOPE structures. Approaches for coping with output dependent constraints are examined, and the Penalty Relaxation (PR) technique is integrated into the ISOPE structure to extend the existing ISOPE algorithms to cover many output dependent cases. Comparative studies of the newly developed algorithms, techniques and methods, based on substantial computer simulations, are also provided. Issues concerning software implementation of optimising control algorithms are discussed, providing a general guideline to such practices. Suggestions for future research as a continuation of the work presented in this thesis are also made.
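The price-coordination idea behind such hierarchical optimising structures can be sketched with a toy example. The snippet below is a minimal illustration, not the improved updating formula from the thesis: two hypothetical subsystems minimise invented quadratic local costs, and a coordinator adjusts a shared price with a steepest-ascent step until the interaction (resource) balance holds.

```python
def coordinate(price=0.0, alpha=0.5, capacity=4.0, tol=1e-6, max_iter=200):
    """Illustrative gradient-style price coordination (not the thesis formula).

    Subsystem i minimises 0.5*a[i]*x**2 - b[i]*x + price*x, whose
    closed-form response is x_i = (b[i] - price) / a[i].  The coordinator
    raises the price while total demand exceeds the shared capacity,
    and lowers it otherwise.
    """
    a = [1.0, 2.0]   # invented local cost curvatures
    b = [5.0, 6.0]   # invented local cost slopes
    x = [0.0, 0.0]
    for _ in range(max_iter):
        x = [(bi - price) / ai for ai, bi in zip(a, b)]  # local optima
        mismatch = sum(x) - capacity                     # balance error
        if abs(mismatch) < tol:
            break
        price += alpha * mismatch                        # price update
    return price, x
```

With these invented numbers the iteration converges to the price that exactly clears the shared capacity; more elaborate coordination schemes differ mainly in how the price update step is constructed.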
Is agile project management applicable to construction?
This paper briefly summarises the evolution of Agile Project Management (APM) and differentiates it from lean and agile production and 'leagile' construction. The significant benefits being realized through employment of APM within the information systems industry are stated. The characteristics of APM are explored, including: philosophy, organizational attitudes and practices, planning, execution and control, and learning. Finally, APM is subjectively assessed as to its potential contribution to the pre-design, design and construction phases.
In conclusion, APM is assessed to offer considerable potential for application in pre-design and design, but there are significant hurdles to its adoption in the actual construction phase. Should these be overcome, APM offers benefits well beyond any individual project.
Iterative learning control of crystallisation systems
Under the increasing pressure of issues like reducing the time to market, managing lower production costs, and improving the flexibility of operation, batch process industries are moving towards the production of high-value-added commodities such as specialty chemicals, pharmaceuticals, agricultural products, and biotechnology-enabled products. For better design, consistent operation and improved control of batch chemical processes, one cannot ignore the sensing and computational capabilities provided by modern sensors, computers, algorithms, and software. In addition, there is a growing demand for modelling and control tools based on process operating data. This study is focused on developing process operating data-based iterative learning control (ILC) strategies for batch processes, more specifically for batch crystallisation systems.
In order to proceed, the research took a step backward to explore the existing control strategies, fundamentals, mechanisms, and various process analytical technology (PAT) tools used in batch crystallisation control. Building on this background study, an operating data-driven ILC approach was developed to improve product quality from batch to batch. The concept of ILC is to exploit the repetitive nature of batch processes to automate recipe updating using process knowledge obtained from previous runs. The methodology stated here was based on a linear time varying (LTV) perturbation model in an ILC framework to provide a convergent batch-to-batch improvement of the process performance indicator. As a novel contribution, a hierarchical ILC (HILC) scheme was proposed for the systematic design of the supersaturation control (SSC) of a seeded batch cooling crystalliser. This model-free control approach is implemented in a hierarchical structure by assigning a data-driven supersaturation controller to the upper level and a simple temperature controller to the lower level.
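The batch-to-batch update at the heart of ILC can be sketched in a few lines. The example below is a deliberately simplified illustration, not the thesis's LTV perturbation scheme: one "batch" is modelled as a static plant y[t] = gain * u[t] with the gain assumed unknown to the controller, and the whole input trajectory is corrected with the previous batch's tracking error via u_{k+1} = u_k + L * (r - y_k).

```python
def ilc_run(reference, gain=0.8, learning_rate=0.5, batches=30):
    """Minimal batch-to-batch ILC sketch on an invented static plant."""
    u = [0.0] * len(reference)                           # initial recipe
    for _ in range(batches):
        y = [gain * ui for ui in u]                      # run one batch
        e = [ri - yi for ri, yi in zip(reference, y)]    # tracking error
        u = [ui + learning_rate * ei                     # recipe update
             for ui, ei in zip(u, e)]
    worst = max(abs(ri - gain * ui) for ri, ui in zip(reference, u))
    return u, worst
```

Because the per-batch error contracts by the factor |1 - L*gain| = 0.6 here, the tracking error shrinks geometrically from batch to batch, which is the convergence property the thesis establishes for its far more general setting.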
To relate this work to other data-based control of crystallisation processes, the study revisited the existing direct nucleation control (DNC) approach. This part, however, was devoted to a detailed strategic investigation of different possible structures of DNC and to comparing the results with those of a first-principles model-based optimisation for the first time. The DNC results in fact outperformed the model-based optimisation approach and established a clear guideline for selecting the preferable DNC structure.
Batch chemical processes are distributed as well as nonlinear in nature and need to be operated over a wide range of operating conditions, often near the boundary of the admissible region. As linear lumped model predictive controllers (MPCs) are often subject to severe performance limitations, there is a growing demand for simple data-driven nonlinear control strategies for batch crystallisers that take the spatio-temporal aspects into account. In this study, an operating data-driven polynomial chaos expansion (PCE) based nonlinear surrogate modelling and optimisation strategy was presented for batch crystallisation processes. Model validation and optimisation results confirmed the promise of this approach for nonlinear control.
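A non-intrusive PCE surrogate of the kind described here can be illustrated in one dimension. The sketch below is an assumption-laden miniature, not the thesis's crystallisation surrogate: it fits a model of a single standard-normal input by least-squares regression on probabilists' Hermite polynomials, the orthogonal basis matched to Gaussian inputs.

```python
import numpy as np

def fit_pce(model, samples, degree=2):
    """Least-squares PCE fit in one standard-normal input (illustrative)."""
    def hermite(n, x):
        # Probabilists' Hermite recurrence: He_n = x*He_{n-1} - (n-1)*He_{n-2}
        if n == 0:
            return np.ones_like(x)
        if n == 1:
            return x
        return x * hermite(n - 1, x) - (n - 1) * hermite(n - 2, x)

    x = np.asarray(samples, dtype=float)
    y = np.array([model(xi) for xi in x])        # sampled model evaluations
    basis = np.column_stack([hermite(n, x) for n in range(degree + 1)])
    coeffs, *_ = np.linalg.lstsq(basis, y, rcond=None)

    def surrogate(z):
        zz = np.asarray(z, dtype=float)
        return sum(c * hermite(n, zz) for n, c in enumerate(coeffs))

    return coeffs, surrogate
```

Once fitted, the cheap surrogate can replace the expensive model inside an optimiser, which is the role the PCE plays in the data-driven strategy summarised above.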
The evaluations of the proposed data-based methodologies were carried out through simulation case studies, laboratory experiments and industrial pilot plant experiments. For all the simulation case studies, detailed mathematical models covering reaction kinetics and heat and mass balances were developed for a batch cooling crystallisation system of Paracetamol in water. Based on these models, rigorous simulation programs were developed in MATLAB®, which were then treated as the real batch cooling crystallisation system. The laboratory experimental work was carried out using a lab-scale system of Paracetamol and iso-Propyl alcohol (IPA). All the experimental work, including the qualitative and quantitative monitoring of the crystallisation experiments and products, demonstrated an inclusive application of various in situ process analytical technology (PAT) tools, such as focused beam reflectance measurement (FBRM), UV/Vis spectroscopy and particle vision measurement (PVM). The industrial pilot scale study was carried out at GlaxoSmithKline Bangladesh Limited, Bangladesh, and the experimental system was Paracetamol and other powdered excipients used to make paracetamol tablets.
The methodologies presented in this thesis provide a comprehensive framework for data-based dynamic optimisation and control of crystallisation processes. All the simulation and experimental evaluations of the proposed approaches emphasised the potential of the data-driven techniques to provide considerable advances in the current state of the art in crystallisation control.
A scalable parallel finite element framework for growing geometries. Application to metal additive manufacturing
This work introduces an innovative parallel, fully-distributed finite element framework for growing geometries and its application to metal additive manufacturing. It is well-known that virtual part design and qualification in additive manufacturing require highly-accurate multiscale and multiphysics analyses. Only high performance computing tools are able to handle such complexity in time frames compatible with time-to-market. However, efficiency without loss of accuracy has rarely held the centre stage in the numerical community. Here, in contrast, the framework is designed to adequately exploit the resources of high-end distributed-memory machines. It is grounded on three building blocks: (1) hierarchical adaptive mesh refinement with octree-based meshes; (2) a parallel strategy to model the growth of the geometry; (3) state-of-the-art parallel iterative linear solvers. Computational experiments consider the heat transfer analysis at the part scale of the printing process by powder-bed technologies. After verification against a 3D benchmark, a strong-scaling analysis assesses performance and identifies major sources of parallel overhead. A third numerical example examines the efficiency and robustness of (2) in a curved 3D shape. Unprecedented parallelism and scalability were achieved in this work. Hence, this framework contributes to taking on higher complexity and/or accuracy, not only in part-scale simulations of metal or polymer additive manufacturing, but also in welding, sedimentation, atherosclerosis, or any other physical problem where the physical domain of interest grows in time.
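Building block (1), hierarchical adaptive refinement on octree meshes, can be sketched in miniature. The toy below is a 2D quadtree in plain Python (the actual framework works in 3D on fully distributed octrees): cells flagged by a user-supplied refinement indicator, e.g. proximity to a heat source, are recursively split into four children up to a maximum level.

```python
class QuadCell:
    """Minimal 2D quadtree cell, a sketch of octree-style hierarchical AMR."""

    def __init__(self, x, y, size, level=0):
        self.x, self.y, self.size, self.level = x, y, size, level
        self.children = []

    def refine_where(self, indicator, max_level):
        """Recursively split every cell flagged by `indicator(cell)`."""
        if self.level < max_level and indicator(self):
            h = self.size / 2
            self.children = [
                QuadCell(self.x + dx * h, self.y + dy * h, h, self.level + 1)
                for dx in (0, 1) for dy in (0, 1)
            ]
            for child in self.children:
                child.refine_where(indicator, max_level)

    def leaves(self):
        """Collect the leaf cells, i.e. the active mesh."""
        if not self.children:
            return [self]
        return [leaf for c in self.children for leaf in c.leaves()]
```

Refining only the cells that contain a point of interest yields a mesh graded towards that point, with cell sizes halving at each level, the same locality that makes octree meshes attractive for tracking a moving melt pool at the part scale.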
The LifeV library: engineering mathematics beyond the proof of concept
LifeV is a library for the finite element (FE) solution of partial differential equations in one, two, and three dimensions. It is written in C++ and designed to run on diverse parallel architectures, including cloud and high performance computing facilities. Despite its academic research nature, as a library for the development and testing of new methods, one distinguishing feature of LifeV is its use on real-world problems: it is intended to provide a tool for many engineering applications. It has actually been used in computational hemodynamics, including cardiac mechanics and fluid-structure interaction problems, in porous media, and in ice sheet dynamics, for both forward and inverse problems. In this paper we give a short overview of the features of LifeV and its coding paradigms on simple problems. The main focus is on the parallel environment, which is mainly driven by domain decomposition methods and based on external libraries such as MPI, the Trilinos project, HDF5 and ParMetis.
Dedicated to the memory of Fausto Saleri.
Towards a Model II Theory-in-use for young software engineers and small software teams
Small teams have to transform into learning organizations to cope with the changes in IT. Argyris and Schon distinguish single-loop and double-loop learning [9]. Single-loop learning happens when unintended or counterproductive consequences lead to a change in action but not in the governing variables. Changing the governing variables themselves is called double-loop learning. Single-loop learning is induced by Model I, a prevalent model of theories-in-use (those that can be inferred from action). Argyris and Schon look to move people from Model I to a Model II that fosters double-loop learning. In the software engineering field, and especially in small teams, developing reflective thinking and enhanced learning is a vital issue. We set out to develop these capabilities in the course of a Master's program in Information Technology and Software Engineering. The last year of this program is run under 'sandwich' conditions, alternating study periods at university with training periods in industry. Moreover, the university periods are dedicated to a long-term team software project. The education system is a reflective practicum. Such a practicum provides students, working in groups, with the opportunity to reflect on their actions, which may help make theories-in-use explicit. Several reflective practices are woven into the course of the project, providing students with an education in reflective thinking. The work placement system introduces a new challenge: relating the university and industrial phases of the student's experience. We propose to use journal writing as a tool to record young engineers' behavior and to extract meaning from events and experiences.
The first goal of these different practices is to sustain reflective thought that may help question espoused theories and reveal theories-in-use; a more ambitious goal is for the whole team to act as a learning organization with a theory-in-use governed by Model II. We report on an experimental case study using a project journal supported by semantic wikis.
Towards Informed Exploration for Deep Reinforcement Learning
In this thesis, we discuss various techniques for improving exploration for deep reinforcement learning. We begin with a brief review of reinforcement learning (RL) and the fundamental exploration vs. exploitation trade-off. Then we review how deep RL has improved upon classical RL and summarize six categories of the latest exploration methods for deep RL, in order of increasing usage of prior information. We then explore representative works in three categories and discuss their strengths and weaknesses. The first category, represented by Soft Q-learning, uses regularization to encourage exploration. The second category, represented by count-based exploration via hashing, maps states to hash codes for counting and assigns higher exploration bonuses to less-encountered states. The third category utilizes hierarchy and is represented by a modular architecture for RL agents to play StarCraft II. Finally, we conclude that exploration informed by prior knowledge is a promising research direction, and we suggest topics of potentially high impact.
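The count-based-via-hashing idea can be sketched in a few lines. The snippet below is a simplified illustration of the general technique, not the specific method reviewed in the thesis: a published implementation would typically use SimHash on learned state features, whereas here coarse rounding plus Python's built-in hash stands in for the state discretisation, and the visit count feeds a bonus of the common form beta / sqrt(n).

```python
import math
from collections import defaultdict

def hash_state(state, buckets=4096):
    """Map a (possibly continuous) state to a discrete code for counting.
    Rounding + built-in hash is a stand-in for a real locality-sensitive
    hash such as SimHash."""
    rounded = tuple(round(s, 1) for s in state)
    return hash(rounded) % buckets

class CountBonus:
    """Count-based exploration bonus: r_bonus = beta / sqrt(n(phi(s)))."""

    def __init__(self, beta=0.1):
        self.beta = beta
        self.counts = defaultdict(int)   # visit counts per hash code

    def bonus(self, state):
        code = hash_state(state)
        self.counts[code] += 1
        return self.beta / math.sqrt(self.counts[code])
```

A rarely visited state keeps receiving a bonus close to beta, while frequently visited states see their bonus decay as 1/sqrt(n), steering the agent towards less-encountered regions of the state space.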