
    Online Approximation Assisted Multiobjective Optimization with Heat Exchanger Design Applications

    Computer simulations can be computationally intensive, as is the case in Computational Fluid Dynamics (CFD) and Finite Element Analysis (FEA). The computational cost can become prohibitive when such simulations are used within multiobjective design optimization. One way to address this issue is to replace a computationally intensive simulation with an approximation, which allows a large number of design alternatives to be evaluated quickly, as needed by an optimizer. This dissertation proposes an approach to multiobjective design optimization with computationally expensive simulations, applied to heat exchanger design problems. The research is performed along four directions: (1) a new Online Approximation Assisted Multiobjective Optimization (OAAMO) approach with a focus on the expected optimum region; (2) a new approximation assisted multiobjective optimization with global and local metamodeling that always produces feasible solutions; (3) a framework that integrates OAAMO with multiscale simulations (OAAMOMS) for design of heat exchangers at the segment and heat exchanger levels; and (4) applications of OAAMO combined with CFD for shape design of a header for a new generation of heat exchangers using Non-Uniform Rational B-Splines (NURBS). The approaches developed in this thesis are also applied to optimize a coldplate used in electronic cooling devices and different types of plate heat exchangers. In addition, many numerical test problems are solved by the proposed methods. The results of these studies show that the proposed online approximation assisted multiobjective optimization is an efficient approach that can predict optimum solutions for a wide class of problems, including heat exchanger design problems, while significantly reducing the computational cost compared with existing methods.
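    The online surrogate-in-the-loop pattern this abstract describes can be illustrated with a minimal sketch, assuming scipy's RBFInterpolator as the metamodel and a hypothetical `expensive_simulation` standing in for a CFD/FEA call; the candidate screening and nondominated filtering below are generic, not the dissertation's exact OAAMO procedure.

```python
# Minimal online approximation-assisted multiobjective loop: a cheap
# surrogate screens many candidates, only the most promising are sent to
# the expensive simulation, and the surrogate is refit after each batch.
import numpy as np
from scipy.interpolate import RBFInterpolator

def expensive_simulation(x):
    """Hypothetical two-objective function (stands in for CFD/FEA)."""
    return np.array([np.sum((x - 0.25) ** 2), np.sum((x + 0.25) ** 2)])

def nondominated(F):
    """Indices of Pareto-nondominated rows of the objective matrix F."""
    return [i for i, fi in enumerate(F)
            if not any(np.all(fj <= fi) and np.any(fj < fi) for fj in F)]

rng = np.random.default_rng(0)
dim, budget = 3, 40
X = rng.uniform(-1, 1, (10, dim))                 # initial DOE
F = np.array([expensive_simulation(x) for x in X])

while len(X) < budget:
    surrogate = RBFInterpolator(X, F)             # one RBF per objective
    cand = rng.uniform(-1, 1, (2000, dim))        # cheap candidate pool
    for x in cand[nondominated(surrogate(cand))][:3]:
        X = np.vstack([X, x])                     # verify promising points
        F = np.vstack([F, expensive_simulation(x)])

print("approximate Pareto front:\n", F[nondominated(F)])
```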

    A multi-objective evolutionary approach to simulation-based optimisation of real-world problems.

    This thesis presents a novel evolutionary optimisation algorithm that can improve the quality of solutions in simulation-based optimisation. Simulation-based optimisation is the process of finding optimal parameter settings without explicitly examining each possible configuration of settings. An optimisation algorithm generates potential configurations and sends these to the simulation, which acts as an evaluation function. The evaluation results are used to refine the optimisation such that it eventually returns a high-quality solution. The algorithm described in this thesis integrates multi-objective optimisation, parallelism, surrogate usage, and noise handling in a unique way to deal with the challenges these characteristics pose in simulation-based optimisation problems. In order to handle multiple, conflicting optimisation objectives, the algorithm uses a Pareto approach in which the set of best trade-off solutions is searched for and presented to the user. The algorithm supports a high degree of parallelism by adopting an asynchronous master-slave parallelisation model in combination with an incremental population refinement strategy. A surrogate evaluation function is adopted in the algorithm to quickly identify promising candidate solutions and filter out poor ones. A novel technique based on inheritance is used to compensate for the uncertainties associated with the approximate surrogate evaluations. Furthermore, a novel technique for multi-objective problems that effectively reduces noise by adopting a dynamic procedure for resampling solutions is used to tackle the problem of real-world unpredictability (noise). The proposed algorithm is evaluated on benchmark problems and two complex real-world problems of manufacturing optimisation. The first real-world problem concerns the optimisation of a production cell at Volvo Aero, while the second one concerns the optimisation of a camshaft machining line at Volvo Cars Engine. The results from the optimisations show that the algorithm finds better solutions for all the problems considered than existing, similar algorithms. The new techniques for dealing with surrogate imprecision and noise used in the algorithm are identified as key reasons for the good performance. (University of Skövde; Knowledge Foundation, Sweden.)
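    The inheritance idea mentioned above can be sketched in a generic form: offspring first receive a cheap estimate interpolated from their parents' fitness, and only candidates that the estimate ranks as promising are sent to the costly, noisy simulation. This is the textbook shape of fitness inheritance under stated assumptions, not the thesis's exact algorithm; `simulate` is a hypothetical evaluator.

```python
# Fitness inheritance sketch: a child's fitness is first estimated as the
# crossover-weighted average of its parents' fitness; the expensive, noisy
# simulation is run only for children whose estimate beats the median.
import random

def simulate(x):
    """Hypothetical expensive, noisy evaluation (minimisation)."""
    return sum(v * v for v in x) + random.gauss(0, 0.05)

def crossover(p1, p2):
    w = random.random()
    return [w * a + (1 - w) * b for a, b in zip(p1, p2)], w

pop = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(20)]
fit = [simulate(x) for x in pop]

for _ in range(30):
    i, j = random.sample(range(len(pop)), 2)
    child, w = crossover(pop[i], pop[j])
    inherited = w * fit[i] + (1 - w) * fit[j]    # cheap inherited estimate
    if inherited < sorted(fit)[len(fit) // 2]:   # promising: verify for real
        pop.append(child)
        fit.append(simulate(child))

print("best found:", min(fit))
```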

    Online and Offline Approximations for Population based Multi-objective Optimization

    The high computational cost of population based optimization methods has been preventing applications of these methods to realistic engineering design problems. The main challenge is to devise approaches that can significantly reduce the number of function (or simulation) calls required by such optimization methods. This dissertation presents new online and offline approximation approaches for design optimization. In particular, it presents new DOE and metamodeling techniques for Genetic Algorithm (GA) based multi-objective optimization methods along four research thrusts. The first research thrust, Online Metamodeling Assisted Fitness Evaluation, develops a new online metamodeling assisted fitness evaluation approach that aims at significantly reducing the number of function calls in each generation of a Multi-Objective Genetic Algorithm (MOGA) for design optimization. The second research thrust, DOE in Online Metamodeling, introduces a new DOE method that aims at reducing the number of generations in a MOGA. It is shown that the method developed under the second research thrust can, compared to the method in the first thrust, further reduce the number of function calls in the MOGA. The third research thrust, DOE in Offline Metamodeling, presents a new DOE method for sampling points in the non-smooth regions of a design space in order to improve the accuracy of a metamodel; this method is useful in approximation assisted optimization when the number of available function calls is limited. Finally, the fourth research thrust, Dependent Metamodeling for Multi-Response Simulations, presents a new metamodeling technique for an engineering simulation that has multiple responses. Numerous numerical and engineering examples are used to demonstrate the applicability and performance of the proposed online and offline approximation techniques.
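    The third thrust's goal of concentrating samples in non-smooth regions can be sketched with a generic error-driven DOE: leave-one-out prediction error serves as a proxy for where the metamodel struggles, and new points are placed nearby. This is an illustrative scheme under stated assumptions, not the dissertation's DOE method; `response` is a hypothetical simulation output with a kink.

```python
# Error-driven sequential DOE sketch: refit the metamodel with each point
# left out, find the sample it predicts worst (likely a non-smooth region),
# and add a new sample nearby.
import numpy as np
from scipy.interpolate import RBFInterpolator

def response(x):
    """Hypothetical non-smooth response (kink along x1 = 0)."""
    return np.abs(x[..., 0]) + 0.1 * x[..., 1] ** 2

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (12, 2))
y = response(X)

for _ in range(10):
    errs = []
    for i in range(len(X)):                       # leave-one-out errors
        mask = np.arange(len(X)) != i
        model = RBFInterpolator(X[mask], y[mask])
        errs.append(abs(model(X[i:i + 1])[0] - y[i]))
    worst = X[int(np.argmax(errs))]
    x_new = np.clip(worst + rng.normal(0, 0.15, 2), -1, 1)
    X = np.vstack([X, x_new])                     # densify near the kink
    y = np.append(y, response(x_new))

print(f"final design has {len(X)} samples")
```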

    Adaptive swarm optimisation assisted surrogate model for pipeline leak detection and characterisation.

    Pipelines are often subject to leakage due to ageing, corrosion and weld defects, and leaks are difficult to avoid because their sources are diverse. Various pipeline leakage detection methods, including fibre optics, pressure point analysis and numerical modelling, have been proposed over the last decades. One major issue with these methods is distinguishing the leak signal without raising false alarms. Since the data obtained by these traditional methods are digital in nature, machine learning models have been adopted to improve the accuracy of pipeline leakage detection. However, most of these methods rely on a large training dataset for accurate model training, and experimental data for such training are difficult to obtain. Reasons include the huge cost of an experimental setup covering all possible scenarios, poor accessibility to remote pipelines, and labour-intensive experiments. Moreover, datasets constructed from data acquired in laboratory or field tests are usually imbalanced, as leakage data samples are generated from artificial leaks. Computational fluid dynamics (CFD) offers detailed and accurate pipeline leakage modelling that may be difficult to obtain experimentally or with analytical approaches. However, CFD simulation is typically time-consuming and computationally expensive, limiting its applicability in real-time applications. In order to alleviate the high computational cost of CFD modelling, this study proposed a novel data sampling optimisation algorithm, the Adaptive Particle Swarm Optimisation Assisted Surrogate Model (PSOASM), to select simulation scenarios systematically in an adaptive and optimised manner. The algorithm was designed to place a new sample in poorly sampled regions of the parameter space of parametrised leakage scenarios, which uniform sampling methods may easily miss. This was achieved using two criteria: the population density of the training dataset and a model prediction fitness value. The model prediction fitness value was used to enhance the global exploration capability of the surrogate model, while the population density of training data samples benefits the local accuracy of the surrogate model. The proposed PSOASM was compared with four conventional sequential sampling approaches and tested on six benchmark functions commonly used in the literature. Different machine learning algorithms were explored with the developed model, and the effect of the initial sample size on surrogate model performance was evaluated. Next, pipeline leakage detection analysis, with emphasis on a multiphase flow system, was investigated in order to find the flow field parameters that provide pertinent indicators for pipeline leakage detection and characterisation. Plausible leak scenarios which may occur in the field were simulated for a gas-liquid pipeline using a three-dimensional RANS CFD model. The perturbation of the pertinent flow field indicators for different leak scenarios is reported, which is expected to improve the understanding of multiphase flow behaviour induced by leaks. The results of the simulations were validated against the latest experimental and numerical data reported in the literature. The proposed surrogate model was later applied to pipeline leak detection and characterisation.
The CFD modelling results showed that fluid flow parameters are pertinent indicators in pipeline leak detection. It was observed that upstream pipeline pressure can serve as a critical indicator for detecting leakage even when the leak size is small, whereas the downstream flow rate is the dominant leakage indicator when flow rate monitoring is chosen for leak detection. The results also reveal that when two leaks of different sizes co-occur in a single pipe, detecting the small leak becomes difficult if its size is below 25% of the large leak size. However, in the event of a double leak of equal dimensions, the leak closer to the pipe upstream is easier to detect. The results from all the analyses demonstrate the PSOASM algorithm's superiority over the well-known sequential sampling schemes employed for comparison. The test results show that the PSOASM algorithm can be applied to pipeline leak detection with limited training datasets and provides a general framework for improving computational efficiency through adaptive surrogate modelling in various real-life applications.
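The two-criterion selection described above can be sketched as follows, assuming a radial-basis surrogate, a random candidate pool in place of the PSO search, and the absolute surrogate prediction as the fitness term; this is one plausible reading of the approach for illustration, not the published PSOASM, and `run_cfd` is a hypothetical stand-in for a leak simulation.

```python
# Adaptive sampling sketch: score candidate scenarios by distance to the
# existing training set (density criterion) plus a surrogate-based fitness
# term, then run the expensive simulation only for the best-scoring one.
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.spatial.distance import cdist

def run_cfd(x):
    """Hypothetical cheap stand-in for a CFD leak-scenario run."""
    return np.sin(3 * x[..., 0]) * np.cos(2 * x[..., 1])

rng = np.random.default_rng(2)
X = rng.uniform(0, 1, (8, 2))                    # initial scenarios
y = run_cfd(X)

for _ in range(15):
    model = RBFInterpolator(X, y)
    cand = rng.uniform(0, 1, (500, 2))           # candidate scenarios
    density = cdist(cand, X).min(axis=1)         # far from existing samples
    fitness = np.abs(model(cand))                # surrogate-based criterion
    score = density / density.max() + fitness / (fitness.max() + 1e-12)
    x_new = cand[int(np.argmax(score))]
    X = np.vstack([X, x_new])
    y = np.append(y, run_cfd(x_new))

print("surrogate trained on", len(X), "simulated scenarios")
```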

    Finalised dependability framework and evaluation results

    The ambitious aim of CONNECT is to achieve universal interoperability between heterogeneous Networked Systems by means of on-the-fly synthesis of the CONNECTors through which they communicate. The goal of WP5 within CONNECT is to ensure that the non-functional properties required at each side of the connection to be established are fulfilled, including dependability, performance, security and trust, or, in one overarching term, CONNECTability. To model such properties, we have introduced the CPMM meta-model, which establishes the relevant concepts and their relations and includes a Complex Event language to express the behaviour associated with the specified properties. Over the four years of the project, we have developed approaches for assuring CONNECTability both at synthesis time and at run-time. Within the CONNECT architecture, these approaches are supported via the following enablers: the Dependability and Performance Analysis Enabler, implemented in a modular architecture supporting stochastic verification and state-based analysis, which also relies on incremental verification to adjust CONNECTor parameters at run-time; the Security Enabler, which implements a Security-by-Contract-with-Trust framework to guarantee the expected security policies and enforce them according to the level of trust; and the Trust Manager, which implements a model-based approach to mediate between different trust models and ensure interoperable trust management. The enablers have been integrated within the CONNECT architecture and, in particular, can interact with the CONNECT event-based monitoring enabler (the GLIMPSE Enabler released within WP4) for run-time analysis and verification. To support a model-driven approach in the interaction with the monitor, we have developed a CPMM editor and a translator from CPMM to the GLIMPSE native language (Drools). In this document, which is the final deliverable of WP5, we first present the latest advances of the fourth year concerning CPMM, Dependability & Performance Analysis, Incremental Verification and Security, and then summarise the main achievements of the whole project lifecycle. In the appendix we also include some relevant articles specifically focussing on CONNECTability that were prepared in the last period.

    Toward a fast and accurate modeling strategy for thermal management in air-cooled data centers

    Computational fluid dynamics (CFD) has become a popular tool, compared to experimental measurement, for thermal management in data centers. However, it is very time-consuming and resource-intensive when used to model large-scale data centers, and may not be ready for real-time thermal monitoring. This thesis has two main goals: first, to develop rapid flow simulation that reduces computing time while maintaining good accuracy, and second, to develop a whole-building energy simulation (BES) strategy for data center modeling. To this end, hybrid modeling and model training methodologies are investigated for rapid flow simulation, and a multi-zone model is proposed for BES. Within hybrid modeling, two methods are proposed: a hybrid zero/two-equation turbulence model utilizing a zone partitioning technique, and a combination of turbulence and floor tile models for the development of a composite performance index. It is shown that the zero-equation model coupled with either the body force or the modified body force tile model has the best potential for reducing computing time while preserving reasonable accuracy. The hybrid zero/two-equation method cuts the computing time in half compared to the traditional practice of using only the two-equation model. Within model training, a reduced-order method via proper orthogonal decomposition (POD) and response surface methodology (RSM) are comprehensively studied for data center modeling. Both methods can quickly reconstruct the data center thermal profile while retaining good accuracy. The RSM method in particular shows numerous advantages in several data center optimization studies: whether for selecting tiles to control the server rack temperature difference or for deciding input design parameters in the early stage of data center infrastructure design, RSM can replace costly experiments and time-consuming, resource-intensive CFD simulations. Finally, for the whole-building energy simulation study, the proposed multi-zone model is found to be much more effective than the commonly used single-zone model. The location factor plays an important role in determining whether certain boundary conditions affect the cooling electricity consumption. In addition, the supply temperature and volumetric flow rate have significant effects on energy consumption.
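    The POD-based model training mentioned above admits a compact illustration: snapshot fields from a handful of CFD runs are compressed into a few SVD modes, and new fields are approximated as combinations of those modes. The sketch below uses synthetic low-rank data in place of real thermal snapshots.

```python
# POD (proper orthogonal decomposition) sketch: build a reduced basis from
# snapshot data via the SVD, then reconstruct a field from a few modes.
import numpy as np

rng = np.random.default_rng(3)
n_cells, n_snapshots = 1000, 20                   # synthetic low-rank data
snapshots = rng.standard_normal((n_cells, 5)) @ rng.standard_normal((5, n_snapshots))

mean = snapshots.mean(axis=1, keepdims=True)
U, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
modes = U[:, :5]                                  # dominant POD modes

def reconstruct(field):
    """Project a field onto the POD basis and rebuild it."""
    coeffs = modes.T @ (field - mean.ravel())
    return mean.ravel() + modes @ coeffs

test = snapshots[:, 0]
err = np.linalg.norm(test - reconstruct(test)) / np.linalg.norm(test)
print(f"relative reconstruction error: {err:.2e}")
```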

    Systems Engineering

    The book "Systems Engineering: Practice and Theory" is a collection of articles written by developers and researches from all around the globe. Mostly they present methodologies for separate Systems Engineering processes; others consider issues of adjacent knowledge areas and sub-areas that significantly contribute to systems development, operation, and maintenance. Case studies include aircraft, spacecrafts, and space systems development, post-analysis of data collected during operation of large systems etc. Important issues related to "bottlenecks" of Systems Engineering, such as complexity, reliability, and safety of different kinds of systems, creation, operation and maintenance of services, system-human communication, and management tasks done during system projects are addressed in the collection. This book is for people who are interested in the modern state of the Systems Engineering knowledge area and for systems engineers involved in different activities of the area. Some articles may be a valuable source for university lecturers and students; most of case studies can be directly used in Systems Engineering courses as illustrative materials

    Augmented Conversation and Cognitive Apprenticeship Metamodel Based Intelligent Learning Activity Builder System

    This research focused on a formal (theory-based) approach to designing an Intelligent Tutoring System (ITS) authoring tool involving two specific conventional pedagogical theories: Conversation Theory (CT) and Cognitive Apprenticeship (CA). The research conceptualised an Augmented Conversation and Cognitive Apprenticeship Metamodel (ACCAM) based on a priori theoretical knowledge and the assumptions of its underlying theories. ACCAM was implemented in an Intelligent Learning Activity Builder System (ILABS), an ITS authoring tool. ACCAM's implementation aims to facilitate formally designed tutoring systems; hence ILABS, the practical implementation of ACCAM, constructs metamodels for Intelligent Learning Activity Tools (ILATs) in a numerical problem-solving context (focusing on the construction of procedural knowledge in applied numerical disciplines). An Intelligent Learning Activity Management System (ILAMS), although not the focus of this research, was also developed as a launchpad for the constructed ILATs and to administer learning activities. Hence, ACCAM and ILABS constitute the conceptual and practical contributions, respectively, that flow from this research. ACCAM's implementation was tested through the evaluation of ILABS and ILATs within an applied numerical domain, the accounting domain. The evaluation focused on the key constructs of ACCAM, cognitive visibility and conversation, implemented through a tutoring strategy employing Process Monitoring (PM). PM augments conversation within a cognitive apprenticeship framework; it aims to improve the visibility of a learner's cognitive process and confers intelligence on tutoring systems. PM was implemented via an interface that attempts to bring the learner's thought process to the surface. This approach contrasts with previous studies that adopted standard Artificial Intelligence (AI) based inference techniques, and the interface-based PM extends the existing CT and CA work. The strategy makes available a new tutoring approach aimed at fine-grained (or step-wise) feedback, unlike the goal-oriented feedback of model tracing. The impact of PM, as a preventive strategy (or intervention) and as an aid to diagnosing learners' cognitive processes, was investigated in relation to other constructs from the literature (such as detection of misconceptions, feedback generation and perceived learning effectiveness). Thus, the conceptualisation and implementation of PM via an interface also contribute to knowledge and practice. The evaluation of the ACCAM-based design approach and the investigation of the above-mentioned constructs were undertaken through users' reactions/perceptions of ILABS and ILATs. This involved principally a quantitative approach, although a qualitative approach was also utilised to gain deeper insight. Findings from the evaluation support the formal (theory-based) design approach: the design of ILABS through interaction with ACCAM. Empirical data revealed the presence of the conversation and cognitive visibility constructs in ILATs, as determined through their behaviour during the learning process. This research also identified other theoretical elements (e.g. motivation, reflection, remediation, evaluation) that may play out in a learning process, clarifying key conceptual variables that should be considered when constructing tutoring systems for applied numerical disciplines (e.g. accounting, engineering).
Also, the research revealed that PM enhances the detection of a learner's misconceptions and feedback generation. Nevertheless, qualitative data revealed that the frequent feedback produced by PM could obstruct the thought process at advanced stages of learning. PM implementations should therefore also include delayed diagnosis, especially for advanced learners who prefer to have it on request; the current implementation accordingly allows users to turn PM off and use an alternative learning route. Overall, the research revealed that the implementation of interface-based PM (i.e. conversation and cognitive visibility) improved the visibility of the learner's cognitive process, and this in turn enhanced learning as perceived by users.
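The contrast between step-wise and goal-oriented feedback can be made concrete with a small sketch: each intermediate entry in a numerical working is checked as it is submitted, so feedback targets the step that went wrong rather than only the final answer. The example values and messages are hypothetical, not taken from ILABS.

```python
# Step-wise process-monitoring sketch: validate each intermediate step of
# a worked numerical problem, flagging the first incorrect entry.
def check_steps(expected, submitted):
    """Compare a learner's intermediate steps against expected values."""
    for i, (want, got) in enumerate(zip(expected, submitted), start=1):
        if abs(want - got) > 1e-9:
            return f"step {i}: expected {want}, got {got}"
    return "all steps correct"

# e.g. straight-line depreciation of a 10000 asset, 1000 salvage, 5 years:
# step 1: depreciable base 10000 - 1000; step 2: yearly charge base / 5.
expected = [9000.0, 1800.0]
print(check_steps(expected, [9000.0, 1500.0]))   # flags the error at step 2
```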