
    System Architecture Design Using Multi-Criteria Optimization

    System architecture is defined as the description of a complex system in terms of its functional requirements, physical elements and their interrelationships. Designing a complex system architecture can be a difficult task involving multi-faceted trade-off decisions. System architecture designs often have many project-specific goals involving a mix of quantitative and qualitative criteria and a large design trade space. Several tools and methods have been developed over the last few decades to support the system architecture design process. However, many conventional problem-solving techniques face difficulties in dealing with complex system design problems that have many goals. In this research work, an interactive multi-criteria design optimization framework is proposed for solving many-objective system architecture design problems and generating a well-distributed set of Pareto optimal solutions for these problems. System architecture design using multi-criteria optimization is demonstrated on a real-world application, an aero engine health management (EHM) system. A design process is presented for the optimal deployment of the EHM system's functional operations over the physical architecture subsystems. The EHM system architecture design problem is formulated as a multi-criteria optimization problem. The proposed methodology successfully generates a well-distributed family of Pareto optimal architecture solutions for the EHM system, which provides valuable insights into the design trade-offs. Uncertainty analysis is implemented using an efficient polynomial chaos approach, and robust architecture solutions are obtained for the EHM system architecture design. Performance assessment through evaluation of benchmark test metrics demonstrates the superior performance of the proposed methodology.
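
    At the core of such a framework is Pareto dominance over a discrete trade space of function-to-subsystem assignments. The sketch below is a minimal illustration of that idea, not the authors' method: the three objectives, the stand-in scoring function, and the problem size (four functions, three subsystems) are invented for demonstration.

```python
import itertools
import random

def evaluate(assignment):
    """Hypothetical objective model: (weight, latency, cost), all minimized.
    Real EHM objectives would come from engineering models, not random scores."""
    random.seed(hash(assignment))  # deterministic stand-in scores per design
    return tuple(round(random.uniform(0, 1), 3) for _ in range(3))

def dominates(a, b):
    """a dominates b if it is no worse in every objective and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# Every assignment of 4 EHM functions onto 3 physical subsystems.
candidates = list(itertools.product(range(3), repeat=4))
scored = {c: evaluate(c) for c in candidates}

# Keep only non-dominated (Pareto-optimal) architectures.
pareto = [c for c in candidates
          if not any(dominates(scored[o], scored[c]) for o in candidates if o != c)]

for c in pareto:
    print("functions->subsystems:", c, "objectives:", scored[c])
```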

    A Harris Hawks Optimization Based Single- and Multi-Objective Optimal Power Flow Considering Environmental Emission

    The electric sector is greatly concerned about the greenhouse and non-greenhouse gas emissions generated by both conventional and renewable energy sources, as these are becoming a major issue globally. Thus, utilities must adhere to certain environmental guidelines for sustainable power generation. Therefore, this paper presents a novel nature-inspired, population-based Harris Hawks Optimization (HHO) methodology for controlling the emissions from thermal generating sources by solving single- and multi-objective Optimal Power Flow (OPF) problems. The OPF is a non-linear, non-convex, constrained optimization problem that primarily aims to minimize the fitness function while satisfying the equality and inequality constraints of the system. The cooperative behavior and dynamic chasing patterns of hawks pouncing on escaping prey are modeled mathematically to minimize the objective function. In this paper, fuel cost, real power loss and environmental emissions are regarded as the single- and multi-objective functions for optimal adjustment of the power system control variables. The conflicting multi-objective functions are solved as weighted sums using a no-preference method. The presented method is coded in MATLAB, and an IEEE (Institute of Electrical and Electronics Engineers) 30-bus system is used to demonstrate the effectiveness of the selected objectives. The obtained results are compared with other Artificial Intelligence (AI) techniques such as the Whale Optimization Algorithm (WOA), the Salp Swarm Algorithm (SSA), Moth Flame (MF) and Glowworm Optimization (GWO). Additionally, the study on placement of Distributed Generation (DG) reveals that system losses and emissions are reduced by 9.8355% and 26.2%, respectively.
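
    The weighted-sum, no-preference scalarization is the piece that turns the three objectives into one fitness function. A minimal sketch follows; the per-objective functions are invented stand-ins (a real implementation would evaluate them from a power-flow solution of the IEEE 30-bus system), and a simple random search stands in for the full HHO exploration/exploitation update equations.

```python
import random

# Hypothetical stand-ins for the OPF objectives over control variables x.
def fuel_cost(x):   return sum(0.01 * xi**2 + 2.0 * xi for xi in x)
def power_loss(x):  return sum((xi - 0.5)**2 for xi in x)
def emissions(x):   return sum(0.02 * xi**2 for xi in x)

def weighted_sum(x, w=(1/3, 1/3, 1/3)):
    """No-preference weighted sum: equal weights across the three objectives."""
    return w[0] * fuel_cost(x) + w[1] * power_loss(x) + w[2] * emissions(x)

# Simple random search as a stand-in for the HHO update rules;
# bounds mimic box constraints on the control variables.
lo, hi, dim = 0.0, 1.0, 6
best = [random.uniform(lo, hi) for _ in range(dim)]
for _ in range(5000):
    trial = [min(hi, max(lo, b + random.gauss(0, 0.05))) for b in best]
    if weighted_sum(trial) < weighted_sum(best):
        best = trial

print("best control variables:", [round(b, 3) for b in best])
print("scalarized objective:", round(weighted_sum(best), 4))
```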

    Quantifying dynamic sensitivity of optimization algorithm parameters to improve hydrological model calibration

    It is widely recognized that optimization algorithm parameters have significant impacts on algorithm performance, but quantifying this influence is very complex and difficult due to high computational demands and the dynamic nature of the search. The overall aim of this paper is to develop a global sensitivity analysis based framework to dynamically quantify the individual and interactive influence of algorithm parameters on algorithm performance. A variance decomposition sensitivity analysis method, Analysis of Variance (ANOVA), is used for sensitivity quantification because it can handle small samples and is more computationally efficient than other approaches. The Shuffled Complex Evolution algorithm developed at the University of Arizona (SCE-UA) is selected as the optimization algorithm for investigation, and two criteria, i.e., convergence speed and success rate, are used to measure the performance of SCE-UA. Results show that the proposed framework can effectively reveal the dynamic sensitivity of algorithm parameters in the search process, including the individual influences of parameters and their interactive impacts. Interactions between algorithm parameters have significant impacts on SCE-UA performance, which has not been reported in previous research. The proposed framework provides a means to understand the dynamics of algorithm parameter influence, and highlights the significance of considering interactive parameter influence to improve algorithm performance in the search process. This work was supported by the National Natural Science Foundation of China and the China Scholarship Council.
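
    The variance decomposition at the heart of the framework can be illustrated with a plain two-way ANOVA over a factorial experiment on two algorithm parameters. The sketch below uses synthetic performance data; the two factors (think of SCE-UA settings such as the number of complexes and points per complex), the effect sizes, and the replicate counts are invented, and it does not reproduce the paper's dynamic, search-stage-resolved analysis.

```python
import numpy as np

# Hypothetical factorial experiment: algorithm performance measured for
# 3 levels of each of two parameters, with 4 replicates per cell.
rng = np.random.default_rng(0)
a_lv, b_lv, reps = 3, 3, 4
i, j = np.meshgrid(np.arange(a_lv), np.arange(b_lv), indexing="ij")
signal = 2.0 * i + 1.0 * j + 1.5 * i * j          # built-in A x B interaction
y = signal[..., None] + rng.normal(0, 1.0, size=(a_lv, b_lv, reps))

grand = y.mean()
mean_a = y.mean(axis=(1, 2))                      # factor A level means
mean_b = y.mean(axis=(0, 2))                      # factor B level means
mean_ab = y.mean(axis=2)                          # cell means

ss_a = b_lv * reps * ((mean_a - grand) ** 2).sum()
ss_b = a_lv * reps * ((mean_b - grand) ** 2).sum()
ss_ab = reps * ((mean_ab - mean_a[:, None] - mean_b[None, :] + grand) ** 2).sum()
ss_e = ((y - mean_ab[..., None]) ** 2).sum()
ss_t = ss_a + ss_b + ss_ab + ss_e

# Variance fractions reveal both individual and interactive influence.
for name, ss in [("A", ss_a), ("B", ss_b), ("AxB", ss_ab), ("error", ss_e)]:
    print(f"{name:6s} variance fraction: {ss / ss_t:.2%}")
```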

    Tailored parameter optimization methods for ordinary differential equation models with steady-state constraints

    Background: Ordinary differential equation (ODE) models are widely used to describe (bio-)chemical and biological processes. To enhance the predictive power of these models, their unknown parameters are estimated from experimental data. These experimental data are mostly collected in perturbation experiments, in which the processes are pushed out of steady state by applying a stimulus. The knowledge that the initial condition is a steady state of the unperturbed process is valuable, as it restricts the dynamics of the process and thereby the parameters. However, implementing steady-state constraints in the optimization often results in convergence problems. Results: In this manuscript, we propose two new methods for solving optimization problems with steady-state constraints. The first method exploits ideas from optimization algorithms on manifolds and introduces a retraction operator, essentially reducing the dimension of the optimization problem. The second method is based on the continuous analogue of the optimization problem. This continuous analogue is an ODE whose equilibrium points are the optima of the constrained optimization problem. This equivalence enables the use of adaptive numerical methods for solving optimization problems with steady-state constraints. Both methods are tailored to the problem structure and exploit the local geometry of the steady-state manifold and its stability properties. A parameterization of the steady-state manifold is not required. The efficiency and reliability of the proposed methods are evaluated using one toy example and two applications; the first application uses published data, while the second uses a novel dataset for Raf/MEK/ERK signaling. The proposed methods demonstrated better convergence properties than state-of-the-art methods employed in systems and computational biology, and the average computation time per converged start is significantly lower. In addition to the theoretical results, the analysis of the dataset for Raf/MEK/ERK signaling provides novel biological insights regarding the existence of feedback regulation. Conclusion: Many optimization problems considered in systems and computational biology are subject to steady-state constraints. While most optimization methods have convergence problems if these steady-state constraints are highly nonlinear, the presented methods recover the convergence properties of optimizers that can exploit an analytical expression for the parameter-dependent steady state. This renders them an excellent alternative to the methods currently employed in systems and computational biology.
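
    The second method's key idea, the continuous analogue, can be illustrated generically: a gradient-flow ODE whose stable equilibria coincide with local minima can be handed to an adaptive ODE integrator. The sketch below is a generic illustration with the Rosenbrock function standing in for a penalized steady-state-constrained objective; it does not reproduce the paper's retraction operator or manifold machinery.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Continuous analogue of an optimization problem: the gradient flow
#   d(theta)/dt = -grad J(theta)
# has the minimizers of J as stable equilibrium points, so an adaptive
# ODE integrator can serve as the optimizer.
def neg_grad_rosenbrock(t, th):
    x, y = th
    # J(x, y) = (1 - x)^2 + 100 (y - x^2)^2; return -grad J.
    return [2 * (1 - x) + 400 * x * (y - x**2), -200 * (y - x**2)]

sol = solve_ivp(neg_grad_rosenbrock, (0.0, 100.0), [-1.2, 1.0],
                method="LSODA", rtol=1e-8, atol=1e-10)
print("equilibrium reached:", sol.y[:, -1])   # approx. [1, 1], the minimizer
```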

    Risk based multi-objective security control and congestion management

    The deterministic security criterion has served power system operation and congestion management quite well in recent decades. It is simple to implement in a security control model, for example, security constrained optimal power flow (SCOPF). However, since event likelihood and violation information are not addressed, it does not provide a quantitative understanding of security, and so results in inadequate system awareness. Therefore, even though computation capability and information techniques have been greatly improved and widely applied in operation support tools, operators are still not able to get rid of security threats, especially in the competitive market environment.

    The probabilistic approach has shown its strong ability for planning purposes, and has recently gained attention in the operations area. Since power system security assessment needs to analyze the consequences of all credible events, risk, defined as the product of event probability and severity, is well suited to quantifying the system security level, and the congestion level as well. Since risk carries extra information, its application to making better online operation decisions is an attractive research topic.

    This dissertation focuses on online system risk calculation, risk-based multi-objective optimization model development, risk-based security control design, and risk-based congestion management. A regression model is proposed to predict contingency probability from weather and geography information for online risk calculation. A risk-based multi-objective optimization (RBMO) model is presented, considering the conflicting objectives of risk and cost. Two types of methods, classical methods and evolutionary algorithms, are implemented to solve the RBMO problem. A risk-based decision-making architecture for security control is designed based on an understanding of the Pareto-optimal solutions, a visualization tool and high-level information analysis. Risk-based congestion management provides a market lever to uniformly expand a security volume, where a greater volume means more risk. Meanwhile, the risk-based LMP signal contracts all dimensions of this volume in proper weights (state probabilities) at a time.

    Two test systems, 6-bus and IEEE RTS 96, are used to test the developed algorithms. The simulation results show that incorporating risk into security control and congestion management will evolve our understanding of the security level, improve control and market efficiency, and support operators in maneuvering the system in an effective fashion.
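
    The risk index that drives the whole framework is the probability-weighted sum of contingency severities. A minimal sketch follows; the contingency list, probabilities, and severity scores are invented for illustration, and a real implementation would derive severity from power-flow violation analysis.

```python
import numpy as np

# System risk = sum over credible events of probability * severity.
contingencies = ["line 1-2 outage", "line 2-5 outage", "gen 3 trip"]
probability = np.array([0.012, 0.008, 0.002])   # hypothetical event likelihoods
severity = np.array([4.0, 10.0, 25.0])          # hypothetical violation severities

risk = float(probability @ severity)            # sum_i p_i * sev_i
print(f"system risk index: {risk:.3f}")

# A risk/cost trade-off of the kind explored in the RBMO model:
cost = 1000.0                                   # hypothetical dispatch cost
for w in (0.0, 0.5, 1.0):
    print(f"w={w:.1f}  scalarized objective: {w * risk + (1 - w) * cost:.1f}")
```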

    Optimization of Thermo-mechanical Conditions in Friction Stir Welding

    Adaptive Search and the Preliminary Design of Gas Turbine Blade Cooling Systems

    This research concerns the integration of Adaptive Search (AS) techniques, such as Genetic Algorithms (GA), with knowledge-based software to develop a research prototype of an Adaptive Search Manager (ASM). The developed approach allows both quantitative and qualitative information to be utilised in engineering design decision making. A Fuzzy Expert System manipulates the AS software within a design environment concerning the preliminary design of gas turbine blade cooling systems. Steady-state cooling hole geometry models have been developed for the project in collaboration with Rolls Royce plc. The research prototype of the ASM uses a hybrid of Adaptive Restricted Tournament Selection (ARTS) and Knowledge Based Hill Climbing (KBHC) to identify multiple "good" design solutions as potential design options. ARTS is a GA technique that is particularly suitable for real-world problems having multiple sub-optima. KBHC uses information gathered during the ARTS search, as well as information from the designer, to perform a deterministic hill climb. Finally, a local stochastic hill climber fine-tunes the "good" designs. Design solution sensitivity, design variable sensitivities and constraint sensitivities are calculated following Taguchi's methodology, which extracts sensitivity information with a very small number of model evaluations. Each potential design option is then qualitatively evaluated separately for manufacturability, choice of materials and the designer's special preferences, using the knowledge of domain experts. In order to guarantee that the qualitative evaluation module can evaluate any design solution from the entire design space with a reasonably small number of rules, a novel knowledge representation technique is developed. The knowledge is first separated into three categories: inter-variable knowledge, intra-variable knowledge and heuristics. Inter-variable knowledge and intra-variable knowledge are then integrated using a concept of compromise. Information about the "good" design solutions is presented to the designer through a designer's interface for decision support.
    Rolls Royce plc., Bristol (UK)
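
    The niching behaviour underlying ARTS can be illustrated with plain Restricted Tournament Selection (RTS), on which ARTS builds: a new offspring competes only against the most similar member of a random sample, so distinct sub-optima survive side by side. The sketch below is a generic RTS loop, not the ASM prototype; the multimodal fitness function, window size, and mutation scale are invented for illustration.

```python
import random
from math import sin, pi

def fitness(x):
    """Multimodal test function with five equal peaks at x = 0.1, 0.3, ..., 0.9."""
    return sin(5 * pi * x) ** 6

def rts_insert(pop, child, window=10):
    """Restricted tournament replacement: the child competes only against the
    most similar member of a random window, preserving multiple niches."""
    sample = random.sample(range(len(pop)), min(window, len(pop)))
    nearest = min(sample, key=lambda i: abs(pop[i] - child))
    if fitness(child) > fitness(pop[nearest]):
        pop[nearest] = child

random.seed(1)
pop = [random.random() for _ in range(40)]
for _ in range(2000):
    parent = max(random.sample(pop, 2), key=fitness)            # binary tournament
    child = min(1.0, max(0.0, parent + random.gauss(0, 0.05)))  # Gaussian mutation
    rts_insert(pop, child)

# The surviving population clusters around several "good" peaks, not just one.
print("clusters near:", sorted({round(x, 1) for x in pop}))
```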