4,752 research outputs found

    A Global Optimisation Toolbox for Massively Parallel Engineering Optimisation

    A software platform for global optimisation, called PaGMO, has been developed within the Advanced Concepts Team (ACT) at the European Space Agency and was recently released as an open-source project. PaGMO is built to tackle high-dimensional global optimisation problems, and it has been successfully used to find solutions to real-life engineering problems, among which are the preliminary design of interplanetary spacecraft trajectories, both chemical (including multiple flybys and deep-space maneuvers) and low-thrust (limited, at the moment, to single-phase trajectories), the inverse design of nano-structured radiators, and the design of non-reactive controllers for planetary rovers. Featuring an arsenal of global and local optimisation algorithms (including genetic algorithms, differential evolution, simulated annealing, particle swarm optimisation, compass search, improved harmony search, and various interfaces to libraries for local optimisation such as SNOPT, IPOPT, GSL and NLopt), PaGMO is at its core a C++ library which employs an object-oriented architecture providing a clean and easily extensible optimisation framework. Adoption of multi-threaded programming ensures the efficient exploitation of modern multi-core architectures and allows for a straightforward implementation of the island model paradigm, in which multiple populations of candidate solutions asynchronously exchange information in order to speed up and improve the optimisation process. In addition to the C++ interface, PaGMO's capabilities are exposed to the high-level language Python, so that it is possible to use PaGMO easily in an interactive session and take advantage of the numerous scientific Python libraries available. Comment: To be presented at ICATT 2010: International Conference on Astrodynamics Tools and Techniques.
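    The island-model paradigm described above can be sketched in plain Python. This is an illustrative sketch with threads, not PaGMO's actual API; the mutation scheme, ring topology, migration interval, and test function are all assumptions for the example. Each island evolves its own population independently and asynchronously mails its champion to a neighbour.

```python
import random
import threading

def sphere(x):
    """Simple convex test objective: sum of squares."""
    return sum(xi * xi for xi in x)

class Island(threading.Thread):
    """One population evolving independently; migrates its best solution."""
    def __init__(self, dim, generations=300, pop_size=20, seed=0):
        super().__init__()
        self.rng = random.Random(seed)
        self.pop = [[self.rng.uniform(-5, 5) for _ in range(dim)]
                    for _ in range(pop_size)]
        self.inbox = []                  # migrants arriving from other islands
        self.lock = threading.Lock()
        self.neighbors = []              # islands we send our champion to
        self.generations = generations

    def receive(self, migrant):
        with self.lock:
            self.inbox.append(migrant)

    def run(self):
        for gen in range(self.generations):
            # (1+1)-style Gaussian mutation of each individual
            for i, ind in enumerate(self.pop):
                trial = [xi + self.rng.gauss(0, 0.1) for xi in ind]
                if sphere(trial) < sphere(ind):
                    self.pop[i] = trial
            # asynchronous migration: absorb arrived migrants, replace worst
            with self.lock:
                migrants, self.inbox = self.inbox, []
            for m in migrants:
                worst = max(range(len(self.pop)),
                            key=lambda i: sphere(self.pop[i]))
                if sphere(m) < sphere(self.pop[worst]):
                    self.pop[worst] = m
            if gen % 20 == 0:            # periodically export our champion
                champ = min(self.pop, key=sphere)
                for nb in self.neighbors:
                    nb.receive(list(champ))

islands = [Island(dim=5, seed=s) for s in range(4)]
for k, isl in enumerate(islands):        # ring migration topology
    isl.neighbors.append(islands[(k + 1) % len(islands)])
for isl in islands:
    isl.start()
for isl in islands:
    isl.join()
best = min((min(isl.pop, key=sphere) for isl in islands), key=sphere)
```

Because islands only touch each other's inboxes (under a lock), the exchange is fully asynchronous: no island ever waits for another to finish a generation.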

    A derivative-free filter driven multistart technique for global optimization

    A stochastic global optimization method based on a multistart strategy and a derivative-free filter local search for general constrained optimization is presented and analyzed. In the local search procedure, approximate descent directions for the constraint violation or the objective function are used to progress towards the optimal solution. The algorithm is able to locate all the local minima, and consequently the global minimum, of a multi-modal objective function. The performance of the multistart method is analyzed with a set of benchmark problems and a comparison is made with other methods. This work was financed by FEDER funds through COMPETE (Programa Operacional Fatores de Competitividade) and by Portuguese funds through FCT (Fundação para a Ciência e a Tecnologia) within projects PEst-C/MAT/UI0013/2011 and FCOMP-01-0124-FEDER-022674.
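    The multistart skeleton can be illustrated on an unconstrained 1-D problem (the test function, start count, and region-of-attraction radius are illustrative assumptions; the paper's filter-based treatment of constraints is omitted): each random start is refined by a derivative-free compass search, and only minima outside the neighbourhood of those already found are recorded.

```python
import random

def func(x):
    # multi-modal test function with global minima at x = -1 and x = 2
    return (x + 1) ** 2 * (x - 2) ** 2

def compass_search(f, x, step=0.5, tol=1e-6):
    """Derivative-free local search: poll x +/- step, halve step on failure."""
    while step > tol:
        improved = False
        for d in (+step, -step):
            if f(x + d) < f(x):
                x += d
                improved = True
                break
        if not improved:
            step /= 2
    return x

def multistart(f, n_starts=30, lo=-5.0, hi=5.0, radius=0.5, seed=1):
    """Run the local search from random points; keep only distinct minima."""
    rng = random.Random(seed)
    found = []
    for _ in range(n_starts):
        x = compass_search(f, rng.uniform(lo, hi))
        # region-of-attraction heuristic: skip minima already located
        if all(abs(x - m) > radius for m in found):
            found.append(x)
    return sorted(found)

minima = multistart(func)
```

Both global minima are recovered because starts landing in either basin are driven to its bottom, while the dedup radius prevents the same minimizer from being stored twice.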

    An artificial fish swarm filter-based Method for constrained global optimization

    Ana Maria A.C. Rocha, M. Fernanda P. Costa and Edite M.G.P. Fernandes, An Artificial Fish Swarm Filter-Based Method for Constrained Global Optimization, in: B. Murgante, O. Gervasi, S. Misra, N. Nedjah, A.M. Rocha, D. Taniar, B. Apduhan (Eds.), Lecture Notes in Computer Science, Part III, LNCS 7335, pp. 57–71, Springer, Heidelberg, 2012. An artificial fish swarm algorithm based on a filter methodology for trial solutions acceptance is analyzed for general constrained global optimization problems. The new method uses the filter set concept to accept, at each iteration, a population of trial solutions whenever they improve constraint violation or objective function, relative to the current solutions. The preliminary numerical experiments with a well-known benchmark set of engineering design problems show the effectiveness of the proposed method. This work was supported by Fundação para a Ciência e a Tecnologia (FCT).
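    The filter set concept used for acceptance can be sketched as follows (a simplified sketch of the general filter idea, not the paper's exact rule): a trial point, scored by objective value f and aggregate constraint violation h, is accepted only if no stored pair dominates it, and any entries it dominates are pruned.

```python
def violation(gs):
    """Aggregate violation h(x) = sum of positive parts of g_i(x) <= 0."""
    return sum(max(0.0, g) for g in gs)

def filter_accept(filt, f_trial, h_trial):
    """Accept a trial (f, h) pair if no filter entry dominates it (smaller or
    equal in both objective and violation); prune entries it dominates."""
    for f_k, h_k in filt:
        if f_k <= f_trial and h_k <= h_trial:
            return False, filt                      # dominated: reject
    pruned = [(f_k, h_k) for f_k, h_k in filt
              if not (f_trial <= f_k and h_trial <= h_k)]
    pruned.append((f_trial, h_trial))
    return True, pruned

filt = [(5.0, 0.0), (2.0, 1.0)]
ok, filt = filter_accept(filt, 3.0, 0.5)   # new trade-off point: accepted
```

The trial (3.0, 0.5) is accepted because neither stored pair beats it in both coordinates; it joins the filter without displacing either existing entry.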

    On the Use of Surrogate Functions for Mixed Variable Optimization of Simulated Systems

    This research considers the efficient numerical solution of linearly constrained mixed variable programming (MVP) problems, in which the objective function is a black-box stochastic simulation, function evaluations may be computationally expensive, and derivative information is typically not available. MVP problems are those with a mixture of continuous, integer, and categorical variables, the latter of which may take on values only from a predefined list and may even be non-numeric. Mixed Variable Generalized Pattern Search with Ranking and Selection (MGPS-RS) is the only existing, provably convergent algorithm that can be applied to this class of problems. Present in this algorithm is an optional framework for constructing and managing less expensive surrogate functions as a means to reduce the number of true function evaluations that are required to find approximate solutions. In this research, the NOMADm software package, an implementation of pattern search for deterministic MVP problems, is modified to incorporate a sequential selection with memory (SSM) ranking and selection procedure for handling stochastic problems. In doing so, the underlying algorithm is modified to make the application of surrogates more efficient. A second class of surrogates based on the Nadaraya-Watson kernel regression estimator is also added to the software. Preliminary computational testing of the modified software is performed to characterize the relative efficiency of selected surrogate functions for mixed variable optimization in simulated systems.
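    A Nadaraya-Watson surrogate of an expensive black-box function is straightforward to sketch (1-D Gaussian kernel; the bandwidth, design points, and underlying "truth" function are illustrative assumptions, not the paper's setup):

```python
import math

def nadaraya_watson(x_query, xs, ys, bandwidth=0.5):
    """Kernel-weighted average of observed responses: a cheap surrogate for
    an expensive black-box function, using a Gaussian kernel."""
    weights = [math.exp(-((x_query - xi) / bandwidth) ** 2 / 2) for xi in xs]
    total = sum(weights)
    return sum(w * y for w, y in zip(weights, ys)) / total

# pretend f(x) = x^2 is expensive and was sampled at a few design points
xs = [-2.0, -1.0, 0.0, 1.0, 2.0]
ys = [x * x for x in xs]
estimate = nadaraya_watson(0.5, xs, ys)
```

The estimator needs no derivatives and no model fitting step, which is what makes it attractive as a surrogate inside a pattern search: only stored (x, f(x)) pairs are reused. The smoothing bandwidth trades bias against variance.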

    A hybrid genetic pattern search augmented Lagrangian method for constrained global optimization

    Hybridization of genetic algorithms with local search approaches can enhance their performance in global optimization. Genetic algorithms, as most population based algorithms, require a considerable number of function evaluations. This may be an important drawback when the functions involved in the problem are computationally expensive, as occurs in most real world problems. Thus, in order to reduce the total number of function evaluations, local and global techniques may be combined. Moreover, the hybridization may provide a more effective trade-off between exploitation and exploration of the search space. In this study, we propose a new hybrid genetic algorithm based on a local pattern search that relies on an augmented Lagrangian function for constraint-handling. The local search strategy is used to improve the best approximation found by the genetic algorithm. Convergence to an ε-global minimizer is proved. Numerical results and comparisons with other stochastic algorithms using a set of benchmark constrained problems are provided. This work was financed by FEDER funds through COMPETE and by Fundação para a Ciência e a Tecnologia (FCT).
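    The augmented Lagrangian constraint-handling can be illustrated on a one-variable problem (the inner minimizer here is a crude ternary search standing in for the paper's genetic algorithm plus pattern search; the problem and parameters are illustrative): minimize x² subject to x ≥ 1, whose solution is x = 1 with multiplier λ = 2.

```python
def aug_lagrangian(f, g, x, lam, mu):
    """Augmented Lagrangian for one inequality constraint g(x) <= 0."""
    return f(x) + (max(0.0, lam + mu * g(x)) ** 2 - lam ** 2) / (2 * mu)

def ternary_min(phi, lo=-5.0, hi=5.0, iters=100):
    """Crude 1-D inner solver for a unimodal function (stand-in for the
    hybrid genetic/pattern-search solver of the paper)."""
    for _ in range(iters):
        a, b = lo + (hi - lo) / 3, hi - (hi - lo) / 3
        if phi(a) < phi(b):
            hi = b
        else:
            lo = a
    return (lo + hi) / 2

f = lambda x: x * x          # objective
g = lambda x: 1.0 - x        # constraint 1 - x <= 0, i.e. x >= 1
lam, mu = 0.0, 10.0
for _ in range(20):          # outer loop: multiplier updates
    x = ternary_min(lambda t: aug_lagrangian(f, g, t, lam, mu))
    lam = max(0.0, lam + mu * g(x))   # first-order multiplier update
```

Each outer iteration solves an unconstrained subproblem; the multiplier update drives λ toward its KKT value, so the subproblem minimizers converge to the constrained solution without sending the penalty weight μ to infinity.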

    Comparative study of penalty simulated annealing methods for multiglobal programming

    In a multiglobal optimization problem we aim to find all the global solutions of a constrained nonlinear programming problem where the objective function is multimodal. This class of global optimization problems is very important and frequently encountered in engineering applications, such as process synthesis, design and control in chemical engineering. The most common method for solving this type of problem uses a local search method to refine a set of approximations, which are obtained by comparing objective function values at points of a predefined mesh. This type of method can be very expensive numerically. On the other hand, the success of local search methods depends on the starting point being in the neighbourhood of a solution. Stochastic methods are appropriate alternatives for finding global solutions, in which convergence to a global solution can be guaranteed with probability one. This is the case of the simulated annealing (SA) method. To compute the multiple solutions, a function stretching technique that transforms the objective function at each step is herein combined with SA to force, step by step, convergence to each one of the required global solutions. The constraints of the problem are handled by a penalty technique. This technique transforms the constrained problem into a sequence of unconstrained problems by penalizing the objective function when constraints are violated. Numerical experiments are shown with three penalty functions.
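    The function-stretching step can be sketched in one dimension (a simplified form using an explicit f(x) ≤ f* branch; the stretching parameters and test function are illustrative assumptions): once a global minimizer is located, points worse than it are lifted so the next annealing run is repelled from its neighbourhood, while equal-valued global minima are left untouched.

```python
import math

def stretched(f, x_star, g1=1e3, g2=1.0, mu=1e-3):
    """Function-stretching transform (1-D sketch): lift the landscape above
    the already located minimizer x_star; leave not-worse points alone."""
    f_star = f(x_star)
    def H(x):
        if f(x) <= f_star:
            return f(x)                    # remaining global minima untouched
        G = f(x) + g1 * abs(x - x_star)    # first stretching stage
        # second stage: steep repulsive wall around x_star
        return G + g2 / math.tanh(mu * (G - f_star))
    return H

f = lambda x: (x * x - 1.0) ** 2   # two global minima, at x = -1 and x = 1
H = stretched(f, 1.0)              # suppose SA has already found x = 1
```

After the transform, the basin around the found minimizer x = 1 is walled off (H is large there), while H(-1) still equals f(-1) = 0, so a subsequent SA run is steered toward the remaining global solution.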

    Hybridizing the electromagnetism-like algorithm with descent search for solving engineering design problems

    In this paper, we present a new stochastic hybrid technique for constrained global optimization. It is a combination of the electromagnetism-like (EM) mechanism with a random local search, which is a derivative-free procedure with high ability of producing a descent direction. Since the original EM algorithm is specifically designed for solving bound constrained problems, the approach herein adopted for handling the inequality constraints of the problem relies on selective conditions that impose a sufficient reduction either in the constraint violation or in the objective function value, when comparing two points at a time. The hybrid EM method is tested on a set of benchmark engineering design problems and the numerical results demonstrate the effectiveness of the proposed approach. A comparison with results from other stochastic methods is also included.
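    The selective acceptance condition might take the following form (an assumed formalisation for illustration only, not the paper's exact test; the tolerance theta is a made-up parameter): a trial point replaces the current one only on a sufficient decrease of the constraint violation or, at no worse violation, of the objective.

```python
def better(trial, current, theta=1e-4):
    """Selective point-comparison rule (sketch): trial (f_t, h_t) replaces
    current (f_c, h_c) only with a sufficient reduction in the constraint
    violation h or, at no worse violation, in the objective f."""
    f_t, h_t = trial
    f_c, h_c = current
    if h_t < (1 - theta) * h_c:                        # violation decrease
        return True
    if h_t <= h_c and f_t < f_c - theta * abs(f_c):    # objective decrease
        return True
    return False

accept_feasible = better((1.0, 0.0), (2.0, 0.0))   # lower objective, feasible
accept_worse = better((0.5, 1.0), (1.0, 0.0))      # feasibility lost
```

Requiring a *sufficient* (not merely strict) reduction is what gives such comparison rules their convergence leverage: it rules out infinite sequences of vanishingly small improvements.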

    Consensus-based optimization and ensemble Kalman inversion for global optimization problems with constraints

    We introduce a practical method for incorporating equality and inequality constraints in global optimization methods based on stochastic interacting particle systems, specifically consensus-based optimization (CBO) and ensemble Kalman inversion (EKI). Unlike other approaches in the literature, the method we propose does not constrain the dynamics to the feasible region of the state space at all times; the particles evolve in the full space, but are attracted towards the feasible set by means of a penalization term added to the objective function and, in the case of CBO, an additional relaxation drift. We study the properties of the method through the associated mean-field Fokker–Planck equation and demonstrate its performance in numerical experiments on several test problems.
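    A minimal sketch of penalized consensus-based optimization (anisotropic noise, Euler-Maruyama discretisation, quadratic penalty; the relaxation drift mentioned in the abstract is omitted, and all parameters and the test problem are illustrative): particles drift toward a Gibbs-weighted consensus point and diffuse in proportion to their distance from it, while the penalty pulls the swarm toward the feasible set.

```python
import math
import random

def cbo_minimize(F, dim=2, n=100, steps=300, dt=0.1, lam=1.0, sigma=0.5,
                 alpha=30.0, seed=0):
    """Plain anisotropic CBO: drift to the weighted consensus, plus noise
    scaled by each coordinate's distance from it."""
    rng = random.Random(seed)
    X = [[rng.uniform(-3.0, 3.0) for _ in range(dim)] for _ in range(n)]
    cons = list(X[0])
    for _ in range(steps):
        vals = [F(x) for x in X]
        m = min(vals)
        # Gibbs weights, shifted by the current minimum for numerical stability
        w = [math.exp(-alpha * (v - m)) for v in vals]
        W = sum(w)
        cons = [sum(wi * x[d] for wi, x in zip(w, X)) / W
                for d in range(dim)]
        for x in X:
            for d in range(dim):
                diff = x[d] - cons[d]
                x[d] += (-lam * diff * dt
                         + sigma * abs(diff) * math.sqrt(dt)
                         * rng.gauss(0.0, 1.0))
    return cons

# minimize ||x||^2 subject to x[0] + x[1] = 1, with the equality constraint
# handled by a quadratic penalty added to the objective
F = lambda x: x[0] ** 2 + x[1] ** 2 + 100.0 * (x[0] + x[1] - 1.0) ** 2
sol = cbo_minimize(F)
```

The particles are never projected onto the constraint; the penalty term alone makes consensus states that violate x[0] + x[1] = 1 expensive, so the swarm concentrates near the feasible minimizer (0.5, 0.5).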