
    Particle algorithms for optimization on binary spaces

    We discuss a unified approach to stochastic optimization of pseudo-Boolean objective functions based on particle methods, including the cross-entropy method and simulated annealing as special cases. We point out the need for auxiliary sampling distributions, that is, parametric families on binary spaces that can reproduce complex dependency structures, and illustrate their usefulness in our numerical experiments. We provide numerical evidence that particle-driven optimization algorithms based on such parametric families yield superior results on strongly multi-modal optimization problems, while local search heuristics outperform them on easier problems.
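
    As a rough illustration of the particle-based approach this abstract describes, here is a minimal sketch of the cross-entropy method on {0,1}^d using an independent Bernoulli (product) sampling family; all function and parameter names are our own. A product family is exactly the kind of simple sampling distribution the abstract argues is insufficient for strongly multi-modal problems, since it cannot reproduce dependencies between coordinates.

```python
import numpy as np

def cross_entropy_binary(objective, d, n_samples=200, elite_frac=0.1,
                         smoothing=0.7, n_iters=50, seed=None):
    """Cross-entropy maximization on {0,1}^d with an independent
    Bernoulli (product) sampling family."""
    rng = np.random.default_rng(seed)
    p = np.full(d, 0.5)                      # sampling-distribution parameters
    n_elite = max(1, int(elite_frac * n_samples))
    best_x, best_f = None, -np.inf
    for _ in range(n_iters):
        # draw a population of particles from the current distribution
        X = (rng.random((n_samples, d)) < p).astype(int)
        f = np.array([objective(x) for x in X])
        if f.max() > best_f:
            best_f, best_x = f.max(), X[f.argmax()].copy()
        elite = X[np.argsort(f)[-n_elite:]]  # best-scoring particles
        # smoothed update of the Bernoulli parameters toward the elite mean
        p = smoothing * elite.mean(axis=0) + (1 - smoothing) * p
    return best_x, best_f

# Toy pseudo-Boolean objective: a random quadratic form on {0,1}^20.
rng = np.random.default_rng(0)
Q = rng.standard_normal((20, 20))
x_best, f_best = cross_entropy_binary(lambda z: float(z @ Q @ z), d=20, seed=1)
print(x_best, f_best)
```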

    A Bayesian approach to constrained single- and multi-objective optimization

    This article addresses the problem of derivative-free (single- or multi-objective) optimization subject to multiple inequality constraints. Both the objective and constraint functions are assumed to be smooth, non-linear and expensive to evaluate. As a consequence, the number of evaluations that can be used to carry out the optimization is very limited, as in complex industrial design optimization problems. The method we propose to overcome this difficulty has its roots in both the Bayesian and the multi-objective optimization literatures. More specifically, an extended domination rule is used to handle objectives and constraints in a unified way, and a corresponding expected hyper-volume improvement sampling criterion is proposed. This new criterion is naturally adapted to the search for a feasible point when none is available, and reduces to existing Bayesian sampling criteria, namely the classical Expected Improvement (EI) criterion and some of its constrained/multi-objective extensions, as soon as at least one feasible point is available. The calculation and optimization of the criterion are performed using Sequential Monte Carlo techniques. In particular, an algorithm similar to the subset simulation method, which is well known in the field of structural reliability, is used to estimate the criterion. The method, which we call BMOO (for Bayesian Multi-Objective Optimization), is compared to state-of-the-art algorithms for single- and multi-objective constrained optimization.
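
    For reference, the classical Expected Improvement criterion that the proposed criterion reduces to (once a feasible point is available) has a closed form under a Gaussian posterior. A minimal sketch, assuming a minimization setting in which mu and sigma are the posterior mean and standard deviation from a fitted surrogate model; the function name is ours:

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_best):
    """Classical EI for minimization: EI(x) = E[max(f_best - F(x), 0)],
    where F(x) ~ N(mu, sigma^2) is the surrogate posterior at x."""
    sigma = np.maximum(sigma, 1e-12)   # guard against zero posterior variance
    z = (f_best - mu) / sigma
    return (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

# Posterior means and standard deviations at three candidates (made-up numbers).
mu = np.array([0.2, -0.1, 0.4])
sigma = np.array([0.5, 0.05, 1.0])
print(expected_improvement(mu, sigma, f_best=0.0))
```

    The article's constrained multi-objective criterion (expected hyper-volume improvement under the extended domination rule) is instead estimated with Sequential Monte Carlo rather than computed in closed form, as the abstract notes.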

    The ADS general-purpose optimization program

    The mathematical statement of the general nonlinear optimization problem is given as follows: find the vector of design variables, X, that will minimize F(X) subject to

        G_j(X) ≤ 0,              j = 1, ..., m
        H_k(X) = 0,              k = 1, ..., l
        X_i^L ≤ X_i ≤ X_i^U,     i = 1, ..., N

    The vector of design variables, X, includes all those variables which may be changed by the ADS program in order to arrive at the optimum design. The objective function F(X) to be minimized may be weight, cost or some other performance measure. If the objective is to be maximized, this is accomplished by minimizing -F(X). The inequality constraints G_j(X) include limits on stress, deformation, aeroelastic response or controllability, as examples, and may be nonlinear implicit functions of the design variables X. The equality constraints H_k(X) represent conditions that must be satisfied precisely for the design to be acceptable. Equality constraints are not fully operational in version 1.0 of the ADS program, although they are available in the Augmented Lagrange Multiplier method. The side constraints given by the last equation are used to directly limit the region of search for the optimum. The ADS program will never consider a design which is not within these limits.
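
    ADS itself is a Fortran program, but the standard form above maps directly onto modern NLP solvers. As an illustration only (not ADS code), the same structure of objective, inequality constraints G_j(X) ≤ 0, equality constraints H_k(X) = 0, and side constraints can be expressed with SciPy's SLSQP solver; the toy objective and constraints below are our own. Note that SciPy's convention for inequality constraints is fun(X) ≥ 0, the opposite sign of the formulation above, hence the negation.

```python
import numpy as np
from scipy.optimize import minimize

# Illustration only: the standard NLP form
#   minimize F(X)  s.t.  G_j(X) <= 0,  H_k(X) = 0,  X^L <= X <= X^U
# expressed for SciPy's SLSQP solver, with a toy problem of our own choosing.
F = lambda X: (X[0] - 1) ** 2 + (X[1] - 2) ** 2   # objective to minimize
G = lambda X: X[0] + X[1] - 3                     # inequality: G(X) <= 0
H = lambda X: X[0] - 2 * X[1]                     # equality:   H(X) = 0

res = minimize(
    F, x0=np.array([0.5, 0.5]),
    method="SLSQP",
    bounds=[(0.0, 5.0), (0.0, 5.0)],              # side constraints X^L, X^U
    constraints=[
        {"type": "ineq", "fun": lambda X: -G(X)}, # SciPy expects fun(X) >= 0
        {"type": "eq", "fun": H},
    ],
)
print(res.x, res.fun)   # optimum at roughly X = (1.6, 0.8)
```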