33,197 research outputs found

    A clustering heuristic to improve a derivative-free algorithm for nonsmooth optimization

    In this paper we propose a heuristic to improve the performance of CS-DFN, a recently proposed derivative-free method for nonsmooth optimization. The heuristic is based on a clustering-type technique to compute an estimate of Clarke's generalized gradient of the objective function, obtained by calculating the (approximate) directional derivative along a certain set of directions. A search direction is then calculated by applying a nonsmooth Newton-type approach. As the numerical experiments show, this direction is a good descent direction for the objective function. We report numerical results and a comparison with the original CS-DFN method on a set of well-known test problems to show the utility of the proposed improvement.
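The full method builds a Newton-type step from the clustered subgradient estimates; as a rough illustration of the first ingredient only, here is a minimal sketch (with a hypothetical nonsmooth objective and direction set, not the paper's algorithm) of estimating directional derivatives by finite differences and selecting the steepest sampled direction:

```python
def approx_directional_derivative(f, x, d, t=1e-6):
    """Forward-difference estimate of the directional derivative f'(x; d)."""
    return (f([xi + t * di for xi, di in zip(x, d)]) - f(x)) / t

def best_sampled_direction(f, x, directions):
    """Among the sampled directions, pick the one with the most negative
    approximate directional derivative (a toy stand-in for the paper's
    Newton-type step built from clustered subgradient estimates)."""
    derivs = [(approx_directional_derivative(f, x, d), d) for d in directions]
    return min(derivs, key=lambda p: p[0])

# Hypothetical nonsmooth test function: f(x) = |x1| + |x2|
f = lambda x: abs(x[0]) + abs(x[1])
x = [1.0, -2.0]
dirs = [(1, 0), (-1, 0), (0, 1), (0, -1), (0.7071, 0.7071), (-0.7071, 0.7071)]
slope, d = best_sampled_direction(f, x, dirs)
```

At this point, moving from x along d decreases both absolute-value terms at once, which a single coordinate direction cannot do.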

    A derivative-free filter driven multistart technique for global optimization

    A stochastic global optimization method based on a multistart strategy and a derivative-free filter local search for general constrained optimization is presented and analyzed. In the local search procedure, approximate descent directions for the constraint violation or the objective function are used to progress towards the optimal solution. The algorithm is able to locate all the local minima and, consequently, the global minimum of a multi-modal objective function. The performance of the multistart method is analyzed on a set of benchmark problems, and a comparison is made with other methods. This work was financed by FEDER funds through COMPETE-Programa Operacional Fatores de Competitividade and by Portuguese funds through FCT-Fundação para a Ciência e a Tecnologia within projects PEst-C/MAT/UI0013/2011 and FCOMP-01-0124-FEDER-022674.
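The multistart idea itself is simple: launch derivative-free local searches from random points and keep every distinct minimizer. A minimal sketch, assuming a plain compass search in place of the paper's filter local search and a hypothetical two-well test function:

```python
import random

def compass_search(f, x0, step=0.5, tol=1e-6):
    """Simple derivative-free local search: try coordinate moves and shrink
    the step when none improves (a stand-in for the filter local search)."""
    x = list(x0)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for s in (+step, -step):
                y = list(x)
                y[i] += s
                if f(y) < f(x):
                    x, improved = y, True
        if not improved:
            step *= 0.5
    return x

def multistart(f, bounds, n_starts=30, seed=0):
    """Launch local searches from random starting points; return the best."""
    rng = random.Random(seed)
    best = None
    for _ in range(n_starts):
        x0 = [rng.uniform(lo, hi) for lo, hi in bounds]
        x = compass_search(f, x0)
        if best is None or f(x) < f(best):
            best = x
    return best

# Hypothetical multimodal test: f(x) = (x^2 - 1)^2 has minima at x = -1, +1
f = lambda x: (x[0] ** 2 - 1) ** 2
x_star = multistart(f, [(-2.0, 2.0)])
```

With enough random starts inside the bounds, each basin of attraction is hit and the global minimum value 0 is recovered.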

    Comparison between MGDA and PAES for Multi-Objective Optimization

    In multi-objective optimization, knowledge of the Pareto set provides valuable information on the reachable optimal performance. A number of evolutionary strategies (PAES, NSGA-II, etc.) have been proposed in the literature and proved successful in identifying the Pareto set. However, these derivative-free algorithms are very demanding in terms of computational time. Today, in many areas of computational science, codes are developed that include the calculation of the gradient, carefully validated and calibrated. Thus, an alternative method applicable when the gradients are known is introduced here. Using a clever combination of the gradients, a descent direction common to all criteria is identified. As a natural outcome, the Multiple Gradient Descent Algorithm (MGDA) is defined as a generalization of the steepest-descent method and compared with PAES by numerical experiments.
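In the two-objective case, the common descent direction can be written in closed form: it is the negative of the minimum-norm element of the segment joining the two gradients. A minimal sketch of this special case (the general MGDA handles any number of gradients via a quadratic program):

```python
def mgda_two(g1, g2):
    """Minimum-norm element of the segment [g1, g2]; its negative is a
    descent direction common to both criteria (two-objective MGDA case)."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    diff = [a - b for a, b in zip(g1, g2)]
    denom = dot(diff, diff)
    # Minimizer of ||(1 - t) g1 + t g2||^2 over t in [0, 1].
    t = 0.0 if denom == 0 else min(1.0, max(0.0, dot(g1, diff) / denom))
    omega = [(1 - t) * a + t * b for a, b in zip(g1, g2)]
    return [-w for w in omega]

# Two conflicting gradients: the returned direction is non-ascent for both.
g1, g2 = [1.0, 0.0], [0.0, 1.0]
d = mgda_two(g1, g2)
```

Here d = [-0.5, -0.5]: its inner product with each gradient is negative, so a small step along d decreases both criteria simultaneously.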

    Comparison of Shape Derivatives Using CutFEM for Ill-posed Bernoulli Free Boundary Problem

    In this paper we study and compare three types of shape derivatives for free boundary identification problems. The problem takes the form of a severely ill-posed Bernoulli problem in which only the Dirichlet condition is given on the free (unknown) boundary, whereas both Dirichlet and Neumann conditions are available on the fixed (known) boundary. Our framework resembles the classical shape optimization method, in which a shape-dependent cost functional is minimized over the set of admissible domains. The position of the domain is defined implicitly by a level set function. The steepest descent method, based on the shape derivative, is applied for the level set evolution. For the numerical computation of the gradient, we apply the Cut Finite Element Method (CutFEM), which circumvents meshing and re-meshing without loss of accuracy in the approximation of the underlying partial differential equation models. We consider three different shape derivatives. The first is the classical shape derivative, based on the cost functional with PDE constraints defined on the continuous level. The second is similar, but uses a discretized cost functional that allows CutFEM formulations to be embedded directly. Different from the first two, the third shape derivative is based on a discrete formulation in which perturbations of the domain are built into the variational formulation on the unperturbed domain. This is realized using the so-called boundary value correction method, originally introduced to allow high-order approximations to be attained with a low-order approximation of the domain. The theoretical discussion is illustrated with a series of numerical examples showing that all three approaches produce similar results on the proposed Bernoulli problem.
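The level-set evolution underlying the steepest descent step follows the Hamilton-Jacobi equation phi_t + V |grad phi| = 0, where V is the normal speed supplied by the shape derivative. A deliberately minimal one-dimensional radial sketch (not the CutFEM discretization, and with a hypothetical constant speed in place of the computed gradient) showing a circle's free boundary shrinking under the update:

```python
def level_set_shrink(r0=1.0, speed=-1.0, dt=0.01, steps=30):
    """Evolve phi(r) = r - R(t) under phi_t + V * |grad phi| = 0.
    For a signed-distance function |grad phi| = 1, so R(t) = r0 + V * t."""
    rs = [0.01 * i for i in range(300)]       # radial grid
    phi = [r - r0 for r in rs]                # signed distance to the circle
    for _ in range(steps):
        grad_norm = 1.0                       # |grad phi| for signed distance
        phi = [p - dt * speed * grad_norm for p in phi]
    # Recover the free boundary as the zero crossing of phi.
    return next(r for r, p in zip(rs, phi) if p >= 0)
```

With V = -1 for 30 steps of size 0.01, the zero level set moves inward from radius 1.0 to approximately 0.7, matching R(t) = r0 + V t.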

    Hybridizing the electromagnetism-like algorithm with descent search for solving engineering design problems

    In this paper, we present a new stochastic hybrid technique for constrained global optimization. It combines the electromagnetism-like (EM) mechanism with a random local search, a derivative-free procedure with a high ability to produce a descent direction. Since the original EM algorithm is specifically designed for solving bound-constrained problems, the approach adopted here for handling the inequality constraints of the problem relies on selective conditions that impose a sufficient reduction either in the constraint violation or in the objective function value when comparing two points at a time. The hybrid EM method is tested on a set of benchmark engineering design problems, and the numerical results demonstrate the effectiveness of the proposed approach. A comparison with results from other stochastic methods is also included.
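The pairwise comparison rule described above can be sketched as follows; the violation measure, the tolerance gamma, and the exact acceptance conditions are assumptions for illustration, not the paper's precise conditions:

```python
def violation(g_values):
    """Total violation of inequality constraints written as g_i(x) <= 0."""
    return sum(max(0.0, g) for g in g_values)

def better(p, q, gamma=1e-4):
    """Accept p over q if it sufficiently reduces either the constraint
    violation or, at no-worse feasibility, the objective value.
    Points are (objective_value, [g1, g2, ...]) pairs."""
    fp, vp = p[0], violation(p[1])
    fq, vq = q[0], violation(q[1])
    if vp < (1 - gamma) * vq:                      # reduces violation enough
        return True
    if vp <= vq and fp < fq - gamma * abs(fq):     # reduces objective enough
        return True
    return False

# Hypothetical comparisons: feasibility takes priority over the objective.
feasible_good = (1.0, [-0.5])   # feasible, low objective
feasible_bad  = (2.0, [-0.1])   # feasible, higher objective
infeasible    = (0.5, [0.3])    # better objective, but violates g <= 0
```

The rule prefers the feasible point with the lower objective, and any feasible point over an infeasible one, even when the infeasible point has a smaller objective value.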

    Comparison between two multi-objective optimization algorithms: PAES and MGDA. Testing MGDA on Kriging metamodels

    Book dedicated to Professor P. Neittaanmaki on his 60th birthday. International audience. In multi-objective optimization, knowledge of the Pareto set provides valuable information on the reachable optimal performance. A number of evolutionary strategies (PAES [4], NSGA-II [3], etc.) have been proposed in the literature and proved successful in identifying the Pareto set. However, these derivative-free algorithms are very demanding in computational time. Today, in many areas of computational science, codes are developed that include the calculation of the gradient, carefully validated and calibrated. Thus, an alternative method applicable when the gradients are known is introduced here. Using a clever combination of the gradients, a descent direction common to all criteria is identified. As a natural outcome, the Multiple Gradient Descent Algorithm (MGDA) is defined as a generalization of the steepest-descent method and compared with PAES by numerical experiments. Using MGDA on a multi-objective optimization problem requires the evaluation of a large number of points with respect to the criteria and their gradients. In the particular case of CFD problems, each point evaluation is very costly. Thus, here we also propose to construct metamodels and to calculate approximate gradients by local finite differences.
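The approximate gradients on the metamodel can be obtained by central finite differences, as this minimal sketch shows; the quadratic surrogate below is a hypothetical stand-in for a Kriging predictor:

```python
def fd_gradient(f, x, h=1e-5):
    """Central finite-difference approximation of the gradient of f at x,
    as one might use on a cheap metamodel in place of exact gradients."""
    grad = []
    for i in range(len(x)):
        xp = list(x); xp[i] += h
        xm = list(x); xm[i] -= h
        grad.append((f(xp) - f(xm)) / (2 * h))
    return grad

# Hypothetical surrogate (stand-in for a Kriging model): f(x) = x1^2 + 3*x2
f = lambda x: x[0] ** 2 + 3 * x[1]
g = fd_gradient(f, [2.0, 1.0])
```

Central differences are second-order accurate, so on this quadratic surrogate the result matches the exact gradient (4, 3) up to rounding; each gradient costs 2n surrogate evaluations, which is cheap precisely because the metamodel replaces the costly CFD solve.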

    The q-gradient method for global optimization

    The q-gradient is an extension of the classical gradient vector based on the concept of Jackson's derivative. Here we introduce a preliminary version of the q-gradient method for unconstrained global optimization. The main idea behind our approach is to use the negative of the q-gradient of the objective function as the search direction. In this sense, the method proposed here is a generalization of the well-known steepest descent method. The use of Jackson's derivative has been shown to be an effective mechanism for escaping from local minima. The q-gradient method is complemented with strategies to generate the parameter q and to compute the step length so that the search process gradually shifts from global in the beginning to almost local in the end. To test this new approach, we considered six commonly used test functions and compared our results with three Genetic Algorithms (GAs) considered effective at optimizing multidimensional unimodal and multimodal functions. For the multimodal test functions, the q-gradient method outperformed the GAs, reaching the minimum with better accuracy and fewer function evaluations. Comment: 12 pages, 1 figure
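Jackson's q-derivative replaces the usual difference quotient with D_q f(x) = (f(qx) - f(x)) / ((q - 1) x) for x != 0, recovering the classical derivative as q -> 1. A minimal sketch of a single q-gradient descent step; the componentwise helper, the fixed q, and the fixed step length are illustrative assumptions (the paper's scheme adapts both over the iterations):

```python
def q_derivative(f, x, q):
    """Jackson's q-derivative: (f(q*x) - f(x)) / ((q - 1) * x), x != 0."""
    return (f(q * x) - f(x)) / ((q - 1) * x)

def q_gradient(f, x, q):
    """Componentwise q-gradient of a multivariate f (hypothetical helper;
    the method also generates q and the step length adaptively)."""
    grad = []
    for i in range(len(x)):
        fi = lambda t, i=i: f(x[:i] + [t] + x[i + 1:])
        grad.append(q_derivative(fi, x[i], q))
    return grad

# One step along the negative q-gradient on a hypothetical quadratic.
f = lambda x: x[0] ** 2 + x[1] ** 2
x0 = [3.0, -2.0]
g = q_gradient(f, x0, q=1.5)
x1 = [xi - 0.1 * gi for xi, gi in zip(x0, g)]
```

For f(t) = t^2 the q-derivative is (q + 1) t rather than 2t, so with q far from 1 the search direction is deliberately biased away from the local steepest descent, which is the mechanism credited with escaping local minima.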