429 research outputs found

    Extensions to the MADS direct search algorithm for nonsmooth optimization

    Literature review of direct search methods for nonsmooth optimization -- Approach and organization of the thesis -- Nonsmooth optimization through mesh adaptive direct search and variable neighborhood search -- Parallel space decomposition of the mesh adaptive direct search algorithm -- OrthoMADS: a deterministic MADS instance with orthogonal directions
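
    The chapters above all build on direct search methods of the MADS family. As a rough illustration only (not the MADS algorithm itself, which polls along an asymptotically dense set of directions on an underlying mesh), the basic poll-and-refine loop common to these methods can be sketched as:

    ```python
    import numpy as np

    def poll_step(f, x, delta):
        """One poll: evaluate f at x +/- delta*e_i for each coordinate
        direction and return the best point found (x itself if none improves)."""
        best_x, best_f = x, f(x)
        for i in range(len(x)):
            for sign in (+1.0, -1.0):
                trial = x.copy()
                trial[i] += sign * delta
                ft = f(trial)
                if ft < best_f:
                    best_x, best_f = trial, ft
        return best_x, best_f

    def direct_search(f, x0, delta=1.0, tol=1e-6, max_iter=1000):
        """Keep the step size after a successful poll; halve it after a
        failed one, as in pattern/direct search frameworks."""
        x = np.asarray(x0, dtype=float)
        fx = f(x)
        for _ in range(max_iter):
            if delta < tol:
                break
            new_x, new_f = poll_step(f, x, delta)
            if new_f < fx:
                x, fx = new_x, new_f  # successful poll: keep the step size
            else:
                delta *= 0.5          # failed poll: refine
        return x, fx
    ```

    On a smooth quadratic this converges to the minimizer; the point of MADS over this coordinate-only sketch is that richer, dense direction sets yield Clarke-stationarity guarantees on nonsmooth problems.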

    Derivative-free methods for mixed-integer nonsmooth constrained optimization

    In this paper, we consider mixed-integer nonsmooth constrained optimization problems whose objective/constraint functions are available only as the output of a black-box zeroth-order oracle (i.e., an oracle that does not provide derivative information), and we propose a new derivative-free linesearch-based algorithmic framework to handle such problems. We first describe a scheme for bound-constrained problems that combines a dense sequence of directions (to handle the nonsmoothness of the objective function) with primitive directions (to handle the discrete variables). We then embed an exact penalty approach in the scheme to manage nonlinear (possibly nonsmooth) constraints. We analyze the global convergence properties of the proposed algorithms toward stationary points and report the results of extensive numerical experiments on a set of mixed-integer test problems.
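
    To make the role of primitive directions concrete, the following is a minimal, hypothetical sketch of a discrete linesearch over the coordinate directions +/- e_i for purely integer variables. It is a simplification for illustration, not the paper's actual framework (which also uses a dense direction sequence for continuous variables and an exact penalty for constraints):

    ```python
    import numpy as np

    def discrete_linesearch(f, x, d, f_x, max_step=8):
        """Expand an integer stepsize along direction d while the objective
        keeps decreasing (a simplified expansion-style linesearch)."""
        step = 1
        best_step, best_f = 0, f_x
        while step <= max_step:
            ft = f(x + step * d)
            if ft < best_f:
                best_step, best_f = step, ft
                step *= 2  # expansion phase: try a doubled stepsize
            else:
                break
        return best_step, best_f

    def mixed_integer_search(f, x0, max_iter=100):
        """Cycle through the primitive coordinate directions +/- e_i,
        running a discrete linesearch along each until no direction improves."""
        x = np.array(x0, dtype=int)
        fx = f(x)
        n = len(x)
        for _ in range(max_iter):
            improved = False
            for i in range(n):
                for sign in (1, -1):
                    d = np.zeros(n, dtype=int)
                    d[i] = sign
                    step, ft = discrete_linesearch(f, x, d, fx)
                    if step > 0:
                        x, fx = x + step * d, ft
                        improved = True
            if not improved:
                break
        return x, fx
    ```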

    Optimal PMU Placement for Power System Dynamic State Estimation by Using Empirical Observability Gramian

    In this paper, the empirical observability Gramian calculated around the operating region of a power system is used to quantify the degree of observability of the system states under a specific phasor measurement unit (PMU) placement. An optimal PMU placement method for power system dynamic state estimation is then formulated as an optimization problem that maximizes the determinant of the empirical observability Gramian; it is efficiently solved by the NOMAD solver, which implements the Mesh Adaptive Direct Search (MADS) algorithm. The implementation and validation of the proposed method, as well as its robustness to load fluctuations and contingencies, are discussed in detail. The method is tested on the WSCC 3-machine 9-bus system and the NPCC 48-machine 140-bus system by performing dynamic state estimation with a square-root unscented Kalman filter. The simulation results show that the optimal PMU placements determined by the proposed method guarantee good observability of the system states, which in turn leads to smaller estimation errors and a larger number of convergent states than random PMU placements. Under optimal PMU placements an obvious observability transition can be observed. The proposed method is also shown to be robust to both load fluctuations and contingencies.
    Comment: Accepted by IEEE Transactions on Power Systems
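
    The empirical observability Gramian used here has a standard construction: perturb each initial state by +/- eps, simulate the measured outputs, and accumulate outer products of the output differences. A minimal NumPy sketch, assuming a user-supplied `simulate(x)` that returns the output trajectory as a (T, p) array (the function name and interface are illustrative):

    ```python
    import numpy as np

    def empirical_obs_gramian(simulate, x0, eps=1e-3):
        """Empirical observability Gramian about the operating point x0.
        dY[t] stacks the columns (y^{+i}(t) - y^{-i}(t)), where y^{+/-i} is
        the output trajectory with the i-th initial state perturbed by +/- eps;
        the Gramian is (1 / 4 eps^2) * sum_t dY[t]^T dY[t]."""
        x0 = np.asarray(x0, dtype=float)
        n = len(x0)
        # Simulate each +/- eps perturbation of the initial state.
        Yp = [simulate(x0 + eps * np.eye(n)[i]) for i in range(n)]
        Ym = [simulate(x0 - eps * np.eye(n)[i]) for i in range(n)]
        T, p = Yp[0].shape
        W = np.zeros((n, n))
        for t in range(T):
            dY = np.column_stack([Yp[i][t] - Ym[i][t] for i in range(n)])  # p x n
            W += dY.T @ dY
        return W / (4.0 * eps ** 2)
    ```

    For a linear system x_{t+1} = A x_t, y_t = C x_t this reduces exactly to the discrete observability Gramian sum_t (C A^t)^T (C A^t); its determinant (or other scalarizations) is what the PMU placement problem then maximizes over candidate measurement sets.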

    Parallel Space Decomposition of the Mesh Adaptive Direct Search Algorithm

    This paper describes a Parallel Space Decomposition (PSD) technique for the Mesh Adaptive Direct Search (MADS) algorithm. MADS extends Generalized Pattern Search to constrained nonsmooth optimization problems. The objective here is to solve larger problems more efficiently. The new method (PSD-MADS) is an asynchronous parallel algorithm in which the processes solve subproblems over subsets of the variables. The convergence analysis, based on the Clarke calculus, is essentially the same as for the MADS algorithm. A practical implementation is described, and numerical results on problems with up to 500 variables illustrate the advantages and limitations of PSD-MADS.
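
    The core idea of the decomposition, stripped of parallelism, is to repeatedly optimize over small random subsets of the variables while the rest stay fixed. The sketch below is a sequential, coordinate-poll caricature under that assumption; the actual PSD-MADS runs the subproblems asynchronously on separate processes, each with a full MADS instance:

    ```python
    import numpy as np

    def poll_on_subset(f, x, idx, delta):
        """Poll +/- delta along each coordinate in the subset idx only,
        keeping the remaining variables fixed."""
        best_x, best_f = x, f(x)
        for i in idx:
            for sign in (1.0, -1.0):
                trial = x.copy()
                trial[i] += sign * delta
                ft = f(trial)
                if ft < best_f:
                    best_x, best_f = trial, ft
        return best_x, best_f

    def psd_search(f, x0, subset_size=2, delta=1.0, tol=1e-6, seed=0):
        """Draw a random variable subset each iteration; halve the step
        size whenever the chosen subset yields no improvement."""
        rng = np.random.default_rng(seed)
        x = np.asarray(x0, dtype=float)
        fx = f(x)
        while delta > tol:
            idx = rng.choice(len(x), size=subset_size, replace=False)
            new_x, new_f = poll_on_subset(f, x, idx, delta)
            if new_f < fx:
                x, fx = new_x, new_f
            else:
                delta *= 0.5
        return x, fx
    ```

    Working on subsets keeps each subproblem cheap regardless of the full dimension, which is what lets the approach scale to the 500-variable problems reported in the paper.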

    Quantifying uncertainty with ensembles of surrogates for blackbox optimization

    This work is in the context of blackbox optimization, where the functions defining the problem are expensive to evaluate and no derivatives are available. A tried and tested technique is to build surrogates of the objective and the constraints in order to conduct the optimization at a cheaper computational cost. This work proposes different uncertainty measures for ensembles of surrogates. The resulting combination of an ensemble of surrogates with our measures behaves as a stochastic model and allows the use of efficient Bayesian optimization tools. The method is incorporated in the search step of the mesh adaptive direct search (MADS) algorithm to improve the exploration of the search space. Computational experiments are conducted on seven analytical problems, two multidisciplinary optimization problems, and two simulation problems. The results show that the proposed approach solves expensive simulation-based problems with greater precision and at a lower computational cost than stochastic models.
    Comment: 36 pages, 11 figures, submitted
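
    The basic mechanism, fitting several cheap models and reading the spread across them as an uncertainty signal, can be sketched as follows. This toy version uses polynomial least-squares surrogates and the ensemble standard deviation as the uncertainty measure; the paper studies richer model types and several measures, so treat the names and choices here as illustrative assumptions:

    ```python
    import numpy as np

    def fit_ensemble(X, y, degrees=(1, 2, 3)):
        """Fit one polynomial least-squares surrogate per degree:
        the collection of fitted models is the 'ensemble'."""
        return [np.polyfit(X, y, d) for d in degrees]

    def predict_with_uncertainty(models, x):
        """Ensemble mean as the surrogate prediction; the standard deviation
        across members as a simple uncertainty measure."""
        preds = np.array([np.polyval(m, x) for m in models])
        return preds.mean(axis=0), preds.std(axis=0)
    ```

    A (mean, uncertainty) pair like this is exactly what Bayesian-optimization acquisition functions consume, which is how the ensemble can stand in for a stochastic model inside the MADS search step.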

    Contributions to the development of an integrated toolbox of solvers in Derivative-Free Optimization

    This dissertation is framed within the ongoing research project BoostDFO - Improving the performance and moving to newer dimensions in Derivative-Free Optimization. The final goal of this project is to develop efficient and robust algorithms for global and/or multiobjective derivative-free optimization. This type of optimization is typically required in complex scientific/industrial applications, where function evaluations are time-consuming and derivatives are not available for use, nor can they be numerically approximated. Problems often present several conflicting objectives, or users aspire to obtain global solutions. Inspired by successful approaches used in single-objective local derivative-free optimization, we address the inherent problem of huge execution times by resorting to parallel/cloud computing and carrying out a detailed performance analysis. As a result, an integrated toolbox for solving single/multi-objective, local/global derivative-free optimization problems is made available, with recommendations for taking advantage of parallelization and cloud computing, providing easy access to several efficient and robust algorithms and making it possible to tackle harder derivative-free optimization problems.