    Active Bayesian Optimization: Minimizing Minimizer Entropy

    The ultimate goal of optimization is to find the minimizer of a target function. However, typical criteria for active optimization often ignore the uncertainty about the minimizer. We propose a novel criterion for global optimization and an associated sequential active learning strategy using Gaussian processes. Our criterion is the reduction of uncertainty in the posterior distribution of the function minimizer. It can also flexibly incorporate multiple global minimizers. We implement a tractable approximation of the criterion and demonstrate that it identifies the global minimizer more accurately than conventional Bayesian optimization criteria.
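
    A minimal sketch of the idea, assuming a grid-discretized search space and a scikit-learn Gaussian process surrogate (both assumptions for illustration, not the authors' implementation): the posterior over the minimizer location is approximated by sampling functions from the GP and histogramming their argmins, and the entropy of that histogram is the uncertainty the criterion seeks to reduce.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def minimizer_entropy(gp, candidates, n_samples=500, seed=0):
            """Entropy of p(x* = x_i | data), estimated by sampling GP posterior
            functions on a finite candidate grid and histogramming their argmins."""
            samples = gp.sample_y(candidates, n_samples=n_samples, random_state=seed)
            argmins = np.argmin(samples, axis=0)          # minimizer index per sample
            probs = np.bincount(argmins, minlength=len(candidates)) / n_samples
            probs = probs[probs > 0]
            return -np.sum(probs * np.log(probs))

        # Toy usage: observe a few points of a 1-D function, then evaluate the entropy.
        rng = np.random.default_rng(0)
        X_train = rng.uniform(0, 1, size=(5, 1))
        y_train = np.sin(6 * X_train[:, 0]) + 0.1 * rng.normal(size=5)
        gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2)).fit(X_train, y_train)

        grid = np.linspace(0, 1, 200).reshape(-1, 1)
        print("current minimizer entropy:", minimizer_entropy(gp, grid))
        # An active strategy of this kind would pick the next evaluation point whose
        # expected observation most reduces this entropy.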

    Expected Improvement in Efficient Global Optimization Through Bootstrapped Kriging - Replaces CentER DP 2010-62

    This article uses a sequentialized experimental design to select simulation input combinations for global optimization, based on Kriging (also called Gaussian process or spatial correlation modeling); this Kriging is used to analyze the input/output data of the simulation model (computer code). This design and analysis adapt the classic "expected improvement" (EI) in "efficient global optimization" (EGO) through the introduction of an unbiased estimator of the Kriging predictor variance; this estimator uses parametric bootstrapping. Classic EI and bootstrapped EI are compared through various test functions, including the six-hump camel-back and several Hartmann functions. These empirical results demonstrate that in some applications bootstrapped EI finds the global optimum faster than classic EI does; in general, however, the classic EI may be considered to be a robust global optimizer.
    Keywords: Simulation; Optimization; Kriging; Bootstrap
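
    A minimal sketch of the classic EI acquisition that the bootstrapped variant modifies, assuming a scikit-learn GP stands in for the Kriging metamodel; the function names and the toy simulator are hypothetical. Bootstrapped EI would replace the plug-in predictive standard deviation passed as `sigma` with a parametric-bootstrap estimate of the Kriging predictor variance.

        import numpy as np
        from scipy.stats import norm
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def expected_improvement(mu, sigma, f_best):
            """Classic EI for minimization: E[max(f_best - Y, 0)] with Y ~ N(mu, sigma^2)."""
            sigma = np.maximum(sigma, 1e-12)              # guard against zero predictive variance
            z = (f_best - mu) / sigma
            return (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

        # Toy usage: fit a GP/Kriging surrogate to a few simulation outputs,
        # then rank candidate inputs by EI and propose the maximizer.
        rng = np.random.default_rng(1)
        X = rng.uniform(-1, 1, size=(6, 1))
        y = X[:, 0] ** 2 + 0.05 * rng.normal(size=6)      # cheap stand-in for the simulator
        gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5)).fit(X, y)

        grid = np.linspace(-1, 1, 200).reshape(-1, 1)
        mu, sigma = gp.predict(grid, return_std=True)
        ei = expected_improvement(mu, sigma, f_best=y.min())
        print("next input proposed by classic EI:", grid[np.argmax(ei)])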