Fully Parallel Hyperparameter Search: Reshaped Space-Filling
Space-filling designs such as scrambled-Hammersley, Latin Hypercube Sampling
and Jittered Sampling have been proposed for fully parallel hyperparameter
search, and were shown to be more effective than random or grid search. In this
paper, we show that these designs only improve over random search by a constant
factor. In contrast, we introduce a new approach based on reshaping the search
distribution, which leads to substantial gains over random search, both
theoretically and empirically. We propose two flavors of reshaping. First, when
the distribution of the optimum is known, we propose Recentering, which uses
as search distribution a modified version of that distribution, tightened
closer to the center of the domain in a dimension-dependent and
budget-dependent manner. Second, we show that in a wide range of experiments
where this distribution is unknown, a proposed Cauchy transformation, which
simultaneously has a heavier tail (for unbounded hyperparameters) and is
closer to the boundaries (for bounded hyperparameters), leads to improved
performance. Besides artificial experiments and simple real-world tests on
clustering and Sammon mappings, we validate our proposed methods on expensive
artificial intelligence tasks such as attend/infer/repeat, video next-frame
segmentation forecasting, and progressive generative adversarial networks.
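The Cauchy transformation can be pictured as a quantile reshaping of plain random search. The sketch below is our own illustration, not the paper's implementation: the function name and parameters are ours. It maps uniform samples through the Cauchy inverse CDF, which puts more mass in the tails than a Gaussian reshaping would.

```python
import math
import random

def cauchy_reshape(u, x0=0.0, gamma=1.0):
    """Map a uniform sample u in (0, 1) through the Cauchy inverse CDF.

    Compared with a Gaussian quantile transform, this places more mass
    far from the centre (heavier tails, useful for unbounded
    hyperparameters) and, after clipping, nearer the boundaries of a
    bounded domain.
    """
    return x0 + gamma * math.tan(math.pi * (u - 0.5))

# Reshaped fully parallel search: draw uniform points, reshape each coordinate.
random.seed(0)
budget, dim = 8, 2
points = [[cauchy_reshape(random.random()) for _ in range(dim)]
          for _ in range(budget)]
for p in points:
    print(["%.3f" % c for c in p])
```

Since the transform is applied coordinate-wise, it composes with any space-filling design (Latin Hypercube, scrambled Hammersley) just as easily as with i.i.d. uniform sampling.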
Noisy Optimization: Convergence with a Fixed Number of Resamplings
It is known that evolution strategies in continuous domains might not
converge in the presence of noise. It is also known that, under mild
assumptions, and using an increasing number of resamplings, one can mitigate
the effect of additive noise and recover convergence. We show new sufficient
conditions for the convergence of an evolutionary algorithm with constant
number of resamplings; in particular, we get fast rates (log-linear
convergence) provided that the variance decreases around the optimum slightly
faster than in the so-called multiplicative noise model. Keywords: noisy optimization, evolutionary algorithm, theory. Comment: EvoStar (2014).
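The constant-resampling scheme the abstract refers to can be sketched with a minimal (1+1)-ES that averages a fixed number k of noisy evaluations per candidate; the toy objective, function names, and step-adaptation constants below are our own illustration, not the paper's algorithm.

```python
import random

def noisy_sphere(x, sigma_noise=0.1):
    """Sphere objective corrupted by additive Gaussian noise."""
    return sum(c * c for c in x) + random.gauss(0.0, sigma_noise)

def averaged(f, x, k):
    """Average k resamplings: reduces the noise variance by a factor k."""
    return sum(f(x) for _ in range(k)) / k

def one_plus_one_es(f, x, k=5, step=0.5, iters=200):
    """(1+1)-ES with a *constant* number k of resamplings per evaluation."""
    fx = averaged(f, x, k)
    for _ in range(iters):
        y = [c + step * random.gauss(0.0, 1.0) for c in x]
        fy = averaged(f, y, k)
        if fy <= fx:            # accept the offspring if it looks no worse
            x, fx = y, fy
            step *= 1.5         # simplified success-based step adaptation
        else:
            step *= 0.9
    return x

random.seed(1)
x = one_plus_one_es(noisy_sphere, [2.0, -2.0])
print(x)
```

With additive noise of constant variance, a constant k eventually stalls near the optimum; the sufficient conditions in the paper concern noise whose variance decays around the optimum, where a fixed k can suffice for log-linear convergence.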
Optimal estimation for Large-Eddy Simulation of turbulence and application to the analysis of subgrid models
The tools of optimal estimation are applied to the study of subgrid models
for Large-Eddy Simulation of turbulence. The concept of optimal estimator is
introduced and its properties are analyzed in the context of applications to a
priori tests of subgrid models. Attention is focused on the Cook and Riley
model in the case of a scalar field in isotropic turbulence. Using DNS data,
the relevance of the beta assumption is estimated by computing (i) generalized
optimal estimators and (ii) the error brought by this assumption alone. Optimal
estimators are computed for the subgrid variance using various sets of
variables and various techniques (histograms and neural networks). It is shown
that optimal estimators allow a thorough exploration of models. Neural networks
prove to be relevant and very efficient in this framework, and further uses
are suggested.
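The optimal estimator of a subgrid quantity given a set of variables is, in the least-squares sense, the conditional mean. For a single variable it can be approximated by the histogram technique mentioned above; the following is an illustrative sketch on synthetic data (names and data ours), not the paper's DNS-based computation.

```python
import random

def histogram_optimal_estimator(phi, tau, n_bins=20):
    """Approximate the optimal estimator E[tau | phi] by binning phi
    and averaging tau within each bin (1-D histogram technique)."""
    lo, hi = min(phi), max(phi)
    width = (hi - lo) / n_bins or 1.0
    sums = [0.0] * n_bins
    counts = [0] * n_bins
    for p, t in zip(phi, tau):
        b = min(int((p - lo) / width), n_bins - 1)
        sums[b] += t
        counts[b] += 1
    def estimator(p):
        b = min(max(int((p - lo) / width), 0), n_bins - 1)
        return sums[b] / counts[b] if counts[b] else 0.0
    return estimator

# Synthetic a-priori test: tau = phi**2 plus irreducible noise,
# so the conditional mean near phi is close to phi**2.
random.seed(2)
phi = [random.uniform(-1.0, 1.0) for _ in range(20000)]
tau = [p * p + random.gauss(0.0, 0.05) for p in phi]
est = histogram_optimal_estimator(phi, tau)
print(est(0.5))  # conditional mean near phi = 0.5
```

The irreducible part of the error, i.e. the spread of tau around the conditional mean, is exactly the quantity that separates "the assumption is wrong" from "no model based on these variables could do better".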
Direct model predictive control: A theoretical and numerical analysis
This paper focuses on online control policies applied to power systems management. In this study, the power system problem is formulated as a stochastic decision process with large constrained action space, high stochasticity and dozens of state variables. Direct Model Predictive Control has previously been proposed to encompass a large class of stochastic decision making problems. It is a hybrid model which merges the properties of two different dynamic optimization methods, Model Predictive Control and Stochastic Dual Dynamic Programming. In this paper, we prove that Direct Model Predictive Control reaches an optimal policy for a wider class of decision processes than those solved by Model Predictive Control (suboptimal by nature), Stochastic Dynamic Programming (which needs a moderate size of state space) or Stochastic Dual Dynamic Programming (which requires convexity of Bellman values and a moderate complexity of the random value state). The algorithm is tested on a multiple-battery management problem and two hydroelectric problems. Direct Model Predictive Control clearly outperforms Model Predictive Control on the tested problems. © 2018 Power Systems Computation Conference
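To fix ideas, here is a minimal receding-horizon loop, i.e. the classical Model Predictive Control baseline, on a toy single-battery price-arbitrage problem. This is a hedged sketch only: the prices, capacity, horizon, and brute-force planner are invented for illustration, and Direct Model Predictive Control additionally optimizes a parametric valuation of future states, which this sketch omits.

```python
import itertools

# Toy battery arbitrage: charge when the price is low, discharge when high.
PRICES = [3.0, 1.0, 4.0, 1.5, 5.0, 2.0, 6.0]   # hypothetical price forecast
CAPACITY = 2.0
ACTIONS = (-1.0, 0.0, 1.0)                     # discharge / idle / charge

def plan(level, prices, horizon=3):
    """Brute-force the cheapest feasible action sequence over the horizon,
    then return only its first action (receding horizon)."""
    best_cost, best_first = float("inf"), 0.0
    for seq in itertools.product(ACTIONS, repeat=min(horizon, len(prices))):
        lvl, cost, ok = level, 0.0, True
        for a, p in zip(seq, prices):
            lvl += a
            if not 0.0 <= lvl <= CAPACITY:     # battery bounds
                ok = False
                break
            cost += a * p          # charging buys energy, discharging sells
        if ok and cost < best_cost:
            best_cost, best_first = cost, seq[0]
    return best_first

level, total_cost = 0.0, 0.0
for t in range(len(PRICES)):
    a = plan(level, PRICES[t:])
    level += a
    total_cost += a * PRICES[t]
print(total_cost)  # -10.5, i.e. a profit of 10.5
```

The suboptimality of plain MPC visible here, a finite horizon with no value attached to the terminal battery level, is precisely what the parametric valuation in Direct Model Predictive Control is meant to repair.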
The Hessian Estimation Evolution Strategy
We present a novel black box optimization algorithm called Hessian Estimation
Evolution Strategy. The algorithm updates the covariance matrix of its sampling
distribution by directly estimating the curvature of the objective function.
This algorithm design is targeted at twice continuously differentiable
problems. For this, we extend the cumulative step-size adaptation algorithm of
the CMA-ES to mirrored sampling. We demonstrate that our approach to covariance
matrix adaptation is efficient by evaluating it on the BBOB/COCO testbed. We
also show that the algorithm is surprisingly robust when its core assumption of
a twice continuously differentiable objective function is violated. The
approach yields a new evolution strategy with competitive performance, and at
the same time it also offers an interesting alternative to the usual covariance
matrix update mechanism.
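The curvature estimation behind the algorithm can be illustrated with a central second difference over a mirrored pair of samples: f(m+d) + f(m-d) - 2 f(m) approximates the directional curvature dᵀHd. The sketch below (function names ours) shows only this estimate, not the full covariance-matrix update of the Hessian Estimation Evolution Strategy.

```python
def curvature_along(f, m, d, fm=None):
    """Estimate the curvature of f at m along direction d from a mirrored
    pair of samples: f(m+d) + f(m-d) - 2*f(m) ~ d^T H d."""
    if fm is None:
        fm = f(m)
    plus = f([mi + di for mi, di in zip(m, d)])
    minus = f([mi - di for mi, di in zip(m, d)])
    return plus + minus - 2.0 * fm

# On a quadratic f(x) = x0^2 + 9*x1^2 (Hessian diag(2, 18)) the
# second difference recovers d^T H d exactly, up to rounding.
def quad(x):
    return 1.0 * x[0] ** 2 + 9.0 * x[1] ** 2

print(curvature_along(quad, [0.3, -0.2], [1.0, 0.0]))  # ~ 2.0
print(curvature_along(quad, [0.3, -0.2], [0.0, 1.0]))  # ~ 18.0
```

Mirrored sampling makes this convenient: each pair of offspring m ± d already provides the two symmetric evaluations the second difference needs, so curvature information comes almost for free.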
An overview of population-based algorithms for multi-objective optimisation
In this work we present an overview of the most prominent population-based algorithms and the methodologies used to extend them to multiple-objective problems. Although not exact in the mathematical sense, population-based multi-objective optimisation techniques have long been recognised as immensely valuable and versatile for real-world applications. These techniques are usually employed when exact optimisation methods are not easily applicable, or simply when, due to sheer complexity, such methods could be very costly. Another advantage is that, since a population of decision vectors is considered in each generation, these algorithms are implicitly parallelisable and can generate an approximation of the entire Pareto front at each iteration. A critique of their capabilities is also provided.
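The per-generation Pareto-front approximation mentioned above rests on Pareto dominance. A minimal non-dominated filter over a population of objective vectors (minimisation; function names ours) can be sketched as follows.

```python
def dominates(a, b):
    """a Pareto-dominates b (minimisation): a is no worse in every
    objective and strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def non_dominated(points):
    """Return the non-dominated subset of a population of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

pop = [(1.0, 5.0), (2.0, 2.0), (3.0, 4.0), (4.0, 1.0), (2.5, 2.5)]
print(non_dominated(pop))  # [(1.0, 5.0), (2.0, 2.0), (4.0, 1.0)]
```

This pairwise filter is O(n²) in the population size; practical algorithms such as NSGA-II use faster non-dominated sorting, but the dominance relation itself is the same.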
Unbiased Black-Box Complexities of Jump Functions
International audience.
Creating an Upper-Confidence-Tree program for Havannah
... huge improvements in computer-Go. In this paper, we test the generality of the approach by experimenting on another game, Havannah, which is known for being especially difficult for computers. We show that the same results hold, with slight differences related to the absence of clearly known patterns for the game of Havannah, in spite of the fact that Havannah is more related to connection games like Hex than to territory games like Go.
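An Upper Confidence Tree applies the UCB1 rule at each tree node during the selection phase of Monte-Carlo Tree Search. A minimal sketch of that rule (names and statistics ours, not the Havannah program itself):

```python
import math

def ucb_select(children, parent_visits, c=math.sqrt(2)):
    """Pick the child maximising the UCB1 score
    wins/visits + c * sqrt(ln(parent_visits) / visits);
    unvisited children are tried first."""
    best, best_score = None, -float("inf")
    for child, (wins, visits) in children.items():
        if visits == 0:
            return child          # explore every child at least once
        score = wins / visits + c * math.sqrt(math.log(parent_visits) / visits)
        if score > best_score:
            best, best_score = child, score
    return best

# Hypothetical node statistics: (wins, visits) per child move.
stats = {"a": (6, 10), "b": (3, 4), "c": (0, 0)}
print(ucb_select(stats, 14))  # "c": unvisited moves are explored first
```

The score balances exploitation (the empirical win rate) against exploration (the confidence radius), which is what lets the approach work even without the hand-crafted patterns available in Go.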