Analysis of Noisy Evolutionary Optimization When Sampling Fails
In noisy evolutionary optimization, sampling is a common strategy for dealing
with noise. Under this strategy, the fitness of a solution is evaluated
multiple times independently (the number of evaluations is called the
\emph{sample size}), and its true fitness is then approximated by the average
of these evaluations. Previous studies on sampling are mainly empirical. In
this paper, we first investigate the effect of the sample size from a
theoretical perspective. By analyzing the (1+1)-EA on the noisy LeadingOnes
problem, we show that as the sample size increases, the running time can drop
from exponential to polynomial, but then return to exponential. This suggests
that choosing a proper sample size is crucial in practice. We then investigate
which strategies can work when sampling with any fixed sample size fails.
Through two illustrative examples, we prove that using parent or offspring
populations can be better. Finally, we construct an artificial noisy example
to show that when neither sampling nor populations are effective, adaptive
sampling (i.e., sampling with an adaptive sample size) can work. This provides,
for the first time, theoretical support for the use of adaptive sampling.
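To make the sampling strategy concrete, here is a minimal Python sketch of the (1+1)-EA with averaged fitness evaluations on LeadingOnes. It assumes an additive Gaussian noise model; the paper's exact noise model, parameter choices, and function names here are illustrative, not taken from the paper.

```python
import random

def leading_ones(x):
    """True LeadingOnes fitness: the number of consecutive 1-bits
    counted from the left end of the bit string."""
    count = 0
    for bit in x:
        if bit != 1:
            break
        count += 1
    return count

def noisy_fitness(x, sigma=1.0):
    """One noisy evaluation: true fitness plus additive Gaussian noise
    (an assumed noise model for illustration)."""
    return leading_ones(x) + random.gauss(0.0, sigma)

def sampled_fitness(x, sample_size):
    """Sampling strategy: approximate the true fitness by the average
    of `sample_size` independent noisy evaluations."""
    return sum(noisy_fitness(x) for _ in range(sample_size)) / sample_size

def one_plus_one_ea(n=20, sample_size=10, max_iters=100_000):
    """(1+1)-EA: flip each bit independently with probability 1/n and
    accept the offspring if its sampled fitness is at least the parent's."""
    parent = [random.randint(0, 1) for _ in range(n)]
    for _ in range(max_iters):
        offspring = [1 - b if random.random() < 1 / n else b for b in parent]
        if sampled_fitness(offspring, sample_size) >= sampled_fitness(parent, sample_size):
            parent = offspring
        if leading_ones(parent) == n:
            break
    return parent
```

With a very small sample size, selection is driven largely by noise; a larger sample size stabilizes comparisons but costs more evaluations per iteration, mirroring the exponential-polynomial-exponential running-time trade-off the paper analyzes.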
Uncertainty And Evolutionary Optimization: A Novel Approach
Evolutionary algorithms (EAs) have been widely accepted as efficient solvers
for complex real-world optimization problems, including engineering
optimization. However, real-world optimization problems often involve
uncertain environments, including noisy and/or dynamic ones, which pose major
challenges to EA-based optimization. The presence of noise interferes with the
evaluation and selection processes of an EA and thus adversely affects its
performance. In addition, because noise complicates the evaluation of the
fitness function, the fitness may need to be estimated rather than evaluated
directly. Several existing approaches attempt to address this problem, such as
introducing diversity (hypermutation, random immigrants, special operators) or
incorporating memory of the past (diploidy, case-based memory). However, these
approaches fail to adequately address the problem. In this paper we propose a
Distributed Population Switching Evolutionary Algorithm (DPSEA) that addresses
the optimization of functions with noisy fitness using a distributed
population switching architecture to simulate a distributed self-adaptive
memory of the solution space. Local regression is used in the
pseudo-populations to estimate the fitness, as sketched below. Successful
applications to benchmark test problems demonstrate the proposed method's
superior performance in terms of both robustness and accuracy.

Comment: In Proceedings of the 9th IEEE Conference on Industrial Electronics
and Applications (ICIEA 2014), IEEE Press, pp. 988-983, 2014
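The abstract does not specify the exact local-regression scheme used in the pseudo-populations; the following is a minimal, hypothetical sketch of one standard choice, kernel-weighted local averaging over an archive of past (solution, noisy-fitness) pairs, with all names and parameters illustrative.

```python
import numpy as np

def local_regression_estimate(x, archive_X, archive_y, bandwidth=1.0):
    """Kernel-weighted local average of past noisy fitness values:
    archived samples near `x` get larger weights, which smooths out
    the evaluation noise in the fitness estimate."""
    dists = np.linalg.norm(archive_X - x, axis=1)   # distances to archived solutions
    weights = np.exp(-((dists / bandwidth) ** 2))   # Gaussian kernel weights
    return float(weights @ archive_y) / (weights.sum() + 1e-12)

# Hypothetical usage: 50 archived 2-D solutions with noisy fitness values.
rng = np.random.default_rng(0)
archive_X = rng.random((50, 2))
archive_y = archive_X.sum(axis=1) + rng.normal(0.0, 0.5, 50)
print(local_regression_estimate(np.array([0.5, 0.5]), archive_X, archive_y))
```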
Differentially Private Adaptive Optimization with Delayed Preconditioners
Privacy noise may negate the benefits of using adaptive optimizers in
differentially private model training. Prior works typically address this issue
by using auxiliary information (e.g., public data) to boost the effectiveness
of adaptive optimization. In this work, we explore techniques to estimate and
efficiently adapt to gradient geometry in private adaptive optimization without
auxiliary data. Motivated by the observation that adaptive methods can tolerate
stale preconditioners, we propose differentially private adaptive training with
delayed preconditioners (DP^2), a simple method that constructs delayed but
less noisy preconditioners to better realize the benefits of adaptivity.
Theoretically, we provide convergence guarantees for our method for both convex
and non-convex problems, and analyze trade-offs between delay and privacy noise
reduction. Empirically, we explore DP^2 across several real-world datasets,
demonstrating that it can improve convergence speed by as much as 4x relative
to non-adaptive baselines and match the performance of state-of-the-art
optimization methods that require auxiliary data.

Comment: Accepted by ICLR 2023
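As a rough illustration of the delayed-preconditioner idea, here is a simplified single-machine sketch combining DP-SGD-style clipping and noising with an RMSProp-style diagonal preconditioner that is rebuilt only every `delay` steps. The function name, update rule, and hyperparameters are assumptions for illustration, not the authors' exact DP^2 algorithm.

```python
import numpy as np

def dp2_train(grad_fn, w, steps=1000, lr=0.1, clip=1.0, noise_mult=1.0,
              delay=100, eps=1e-8):
    """Sketch: per-step gradients are clipped and privatized as in DP-SGD,
    while the diagonal preconditioner is refreshed only every `delay` steps
    from statistics accumulated over the window, so the privacy noise in
    the preconditioner is averaged down."""
    precond = np.ones_like(w)   # stale (delayed) preconditioner currently in use
    accum = np.zeros_like(w)    # squared noisy gradients accumulated this window
    for t in range(steps):
        g = grad_fn(w)
        g = g / max(1.0, np.linalg.norm(g) / clip)                      # clip gradient norm
        g_priv = g + np.random.normal(0.0, noise_mult * clip, g.shape)  # add privacy noise
        accum += g_priv ** 2
        w = w - lr * g_priv / (np.sqrt(precond) + eps)                  # adapt with stale preconditioner
        if (t + 1) % delay == 0:
            precond = accum / delay   # rebuild from window-averaged noisy statistics
            accum = np.zeros_like(w)
    return w

# Hypothetical usage: privately minimize the quadratic ||w||^2 / 2,
# whose gradient is w itself.
w_final = dp2_train(lambda w: w, np.ones(10))
```

Averaging the squared noisy gradients over the delay window reduces the variance of the preconditioner at the cost of staleness, which is the delay/noise trade-off the paper analyzes theoretically.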