6,023 research outputs found
Finding High-Dimensional D-Optimal Designs for Logistic Models via Differential Evolution
D-optimal designs are frequently used in controlled experiments to obtain the most accurate estimate of model parameters at minimal cost. Finding them can be a challenging task, especially when there are many factors in a nonlinear model. As the factors become numerous and interact with one another, there are many more variables to optimize and the D-optimal design problem becomes high-dimensional and non-separable. Consequently, premature convergence issues arise: candidate solutions get trapped in local optima, and the classical gradient-based optimization approaches used to search for D-optimal designs rarely succeed. We propose a specially designed version of differential evolution (DE), a representative gradient-free optimization approach, to solve such high-dimensional optimization problems. The proposed DE uses a new novelty-based mutation strategy to explore the various regions of the search space. Each region is explored differently from previously explored regions, so the diversity of the population is preserved. The novelty-based mutation strategy is combined with two common DE mutation strategies to balance exploration and exploitation during the early and middle stages of the evolution. Additionally, we adapt the control parameters of DE as the evolution proceeds. Using logistic models with several factors on various design spaces as examples, our simulation results show that our algorithm finds D-optimal designs efficiently and outperforms its competitors. As an application, we apply our algorithm to re-design a 10-factor car refueling experiment with discrete and continuous factors and selected pairwise interactions. Our algorithm consistently outperformed the other algorithms and found a more efficient D-optimal design for this problem.
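To make the setting concrete, below is a minimal sketch of attacking a locally D-optimal design problem with the classic DE/rand/1/bin scheme. It assumes equal-weight support points, a guessed parameter vector beta, and a cuboid design space; the paper's novelty-based mutation and parameter adaptation are not reproduced, since the abstract does not specify them. All names and settings (logistic_info_matrix, pop_size, F, CR) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def logistic_info_matrix(design, beta):
    """Fisher information of a logistic model with intercept, for an
    equal-weight design (a simplification of the general case)."""
    X = np.hstack([np.ones((design.shape[0], 1)), design])   # add intercept column
    eta = X @ beta
    w = np.exp(eta) / (1.0 + np.exp(eta)) ** 2               # logistic variance weights
    return (X * w[:, None]).T @ X / design.shape[0]

def log_det_criterion(flat, n_pts, k, beta):
    """D-optimality objective: log-determinant of the information matrix."""
    sign, logdet = np.linalg.slogdet(
        logistic_info_matrix(flat.reshape(n_pts, k), beta))
    return logdet if sign > 0 else -np.inf

def de_rand_1_bin(obj, dim, lo, hi, pop_size=40, F=0.5, CR=0.9, gens=300, seed=0):
    """Classic DE/rand/1/bin maximization loop (not the paper's variant)."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fit = np.array([obj(x) for x in pop])
    for _ in range(gens):
        for i in range(pop_size):
            idx = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(idx, 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)        # rand/1 mutation
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True                  # guarantee one mutated gene
            trial = np.where(cross, mutant, pop[i])          # binomial crossover
            f = obj(trial)
            if f > fit[i]:                                   # greedy selection
                pop[i], fit[i] = trial, f
    best = np.argmax(fit)
    return pop[best], fit[best]

# Locally D-optimal 4-point design for a 2-factor logistic model.
k, n_pts = 2, 4
beta = np.array([0.5, 1.0, -1.0])          # guessed intercept + slopes (assumed)
design, value = de_rand_1_bin(
    lambda x: log_det_criterion(x, n_pts, k, beta), dim=n_pts * k, lo=-1.0, hi=1.0)
```

The high-dimensional, non-separable character the abstract mentions comes from flattening all support-point coordinates into one vector; a novelty-based mutation would replace the rand/1 step above with one that rewards unexplored regions.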
Optimal Parameter Choices Through Self-Adjustment: Applying the 1/5-th Rule in Discrete Settings
While evolutionary algorithms are known to be very successful for a broad
range of applications, the algorithm designer is often left with many
algorithmic choices, for example, the size of the population, the mutation
rates, and the crossover rates of the algorithm. These parameters are known to
have a crucial influence on the optimization time, and thus need to be chosen
carefully, a task that often requires substantial efforts. Moreover, the
optimal parameters can change during the optimization process. It is therefore
of great interest to design mechanisms that dynamically choose best-possible
parameters. An example for such an update mechanism is the one-fifth success
rule for step-size adaptation in evolution strategies. While in continuous
domains this principle is well understood also from a mathematical point of
view, no comparable theory is available for problems in discrete domains.
In this work we show that the one-fifth success rule can be effective also in
discrete settings. We consider the (1+(λ,λ)) GA proposed in
[Doerr/Doerr/Ebel: From black-box complexity to designing new genetic
algorithms, TCS 2015]. We prove that if its population size is chosen according
to the one-fifth success rule, then the expected optimization time on
OneMax is linear. This is better than what any static population size can
achieve, and it is asymptotically optimal among all adaptive parameter choices.
Comment: This is the full version of a paper that is to appear at GECCO 201
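The sketch below shows how the one-fifth success rule can drive the population size λ of the (1+(λ,λ)) GA on OneMax: λ shrinks after an improving iteration and grows slowly otherwise, so it is stable when roughly one in five iterations succeeds. The update factor, the strict-improvement acceptance, and all function names are illustrative assumptions, not the exact algorithm analyzed in the paper.

```python
import numpy as np

def onemax(x):
    return int(x.sum())

def self_adjusting_ollga(n=200, update=1.5, seed=1):
    """(1+(lambda,lambda)) GA on OneMax with a one-fifth-style rule:
    lambda / update on success, lambda * update**(1/4) on failure."""
    rng = np.random.default_rng(seed)
    x = rng.integers(0, 2, n)
    fx, lam, evals = onemax(x), 1.0, 0
    while fx < n:
        l = max(1, round(lam))
        p, c = l / n, 1.0 / l
        # Mutation phase: l offspring, each flipping the same ell ~ Bin(n, p) bits.
        ell = rng.binomial(n, p)
        mutants = []
        for _ in range(l):
            y = x.copy()
            y[rng.choice(n, size=ell, replace=False)] ^= 1
            mutants.append(y)
        xp = max(mutants, key=onemax)
        # Crossover phase: l biased uniform crossovers of x and the best mutant,
        # taking each bit from the mutant with probability c = 1/lambda.
        best_y, best_f = x, fx
        for _ in range(l):
            y = np.where(rng.random(n) < c, xp, x)
            fy = onemax(y)
            if fy > best_f:
                best_y, best_f = y, fy
        evals += 2 * l
        if best_f > fx:                        # success: decrease lambda
            x, fx, lam = best_y, best_f, max(1.0, lam / update)
        else:                                  # failure: increase lambda slowly
            lam = min(float(n), lam * update ** 0.25)
    return evals

print(self_adjusting_ollga())                  # evaluations until the optimum
```

The asymmetric update (one large decrease balancing four small increases) is what ties the rule to a one-fifth success rate.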
Use of statistical outlier detection method in adaptive evolutionary algorithms
In this paper, the issue of adapting probabilities for Evolutionary Algorithm
(EA) search operators is revisited. A framework is devised for distinguishing
between measurements of performance and the interpretation of those
measurements for purposes of adaptation. Several examples of measurements and
statistical interpretations are provided. Probability value adaptation is
tested using an EA with 10 search operators against 10 test problems with
results indicating that both the type of measurement and its statistical
interpretation play significant roles in EA performance. We also find that
selecting operators based on the prevalence of outliers rather than on average
performance is able to provide considerable improvements to adaptive methods
and soundly outperforms the non-adaptive case.
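As a rough illustration of the idea, the sketch below adapts operator probabilities from the prevalence of outlier improvements rather than from average performance. The Tukey-fence outlier rule, the sliding window, and the probability floor are assumptions made for the example; the paper's exact measurements and statistical interpretations may differ.

```python
import numpy as np
from collections import deque

class OutlierOperatorAdapter:
    """Choose among EA search operators with probabilities driven by how
    often each operator produces outlier-sized fitness improvements."""

    def __init__(self, n_ops, window=200, floor=0.02, seed=0):
        self.hist = [deque(maxlen=window) for _ in range(n_ops)]  # recent improvements
        self.floor = floor                                        # keeps every operator alive
        self.rng = np.random.default_rng(seed)
        self.n_ops = n_ops

    def record(self, op, improvement):
        self.hist[op].append(improvement)

    def probabilities(self):
        arrays = [np.fromiter(h, float) for h in self.hist if h]
        pooled = np.concatenate(arrays) if arrays else np.zeros(1)
        q1, q3 = np.percentile(pooled, [25, 75])
        cutoff = q3 + 1.5 * (q3 - q1)          # classic Tukey fence for outliers
        counts = np.array([sum(v > cutoff for v in h) for h in self.hist], float)
        p = counts + self.floor * max(1.0, counts.sum())
        return p / p.sum()

    def choose(self):
        return self.rng.choice(self.n_ops, p=self.probabilities())

# Sketch of use inside an EA loop (operators and fitness are hypothetical):
# adapter = OutlierOperatorAdapter(n_ops=10)
# op = adapter.choose(); child = operators[op](parent)
# adapter.record(op, max(0.0, fitness(child) - fitness(parent)))
```

Counting outliers instead of averaging rewards operators that occasionally produce large jumps, which average-based credit assignment tends to wash out.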
First Steps Towards a Runtime Analysis of Neuroevolution
We consider a simple setting in neuroevolution where an evolutionary
algorithm optimizes the weights and activation functions of a simple artificial
neural network. We then define simple example functions to be learned by the
network and conduct rigorous runtime analyses for networks with a single neuron
and for a more advanced structure with several neurons and two layers. Our
results show that the proposed algorithm is generally efficient on two example
problems designed for one neuron and efficient with at least constant
probability on the example problem for a two-layer network. In particular, the
so-called harmonic mutation operator, which chooses steps of size i with
probability proportional to 1/i, turns out to be a good choice for the underlying
search space. However, for the case of one neuron, we also identify situations
with hard-to-overcome local optima. Experimental investigations of our
neuroevolutionary algorithm and a state-of-the-art CMA-ES support the
theoretical findings.
Comment: 27 pages; full version of paper published at FOGA 2023 and available
at AC
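A small sketch of the harmonic mutation operator described above, assuming integer weights in a bounded range [-r, r]: step sizes i in {1, ..., r} are sampled with probability proportional to 1/i and applied with a random sign to one weight. Which components the paper's operator mutates, and the clipping to the range, are assumptions made here for illustration.

```python
import numpy as np

def harmonic_step(r, rng):
    """Sample a step size i in {1, ..., r} with P(i) proportional to 1/i,
    i.e., the harmonic mutation distribution named in the abstract."""
    probs = 1.0 / np.arange(1, r + 1)
    probs /= probs.sum()
    return int(rng.choice(np.arange(1, r + 1), p=probs))

def mutate_weights(w, r, rng):
    """Add a random-sign harmonic step to one uniformly chosen weight
    (a simplified single-component mutation)."""
    w = w.copy()
    i = rng.integers(len(w))
    w[i] += rng.choice([-1, 1]) * harmonic_step(r, rng)
    return np.clip(w, -r, r)

rng = np.random.default_rng(0)
weights = np.zeros(3, dtype=int)       # e.g., two inputs + bias of one neuron
for _ in range(5):
    weights = mutate_weights(weights, r=10, rng=rng)
```

The heavy tail of the 1/i distribution lets the operator make mostly small adjustments while still occasionally taking the large steps needed to cross plateaus in the weight space.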
08051 Abstracts Collection -- Theory of Evolutionary Algorithms
From Jan. 27, 2008 to Feb. 1, 2008, the Dagstuhl Seminar 08051 "Theory of Evolutionary Algorithms" was held in the International Conference and Research Center (IBFI), Schloss Dagstuhl.
During the seminar, several participants presented their current
research, and ongoing work and open problems were discussed. Abstracts of
the presentations given during the seminar as well as abstracts of
seminar results and ideas are put together in this paper. The first section
describes the seminar topics and goals in general.
Links to extended abstracts or full papers are provided, if available.
- …