Memory based on abstraction for dynamic fitness functions
Copyright © Springer-Verlag Berlin Heidelberg 2008. This paper proposes a memory scheme based on abstraction for evolutionary algorithms to address dynamic optimization problems. In this scheme, the memory stores good solutions not as themselves but as their abstraction, i.e., their approximate location in the search space. When the environment changes, the stored abstraction information is used to generate new individuals for the population. Experiments are carried out to validate the abstraction-based memory scheme, and the results show its efficiency for evolutionary algorithms in dynamic environments. This work was supported by the Engineering and Physical Sciences Research Council (EPSRC) of the UK under Grant No. EP/E060722/1.
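To illustrate the idea (a minimal sketch, not the paper's exact algorithm), an abstraction-based memory can be modelled as a coarse grid over the search space: instead of storing elite solutions themselves, we record the grid cells they occupied, and after an environment change we sample new individuals from the remembered cells. The class name, grid resolution, and sampling scheme below are illustrative assumptions.

```python
import random

class AbstractionMemory:
    """Stores approximate locations (grid cells) of good solutions,
    not the solutions themselves (illustrative sketch)."""

    def __init__(self, lower, upper, cells_per_dim=10):
        self.lower, self.upper = lower, upper
        self.cells = cells_per_dim
        self.counts = {}  # cell index tuple -> how often a good solution fell there

    def _cell(self, x):
        # Map a real-valued point to its grid cell index.
        idx = []
        for xi, lo, hi in zip(x, self.lower, self.upper):
            width = (hi - lo) / self.cells
            idx.append(min(self.cells - 1, int((xi - lo) / width)))
        return tuple(idx)

    def store(self, solution):
        # Abstract the solution to its cell and count the visit.
        c = self._cell(solution)
        self.counts[c] = self.counts.get(c, 0) + 1

    def sample(self):
        # Pick a remembered cell (weighted by frequency) and return
        # a uniformly random point inside it.
        cells = list(self.counts)
        weights = [self.counts[c] for c in cells]
        c = random.choices(cells, weights=weights)[0]
        point = []
        for ci, lo, hi in zip(c, self.lower, self.upper):
            width = (hi - lo) / self.cells
            point.append(lo + (ci + random.random()) * width)
        return point
```

When the environment changes, `sample()` can seed part of the new population from regions that were promising before the change, which is the role the stored abstraction plays in the scheme described above.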
On replacement strategies in steady state evolutionary algorithms
Steady-state models of evolutionary algorithms are widely used, yet surprisingly little attention has been paid to the effects arising from different replacement strategies. This paper explores the use of mathematical models to characterise the selection pressures arising in a selection-only environment. The first part brings together models for the behaviour of seven different replacement mechanisms and provides expressions for various proposed indicators of evolutionary algorithm behaviour. Some of these have been derived elsewhere, and are included for completeness, but the majority are new to this paper. These theoretical indicators are used to compare the behaviour of the different strategies. The second part of this paper examines the practical relevance of these indicators as predictors for algorithms' relative performance in terms of optimisation time and reliability. It is not the intention of this paper to come up with a "one size fits all" recommendation for choice of replacement strategy. Although some strategies may have little to recommend them, the relative ranking of others is shown to depend on the intended use of the algorithm to be implemented, as reflected in the choice of performance metrics. © 2007 by the Massachusetts Institute of Technology
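To make the notion of a replacement strategy concrete (a generic sketch, not one of the paper's seven mechanisms or its mathematical models), a steady-state step generates a single offspring and then applies a replacement policy to decide which member it displaces. The function and strategy names below are illustrative.

```python
import random

def steady_state_step(population, fitness, make_offspring, strategy="replace_worst"):
    """One steady-state iteration: create a single offspring, then apply
    a replacement strategy (illustrative sketch)."""
    child = make_offspring(population)
    if strategy == "replace_worst":
        # Deterministically displace the least-fit member (elitist pressure).
        worst = min(range(len(population)), key=lambda i: fitness(population[i]))
        population[worst] = child
    elif strategy == "replace_random":
        # Uniform-random replacement: no selection pressure from replacement.
        population[random.randrange(len(population))] = child
    elif strategy == "replace_oldest":
        # FIFO-style replacement: drop the oldest member, append the child.
        population.pop(0)
        population.append(child)
    return population
```

The point of comparing such policies, as in the paper, is that each induces a different selection pressure even when parent selection is held fixed.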
Group Leaders Optimization Algorithm
We present a new global optimization algorithm in which the influence of leaders in social groups is used as an inspiration for an evolutionary technique designed around a group architecture. To demonstrate the efficiency of the method, we apply it to a standard suite of single- and multidimensional optimization functions, to the energies and geometric structures of Lennard-Jones clusters, and to quantum circuit design problems. We show that, as an improvement over previous methods, the algorithm scales as N^2.5 for Lennard-Jones clusters of N particles. In addition, an efficient circuit design is shown for the two-qubit Grover search algorithm, a quantum algorithm providing quadratic speed-up over its classical counterpart.
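A leader-influence step of the kind described above can be sketched as follows (an illustrative sketch, not the authors' exact update rule): the population is split into groups, each group's best member acts as its leader, and new candidates blend a member with its leader and a random component, with greedy acceptance. The blend weights and bounds below are assumptions.

```python
import random

def gloa_step(groups, fitness, w=(0.8, 0.19, 0.01), bounds=(-5.0, 5.0)):
    """One step of a group-leaders-style search (illustrative sketch):
    each member moves toward a blend of itself, its group's leader,
    and a random point; the move is kept only if it improves fitness."""
    lo, hi = bounds
    for group in groups:
        # The leader is the fittest member of the group.
        leader = max(group, key=fitness)
        for i, member in enumerate(group):
            new = [
                w[0] * m + w[1] * l + w[2] * random.uniform(lo, hi)
                for m, l in zip(member, leader)
            ]
            # Greedy acceptance: keep the move only if it helps.
            if fitness(new) > fitness(member):
                group[i] = new
    return groups
```

Members far from their leader are pulled toward it, so each group performs a local search around its current best while the random term preserves some exploration.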
An island based hybrid evolutionary algorithm for optimization
This is a post-print version of the article. Copyright © 2008 Springer-Verlag. Evolutionary computation has become an important problem-solving methodology among search and optimization techniques. Recently, more and more evolutionary techniques have been developed, especially hybrid evolutionary algorithms. This paper proposes an island-based hybrid evolutionary algorithm (IHEA) for optimization, which is based on particle swarm optimization (PSO), fast evolutionary programming (FEP), and an estimation of distribution algorithm (EDA). Within IHEA, an island model is designed to cooperatively search for the global optima in the search space. By combining the strengths of the three component algorithms, IHEA greatly improves on their individual optimization performance. Experimental results demonstrate that IHEA outperforms all three component algorithms on the test problems. This work was supported by the Engineering and Physical Sciences Research Council (EPSRC) of the UK under Grant EP/E060722/1.
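The island idea can be sketched generically (function names and the ring migration topology are illustrative assumptions; the actual IHEA wiring of PSO, FEP, and EDA is more involved): each island evolves its subpopulation with its own algorithm and periodically exchanges its best individual with a neighbouring island.

```python
def run_island_model(islands, steps, migrate_every=5):
    """Each island is (population, evolve_fn). Periodically migrate the
    best individual of each island to the next island (ring topology).
    Illustrative sketch of an island-based hybrid EA; here individuals
    are numbers and fitness is the identity, for simplicity."""
    for t in range(1, steps + 1):
        for pop, evolve in islands:
            evolve(pop)  # one step of this island's own algorithm
        if t % migrate_every == 0:
            # Ring migration: best of island i-1 replaces worst of island i.
            bests = [max(pop) for pop, _ in islands]
            for i, (pop, _) in enumerate(islands):
                incoming = bests[i - 1]
                pop[pop.index(min(pop))] = incoming
    return islands
```

Because each island may run a different component algorithm (PSO, FEP, or EDA in IHEA's case), migration lets their complementary strengths propagate through the whole population.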
Adaptive intelligence applied to numerical optimisation
The article presents a theoretical comparison of modification strategies and experimental results achieved by adaptive heuristics applied to the numerical optimisation of several unconstrained test functions. The aims of the study are to identify and compare how adaptive search heuristics behave within a heterogeneous search space without retuning of the search parameters. The achieved results are summarised and analysed, and could be used for comparison with other methods and for further investigation.
Application of quantum-inspired generative models to small molecular datasets
Quantum and quantum-inspired machine learning has emerged as a promising and challenging research field due to the increased popularity of quantum computing, especially with near-term devices. Theoretical contributions point toward generative modeling as a promising direction for realizing the first examples of real-world quantum advantage from these technologies. A few empirical studies also demonstrate such potential, especially when considering quantum-inspired models based on tensor networks. In this work, we apply tensor-network-based generative models to the problem of molecular discovery. In our approach, we use two small molecular datasets: a subset of molecules from the QM9 dataset and a small in-house dataset of validated antioxidants from TotalEnergies. We compare several tensor-network models against a generative adversarial network using different sample-based metrics, which reflect their learning performance on each task, and multiobjective performance using relevant molecular metrics per task. We also combine the outputs of the models and demonstrate empirically that such a combination can be beneficial, advocating for the unification of classical and quantum(-inspired) generative learning.
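The combination step mentioned above can be sketched in its simplest form (an illustrative sketch; the paper's multiobjective molecular metrics are not reproduced here, and the function and parameter names are assumptions): pool candidates from two generative models, drop duplicates, and keep the best under a chosen scoring function.

```python
def combine_model_samples(samples_a, samples_b, score, top_k=10):
    """Pool candidates (e.g. molecules as SMILES strings) from two
    generative models, deduplicate, and keep the best-scoring ones.
    Illustrative sketch of combining model outputs."""
    pool = list(dict.fromkeys(samples_a + samples_b))  # dedupe, keep order
    pool.sort(key=score, reverse=True)
    return pool[:top_k]
```

The benefit of such a combination comes from the models covering different regions of chemical space, so the pooled set can dominate either model's output alone under the task metrics.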