5 research outputs found
Neuroevolution in Deep Neural Networks: Current Trends and Future Challenges
A variety of methods have been applied to the architectural configuration and
learning or training of artificial deep neural networks (DNNs). These methods
play a crucial role in the success or failure of the DNN for most problems and
applications. Evolutionary Algorithms (EAs) are gaining momentum as a
computationally feasible method for the automated optimisation and training of
DNNs. Neuroevolution is a term which describes these processes of automated
configuration and training of DNNs using EAs. While many works exist in the
literature, no comprehensive surveys currently exist focusing exclusively on
the strengths and limitations of using neuroevolution approaches in DNNs.
The prolonged absence of such surveys risks leaving the field disjointed and
fragmented, preventing DNN researchers from adopting neuroevolutionary methods
in their own research and resulting in lost opportunities to improve performance
and to widen application within real-world deep learning problems. This paper
presents a comprehensive survey, discussion and evaluation of the
state-of-the-art works on using EAs for architectural configuration and
training of DNNs. Based on this survey, the paper highlights the most pertinent
current issues and challenges in neuroevolution and identifies multiple
promising future research directions.
Comment: 20 pages (double column), 2 figures, 3 tables, 157 references
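To make the idea of neuroevolution concrete, the following minimal sketch evolves the weights of a tiny fixed-architecture network with a simple truncation-selection evolutionary algorithm on the XOR task. The network size, mutation scheme, and all parameters here are illustrative assumptions, not the method of any particular surveyed work.

```python
import math
import random

random.seed(0)

# XOR dataset: a classic toy task for small evolved networks.
DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

N_WEIGHTS = 9  # 2x2 hidden layer (4 weights + 2 biases) + output (2 weights + 1 bias)

def forward(w, x):
    # 2-input, 2-hidden, 1-output MLP with tanh activations.
    h0 = math.tanh(w[0] * x[0] + w[1] * x[1] + w[2])
    h1 = math.tanh(w[3] * x[0] + w[4] * x[1] + w[5])
    return math.tanh(w[6] * h0 + w[7] * h1 + w[8])

def fitness(w):
    # Negative squared error over the dataset: higher is better, 0 is perfect.
    return -sum((forward(w, x) - y) ** 2 for x, y in DATA)

def mutate(w, sigma=0.3):
    # Gaussian perturbation of every weight.
    return [wi + random.gauss(0, sigma) for wi in w]

def evolve(pop_size=20, generations=300):
    pop = [[random.uniform(-1, 1) for _ in range(N_WEIGHTS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 4]  # truncation selection keeps the top quarter
        pop = elite + [mutate(random.choice(elite))
                       for _ in range(pop_size - len(elite))]
    return max(pop, key=fitness)

best = evolve()
print("best fitness:", fitness(best))
```

Here only the weights are evolved; neuroevolution methods covered by the survey may also evolve the architecture itself (number of neurons, connectivity, activation functions), which this sketch deliberately leaves fixed for brevity.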
The Effects of Constant Neutrality on Performance and Problem Hardness in GP
The neutral theory of molecular evolution and the associated notion
of neutrality have interested many researchers in Evolutionary Computation. The
hope is that the presence of neutrality can aid evolution. However, despite the
vast number of publications on neutrality, there is still considerable controversy
over its effects. The aim of this paper is to clarify under what circumstances
neutrality could aid Genetic Programming using the traditional representation
(i.e. tree-like structures). For this purpose, we use fitness distance correlation
as a measure of hardness. In addition, we have conducted extensive empirical
experimentation
to corroborate the fitness distance correlation predictions. This has been done
using two test problems with very different landscape features that represent two
extreme cases where the different effects of neutrality can be emphasised. Finally,
we study the distances between individuals and the global optimum to understand
how neutrality affects evolution (at least with the form of neutrality proposed
in this paper).
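The hardness measure used in the abstract, fitness distance correlation (FDC), is the Pearson correlation between each sampled individual's fitness and its distance to the nearest global optimum. The following minimal sketch computes it; the function name and the toy data are illustrative, not taken from the paper.

```python
import math

def fdc(fitnesses, distances):
    # Fitness distance correlation: Pearson correlation between the
    # fitness of each sampled individual and its distance to the
    # nearest global optimum.
    n = len(fitnesses)
    f_mean = sum(fitnesses) / n
    d_mean = sum(distances) / n
    cov = sum((f - f_mean) * (d - d_mean)
              for f, d in zip(fitnesses, distances)) / n
    sf = math.sqrt(sum((f - f_mean) ** 2 for f in fitnesses) / n)
    sd = math.sqrt(sum((d - d_mean) ** 2 for d in distances) / n)
    return cov / (sf * sd)

# Toy example: for a minimisation problem, fitness that grows roughly
# linearly with distance to the optimum yields an FDC near +1,
# which the measure interprets as an "easy" landscape.
dist = [0, 1, 2, 3, 4, 5]
fit = [0.0, 1.1, 1.9, 3.2, 3.9, 5.1]
print(round(fdc(fit, dist), 3))
```

An FDC near -1 for a maximisation problem (fitness falls as distance grows) likewise predicts an easy search; values near 0 predict that fitness carries little guidance toward the optimum.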