A Minimal Developmental Model Can Increase Evolvability in Soft Robots
Different subsystems of organisms adapt over many time scales, such as rapid
changes in the nervous system (learning), slower morphological and neurological
change over the lifetime of the organism (postnatal development), and change
over many generations (evolution). Much work has focused on instantiating
learning or evolution in robots, but relatively little on development. Although
many theories have been put forward as to how development can aid evolution, it
is difficult to isolate each such proposed mechanism. Thus, here we introduce a
minimal yet embodied model of development: the body of the robot changes over
its lifetime, yet growth is not influenced by the environment. We show that
even this simple developmental model confers evolvability because it allows
evolution to sweep over a larger range of body plans than an equivalent
non-developmental system, and subsequent heterochronic mutations 'lock in' this
body plan in more morphologically-static descendants. Future work will involve
gradually complexifying the developmental model to determine when and how such
added complexity increases evolvability.
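The abstract above describes a body that changes over the robot's lifetime independently of the environment, with heterochronic mutations later "locking in" a good body plan. A minimal sketch of that idea, assuming a hypothetical two-gene encoding (birth size, adult size) and a linear growth schedule, neither of which is from the paper:

```python
import random

def developed_size(genome, t, lifetime=10):
    """Body size at lifetime step t, interpolated linearly from the
    birth value to the adult value (growth ignores the environment)."""
    start, end = genome
    return start + (t / lifetime) * (end - start)

def heterochronic_mutation(genome, rate=0.5):
    """With some probability, collapse the developmental window so the
    descendant is born with the adult body plan (morphologically static)."""
    if random.random() < rate:
        return (genome[1], genome[1])
    return genome

# A developing individual sweeps over a range of body plans in one lifetime,
# so selection can evaluate many plans per genome.
genome = (1.0, 4.0)
sizes = [developed_size(genome, t) for t in range(11)]
static = heterochronic_mutation(genome, rate=1.0)  # rate=1.0 forces the lock-in
```

Because each developing individual passes through every intermediate body plan, evolution effectively samples a wider morphological range than a non-developmental encoding would with the same population size.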
TensorFlow Enabled Genetic Programming
Genetic Programming, a kind of evolutionary computation and machine learning
algorithm, is shown to benefit significantly from the application of vectorized
data and the TensorFlow numerical computation library on both CPU and GPU
architectures. The open source, Python Karoo GP is employed for a series of 190
tests across 6 platforms, with real-world datasets ranging from 18 to 5.5M data
points. This body of tests demonstrates that datasets measured in tens and
hundreds of data points see 2-15x improvement when moving from the scalar/SymPy
configuration to the vector/TensorFlow configuration, with a single core
performing on par or better than multiple CPU cores and GPUs. A dataset
composed of 90,000 data points demonstrates a single vector/TensorFlow CPU core
performing 875x better than 40 scalar/SymPy CPU cores. A dataset containing
5.5M data points sees GPU configurations out-performing CPU configurations on
average by 1.3x.
Comment: 8 pages, 5 figures; presented at GECCO 2017, Berlin, Germany
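The speedups reported above come from replacing per-data-point scalar evaluation with a single array operation over the whole dataset. A minimal sketch of the contrast, using NumPy as a dependency-free stand-in for TensorFlow (the expression and dataset are illustrative, not from the paper's benchmarks):

```python
import numpy as np

def eval_scalar(expr, xs):
    # scalar/SymPy-style: one Python-level call per data point
    return [expr(x) for x in xs]

def eval_vector(expr, xs):
    # vector/TensorFlow-style: one array operation over all points at once;
    # here NumPy stands in for the TensorFlow compute graph
    return expr(np.asarray(xs))

# A candidate GP expression, x**2 + x, evaluated both ways
expr = lambda x: x * x + x
data = [0.0, 1.0, 2.0, 3.0]
scalar_out = eval_scalar(expr, data)
vector_out = eval_vector(expr, data)
```

Both paths return the same fitness values; the vectorized path amortizes interpreter overhead across the whole dataset, which is where the reported 2-15x (and larger) gains come from.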
Towards the Evolution of Multi-Layered Neural Networks: A Dynamic Structured Grammatical Evolution Approach
Current grammar-based NeuroEvolution approaches have several shortcomings. On
the one hand, they do not allow the generation of Artificial Neural Networks
(ANNs) composed of more than one hidden layer. On the other, there is no way to
evolve networks with more than one output neuron. To properly evolve ANNs with
more than one hidden layer and multiple output nodes, the number of neurons
available in previous layers must be known. In this paper we introduce
Dynamic Structured Grammatical Evolution (DSGE): a new genotypic representation
that overcomes the aforementioned limitations. By enabling the creation of
dynamic rules that specify the connection possibilities of each neuron, the
methodology enables the evolution of multi-layered ANNs with more than one
output neuron. Results in different classification problems show that DSGE
evolves effective single and multi-layered ANNs, with a varying number of
output neurons.
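The key mechanism described above is generating grammar rules dynamically, so each neuron's possible inputs are determined by how many neurons the previous layer actually has. A minimal sketch of that idea (the rule names and layer representation are illustrative assumptions, not the paper's actual DSGE grammar):

```python
def connection_rule(prev_layer_size):
    """Build a production for the <input> non-terminal on the fly,
    listing every neuron of the previous layer as a possible connection.
    A static grammar cannot do this, because prev_layer_size is only
    known once earlier layers have been expanded."""
    return {"<input>": [f"n{i}" for i in range(prev_layer_size)]}

# Expanding a 2-hidden-layer network: the rule for layer 2's inputs
# depends on layer 1's evolved size, and likewise for the output layer.
layer1_size = 3
rule_for_layer2 = connection_rule(layer1_size)
layer2_size = 2
rule_for_outputs = connection_rule(layer2_size)
```

By regenerating the connection rule after each layer is fixed, the genotype can describe multi-layered networks with any number of output neurons, which is exactly the limitation of static grammar-based NeuroEvolution that DSGE is designed to remove.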
- …