Sampled Training and Node Inheritance for Fast Evolutionary Neural Architecture Search
The performance of a deep neural network is heavily dependent on its
architecture, and various neural architecture search strategies have been
developed for automated network architecture design. Recently, evolutionary
neural architecture search (ENAS) has received increasing attention due to the
attractive global optimization capability of evolutionary algorithms. However,
ENAS suffers from extremely high computation costs, because a large number of
performance evaluations are usually required in evolutionary optimization and
training deep neural networks is itself computationally very intensive. To
address this issue, this paper proposes a new evolutionary framework for fast
ENAS based on a directed acyclic graph representation, in which parents are randomly sampled and
trained on each mini-batch of training data. In addition, a node inheritance
strategy is adopted to generate offspring individuals, whose fitness is
directly evaluated without training. To enhance the feature processing
capability of the evolved neural networks, we also encode a channel attention
mechanism in the search space. We evaluate the proposed algorithm on widely
used benchmark datasets, in comparison with 26 state-of-the-art peer algorithms. Our
experimental results show that the proposed algorithm is not only computationally
much more efficient, but also highly competitive in learning performance.

Comment: 14 pages, 9 figures