A Hybrid Differential Evolution Approach to Designing Deep Convolutional Neural Networks for Image Classification
Convolutional Neural Networks (CNNs) have demonstrated their superiority in image classification, and evolutionary computation (EC) methods have recently gained momentum as a way to design CNN architectures automatically, saving the tedious work of designing CNNs by hand. In this paper, a new hybrid differential evolution (DE) algorithm with an additional crossover operator, named DECNN, is proposed to evolve CNN architectures of any length. The proposed DECNN method contains three new ideas. First, an existing effective encoding scheme is refined to cater for variable-length CNN architectures. Second, new mutation and crossover operators are developed for variable-length DE to optimise the hyperparameters of CNNs. Finally, a second crossover is introduced to evolve the depth of the CNN architectures. The proposed algorithm is tested on six widely used benchmark datasets, and the results are compared to 12 state-of-the-art methods, showing that the proposed method is strongly competitive with the state of the art. Furthermore, the proposed method is compared with IPPSO, a particle swarm optimisation method that uses a similar encoding strategy, and DECNN outperforms IPPSO in terms of accuracy.
Comment: Accepted by The Australasian Joint Conference on Artificial Intelligence 201
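
To make the variable-length DE idea concrete, the sketch below shows one plausible reading of the three operators: layer-wise mutation and crossover over the depth the parents share, plus a second, tail-swapping crossover that changes network depth. The encoding (one hyperparameter vector per layer) and the operator details (`F`, `CR`, the cut points) are illustrative assumptions, not the paper's exact DECNN operators.

```python
import random

F, CR = 0.5, 0.9  # assumed DE scale factor and crossover rate

def mutate(a, b, c):
    """DE/rand/1 mutation applied layer-wise over the depth all parents share."""
    depth = min(len(a), len(b), len(c))
    return [[ai + F * (bi - ci) for ai, bi, ci in zip(a[d], b[d], c[d])]
            for d in range(depth)]

def crossover(target, donor):
    """Binomial crossover on the layers both individuals share; the target
    keeps any extra layers beyond the common depth."""
    depth = min(len(target), len(donor))
    trial = [donor[d] if random.random() < CR else target[d]
             for d in range(depth)]
    return trial + target[depth:]

def depth_crossover(x, y):
    """A second, length-changing crossover: swap tails at random cut points,
    so offspring depths can differ from both parents."""
    cx, cy = random.randint(1, len(x)), random.randint(1, len(y))
    return x[:cx] + y[cy:], y[:cy] + x[cx:]

# Example: each layer is encoded as a hyperparameter vector [filters, kernel, stride].
p1 = [[32, 3, 1], [64, 3, 1]]
p2 = [[16, 5, 2], [32, 3, 1], [64, 5, 1]]
p3 = [[8, 3, 1], [128, 3, 2]]
trial = crossover(p1, mutate(p1, p2, p3))
child_a, child_b = depth_crossover(p1, p2)
```

Restricting mutation and crossover to the common prefix keeps the arithmetic well defined across parents of different depths, while the tail swap is what lets depth itself evolve.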
Bat Algorithm: Literature Review and Applications
The bat algorithm (BA) is a bio-inspired algorithm developed by Yang in 2010 and has been found to be very efficient; as a result, the literature on it has expanded significantly in the last three years. This paper provides a timely review of the bat algorithm and its new variants. A wide range of diverse applications and case studies are also reviewed and briefly summarized here. Further research topics are also discussed.
Comment: 10 page
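
For readers new to BA, the compact sketch below implements the standard algorithm as described by Yang (2010): frequency-tuned global moves toward the best bat, a pulse-rate-gated local random walk, decreasing loudness, and increasing pulse rate. Parameter values and the sphere test function are illustrative assumptions.

```python
import numpy as np

def bat_algorithm(objective, dim=10, n_bats=30, iters=1000,
                  f_min=0.0, f_max=2.0, alpha=0.9, gamma=0.9,
                  lower=-5.0, upper=5.0):
    rng = np.random.default_rng(0)
    x = rng.uniform(lower, upper, (n_bats, dim))   # bat positions
    v = np.zeros((n_bats, dim))                    # velocities
    A = np.ones(n_bats)                            # loudness, decays over time
    r0 = rng.uniform(0, 1, n_bats)                 # initial pulse emission rates
    r = r0.copy()
    fitness = np.array([objective(xi) for xi in x])
    best = x[fitness.argmin()].copy()

    for t in range(1, iters + 1):
        for i in range(n_bats):
            # Frequency tuning drives the global move toward the best bat.
            f = f_min + (f_max - f_min) * rng.random()
            v[i] += (x[i] - best) * f
            cand = np.clip(x[i] + v[i], lower, upper)
            # Local random walk around the best solution, gated by pulse rate.
            if rng.random() > r[i]:
                cand = np.clip(best + 0.01 * A.mean() * rng.standard_normal(dim),
                               lower, upper)
            fc = objective(cand)
            # Accept improvements probabilistically via loudness, then update
            # loudness (down) and pulse rate (up), as in Yang (2010).
            if fc <= fitness[i] and rng.random() < A[i]:
                x[i], fitness[i] = cand, fc
                A[i] *= alpha
                r[i] = r0[i] * (1.0 - np.exp(-gamma * t))
            if fc <= fitness.min():
                best = cand.copy()
    return best, float(objective(best))

# Example: minimise the sphere function in 10 dimensions.
best, val = bat_algorithm(lambda z: float(np.sum(z ** 2)))
```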
Finding High-Dimensional D-Optimal Designs for Logistic Models via Differential Evolution
D-optimal designs are frequently used in controlled experiments to obtain the most accurate estimate of model parameters at minimal cost. Finding them can be a challenging task, especially when there are many factors in a nonlinear model. As the number of factors becomes large and the factors interact with one another, there are many more variables to optimize, and the D-optimal design problem becomes high-dimensional and non-separable. Consequently, premature convergence issues arise: candidate solutions get trapped in local optima, and the classical gradient-based optimization approaches to searching for D-optimal designs rarely succeed. We propose a specially designed version of differential evolution (DE), a representative gradient-free optimization approach, to solve such high-dimensional optimization problems. The proposed DE uses a new novelty-based mutation strategy to explore the various regions of the search space. Each region is explored differently from previously explored regions, so the diversity of the population can be preserved. The novelty-based mutation strategy is combined with two common DE mutation strategies to balance exploration and exploitation during the early and medium stages of the evolution. Additionally, we adapt the control parameters of DE as the evolution proceeds. Using logistic models with several factors on various design spaces as examples, our simulation results show that our algorithm can find D-optimal designs efficiently and that it outperforms its competitors. As an application, we apply our algorithm to re-design a 10-factor car refueling experiment with discrete and continuous factors and selected pairwise interactions. Our proposed algorithm consistently outperformed the other algorithms and found a more efficient D-optimal design for this problem.
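
As background for the objective any such DE variant maximises, the sketch below computes the D-criterion, the log-determinant of the Fisher information, for a logistic model on a candidate set of support points. The model form (intercept plus main effects), equal design weights, and the parameter guess are simplifying assumptions; the paper's designs also optimise weights and include interaction terms.

```python
import numpy as np

def logistic_information(points, beta):
    """Fisher information M = sum_i w_i * p_i * (1 - p_i) * f(x_i) f(x_i)^T
    for an equally weighted design on the given support points."""
    w = 1.0 / len(points)
    M = np.zeros((len(beta), len(beta)))
    for x in points:
        f = np.concatenate(([1.0], x))          # regressors: intercept + factors
        p = 1.0 / (1.0 + np.exp(-(f @ beta)))   # logistic response probability
        M += w * p * (1.0 - p) * np.outer(f, f)
    return M

def d_criterion(flat, n_points, dim, beta):
    """Objective a DE variant would maximise: log det M (-inf if singular)."""
    pts = flat.reshape(n_points, dim)
    sign, logdet = np.linalg.slogdet(logistic_information(pts, beta))
    return logdet if sign > 0 else -np.inf

# Example: score 4 random support points in [-1, 1]^2 at an assumed
# parameter guess (locally D-optimal designs require one).
rng = np.random.default_rng(0)
beta = np.array([0.5, -1.0, 1.0])
print(d_criterion(rng.uniform(-1, 1, 8), n_points=4, dim=2, beta=beta))
```

Because the design points enter the criterion only through `p * (1 - p)` and the outer products, the surface is multimodal in high dimensions, which is exactly why gradient-free, diversity-preserving search is attractive here.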
Metaheuristic design of feedforward neural networks: a review of two decades of research
Over the past two decades, feedforward neural network (FNN) optimization has been a key interest among researchers and practitioners across multiple disciplines. FNN optimization is often viewed from various perspectives: the optimization of weights, network architecture, activation nodes, learning parameters, the learning environment, etc. Researchers adopted such different viewpoints mainly to improve the FNN's generalization ability. Gradient-descent algorithms such as backpropagation have been widely applied to optimize FNNs, and their success is evident from the FNN's application to numerous real-world problems. However, due to the limitations of gradient-based optimization methods, metaheuristic algorithms, including evolutionary algorithms, swarm intelligence, etc., are still being widely explored by researchers aiming to obtain a well-generalized FNN for a given problem. This article attempts to summarize a broad spectrum of FNN optimization methodologies, covering both conventional and metaheuristic approaches. It also tries to connect the various research directions that have emerged from FNN optimization practices, such as evolving neural networks (NNs), cooperative coevolution NNs, complex-valued NNs, deep learning, extreme learning machines, quantum NNs, etc. Additionally, it presents interesting research challenges for future work to cope with the present information-processing era.
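
As a concrete instance of the metaheuristic viewpoint the review surveys, the sketch below trains a tiny FNN without gradients by letting differential evolution search the flattened weight vector directly. The network size, DE parameters, and toy XOR task are illustrative assumptions, not drawn from the article.

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])               # XOR targets

H = 4                                            # hidden units
n_w = 2 * H + H + H + 1                          # W1 (2xH) + b1 (H) + W2 (H) + b2 (1)

def forward(w, X):
    """Unpack the flat weight vector into a 2-H-1 network and run it."""
    W1 = w[:2 * H].reshape(2, H)
    b1 = w[2 * H:3 * H]
    W2 = w[3 * H:4 * H]
    b2 = w[-1]
    h = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

def loss(w):
    return float(np.mean((forward(w, X) - y) ** 2))

# DE/rand/1/bin searching the flattened weight vector, no gradients needed.
NP, F, CR = 30, 0.6, 0.9
pop = rng.normal(0.0, 1.0, (NP, n_w))
fit = np.array([loss(p) for p in pop])
for _ in range(500):
    for i in range(NP):
        idx = rng.choice([j for j in range(NP) if j != i], 3, replace=False)
        a, b, c = pop[idx]
        trial = np.where(rng.random(n_w) < CR, a + F * (b - c), pop[i])
        ft = loss(trial)
        if ft < fit[i]:                          # greedy selection
            pop[i], fit[i] = trial, ft

print("best MSE:", fit.min())
```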