HOAX: A Hyperparameter Optimization Algorithm Explorer for Neural Networks
Computational chemistry has become an important tool to predict and
understand molecular properties and reactions. Even though recent years have
seen a significant growth in new algorithms and computational methods that
speed up quantum chemical calculations, the bottleneck for trajectory-based
methods to study photoinduced processes is still the huge number of electronic
structure calculations. In this work, we present an innovative solution, in
which the number of electronic structure calculations is drastically reduced,
by employing machine learning algorithms and methods borrowed from the realm of
artificial intelligence. However, applying these algorithms effectively
requires finding optimal hyperparameters, which remains a challenge in itself.
Here we present an automated user-friendly framework, HOAX, to perform the
hyperparameter optimization for neural networks, which bypasses the need for a
lengthy manual process. The neural network-generated potential energy surfaces
(PESs) reduce the computational cost compared to ab initio-based PESs. We
perform a comparative investigation of the performance of different
hyperparameter optimization algorithms, namely grid search, simulated
annealing, a genetic algorithm, and a Bayesian optimizer, in finding the
optimal hyperparameters for constructing a well-performing neural network to
fit the PESs of small organic molecules. Our results show that this automated
toolkit not only provides a straightforward way to perform hyperparameter
optimization, but also yields neural network-based PESs in reasonable
agreement with the ab initio-based PESs.
Comment: 18 pages
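To illustrate the simplest of the compared strategies, the following is a minimal sketch of grid-search hyperparameter optimization, not the HOAX framework itself: a tiny NumPy neural network is fit to a Morse-like 1D curve standing in for a PES, and an exhaustive grid over two illustrative hyperparameters (hidden-layer width and learning rate) selects the configuration with the lowest training error. All function names and data here are illustrative assumptions.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Toy "PES": a 1D Morse-like potential as stand-in training data.
r = np.linspace(0.5, 3.0, 200)
energy = (1 - np.exp(-(r - 1.0))) ** 2

def train_mlp(hidden, lr, epochs=2000):
    """Train a one-hidden-layer tanh MLP on (r, energy); return final MSE."""
    W1 = rng.normal(0, 1, (1, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 1, (hidden, 1)); b2 = np.zeros(1)
    x, y = r[:, None], energy[:, None]
    for _ in range(epochs):
        h = np.tanh(x @ W1 + b1)
        pred = h @ W2 + b2
        # Backpropagate the mean-squared-error gradient through both layers.
        dpred = 2 * (pred - y) / len(x)
        dW2 = h.T @ dpred; db2 = dpred.sum(0)
        dh = (dpred @ W2.T) * (1 - h ** 2)
        dW1 = x.T @ dh; db1 = dh.sum(0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    h = np.tanh(x @ W1 + b1)
    return float(((h @ W2 + b2 - y) ** 2).mean())

# Exhaustive grid search: evaluate every hyperparameter combination.
grid = {"hidden": [4, 8, 16], "lr": [0.01, 0.1]}
best = min(
    (dict(zip(grid, combo)) for combo in itertools.product(*grid.values())),
    key=lambda cfg: train_mlp(**cfg),
)
print("best hyperparameters:", best)
```

The same outer loop structure applies to the other optimizers in the comparison: only the rule for proposing the next configuration changes (random perturbations accepted by temperature for simulated annealing, crossover/mutation for a genetic algorithm, a surrogate-model acquisition function for Bayesian optimization).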
Optimization of Evolutionary Neural Networks Using Hybrid Learning Algorithms
Evolutionary artificial neural networks (EANNs) refer to a special class of
artificial neural networks (ANNs) in which evolution is another fundamental
form of adaptation in addition to learning. Evolutionary algorithms are used to
adapt the connection weights, network architecture and learning algorithms
according to the problem environment. Even though evolutionary algorithms are
well known as efficient global search algorithms, very often they miss the best
local solutions in the complex solution space. In this paper, we propose a
hybrid meta-heuristic learning approach combining evolutionary learning and
local search methods (using first- and second-order error information) to
improve learning and achieve faster convergence than a direct evolutionary
approach.
The proposed technique is tested on three different chaotic time series and the
test results are compared with some popular neuro-fuzzy systems and a recently
developed cutting angle method of global optimization. Empirical results
reveal that the proposed technique is efficient despite its computational
complexity.
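The hybrid scheme can be sketched as a memetic algorithm: an evolutionary outer loop (mutation plus truncation selection) with first-order local search applied to the elite individual each generation. This is a minimal illustration using the Rosenbrock function as a stand-in objective, not the paper's method, which trains neural networks on chaotic time series and also exploits second-order error information.

```python
import numpy as np

rng = np.random.default_rng(1)

def rosenbrock(p):
    x, y = p
    return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

def rosenbrock_grad(p):
    x, y = p
    return np.array([
        -2 * (1 - x) - 400 * x * (y - x ** 2),
        200 * (y - x ** 2),
    ])

def local_search(p, lr=1e-3, steps=50, clip=10.0):
    """First-order local refinement: norm-clipped gradient descent."""
    for _ in range(steps):
        g = rosenbrock_grad(p)
        g = g * min(1.0, clip / (np.linalg.norm(g) + 1e-12))
        p = p - lr * g
    return p

# Evolutionary outer loop with a local-search (memetic) refinement step.
pop = rng.normal(0, 2, (30, 2))
for gen in range(100):
    pop = pop[np.argsort([rosenbrock(p) for p in pop])]  # rank by fitness
    pop[0] = local_search(pop[0])                        # refine the elite
    children = pop[:10] + rng.normal(0, 0.3, (10, 2))    # mutate survivors
    pop = np.vstack([pop[:20], children])

best = min(pop, key=rosenbrock)
print("best point:", best, "f =", rosenbrock(best))
```

The evolutionary step provides the global exploration, while the gradient step supplies the fast local convergence that a purely evolutionary search tends to miss; a second-order variant would replace the gradient update with a (quasi-)Newton step.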