Optimization of Evolutionary Neural Networks Using Hybrid Learning Algorithms
Evolutionary artificial neural networks (EANNs) refer to a special class of
artificial neural networks (ANNs) in which evolution is another fundamental
form of adaptation in addition to learning. Evolutionary algorithms are used to
adapt the connection weights, network architecture and learning algorithms
according to the problem environment. Although evolutionary algorithms are
well known as efficient global search algorithms, they often miss the best
local solutions in the complex solution space. In this paper, we propose a
hybrid meta-heuristic learning approach that combines evolutionary learning
with local search methods (using first- and second-order error information)
to improve learning and achieve faster convergence compared with a direct
evolutionary approach.
The proposed technique is tested on three different chaotic time series and the
test results are compared with some popular neuro-fuzzy systems and a recently
developed cutting angle method of global optimization. Empirical results reveal
that the proposed technique is efficient in spite of its computational
complexity.
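
A minimal sketch of the kind of hybrid scheme described above, assuming a small
tanh network, a (mu + lambda) evolution strategy for the global search, and plain
gradient descent (first-order error information only) for the local refinement;
the logistic map stands in for the paper's unnamed chaotic benchmark series, and
none of these specific choices are taken from the paper itself.

import numpy as np

rng = np.random.default_rng(0)

# Toy chaotic data: predict x[t+1] from x[t] on the logistic map.
x = np.empty(500)
x[0] = 0.3
for t in range(499):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])
X, y = x[:-1, None], x[1:]

H = 8                        # hidden units of a 1-H-1 MLP with tanh
DIM = H + H + H + 1          # W1 (1xH), b1 (H), W2 (Hx1), b2 (1), flattened

def unpack(w):
    W1 = w[:H].reshape(1, H)
    b1 = w[H:2 * H]
    W2 = w[2 * H:3 * H].reshape(H, 1)
    b2 = w[3 * H:]
    return W1, b1, W2, b2

def mse(w):
    W1, b1, W2, b2 = unpack(w)
    pred = np.tanh(X @ W1 + b1) @ W2 + b2
    return np.mean((pred.ravel() - y) ** 2)

def grad(w, eps=1e-5):
    # First-order error information via central finite differences.
    g = np.empty_like(w)
    for k in range(w.size):
        e = np.zeros_like(w)
        e[k] = eps
        g[k] = (mse(w + e) - mse(w - e)) / (2 * eps)
    return g

# Evolutionary global search: a (mu + lambda) evolution strategy over the weights.
mu, lam, sigma = 10, 40, 0.3
pop = rng.normal(0.0, 1.0, size=(mu, DIM))
for _ in range(100):
    parents = pop[rng.integers(mu, size=lam)]
    children = parents + sigma * rng.normal(size=(lam, DIM))
    union = np.vstack([pop, children])
    fitness = np.array([mse(w) for w in union])
    pop = union[np.argsort(fitness)[:mu]]        # keep the mu best individuals

# Local refinement of the evolved best individual (gradient descent).
best = pop[0].copy()
lr = 0.05
for _ in range(300):
    best -= lr * grad(best)

print(f"evolution only : MSE = {mse(pop[0]):.6f}")
print(f"hybrid EA + GD : MSE = {mse(best):.6f}")

A second-order refinement could replace the gradient step with a quasi-Newton
update, for example scipy.optimize.minimize(mse, best, method="BFGS"), matching
the abstract's mention of second-order error information.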
On the Trade-off Between Efficiency and Precision of Neural Abstraction
Neural abstractions have been recently introduced as formal approximations of
complex, nonlinear dynamical models. They comprise a neural ODE and a certified
upper bound on the error between the abstract neural network and the concrete
dynamical model. So far, neural abstractions have been obtained exclusively as
neural networks consisting entirely of ReLU activation functions, resulting
in neural ODE models that have piecewise affine dynamics, and which can be
equivalently interpreted as linear hybrid automata. In this work, we observe
that the utility of an abstraction depends on its use: some scenarios might
require coarse abstractions that are easier to analyse, whereas others might
require more complex, refined abstractions. We therefore consider neural
abstractions of alternative shapes, namely either piecewise constant or
nonlinear non-polynomial (specifically, obtained via sigmoidal activations). We
employ formal inductive synthesis procedures to generate neural abstractions
that result in dynamical models with these semantics. Empirically, we
demonstrate the trade-off that these different neural abstraction templates
have vis-a-vis their precision and synthesis time, as well as the time required
for their safety verification (done via reachability computation). We improve
existing synthesis techniques to enable abstraction of higher-dimensional
models, and additionally discuss the abstraction of complex neural ODEs to
improve the efficiency of reachability analysis for these models.

Comment: To appear at QEST 202
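
A minimal sketch of the template idea discussed above, assuming a toy
two-dimensional vector field and a single-hidden-layer network whose activation
is swapped between ReLU (piecewise affine), sigmoid (nonlinear non-polynomial)
and a hard step (piecewise constant); the error bound below is merely estimated
from samples, whereas the paper obtains a certified bound via formal inductive
synthesis, which this sketch does not attempt.

import numpy as np

rng = np.random.default_rng(1)

def f(x):
    # Concrete nonlinear 2-D dynamics dx/dt = f(x): a damped pendulum-like field.
    return np.stack([x[:, 1], -np.sin(x[:, 0]) - 0.5 * x[:, 1]], axis=1)

TEMPLATES = {
    "relu":    lambda z: np.maximum(z, 0.0),        # piecewise-affine abstraction
    "sigmoid": lambda z: 1.0 / (1.0 + np.exp(-z)),  # nonlinear non-polynomial
    "step":    lambda z: (z > 0.0).astype(float),   # piecewise-constant abstraction
}

def fit_abstraction(act, hidden=24, n_train=2048, n_check=20000):
    """Fit N(x) = act(x W1 + b1) W2 + b2 to f on a box domain and return
    an empirical (NOT certified) bound on ||N(x) - f(x)|| over that domain."""
    X = rng.uniform(-2.0, 2.0, size=(n_train, 2))
    W1 = rng.normal(0.0, 1.0, size=(2, hidden))
    b1 = rng.normal(0.0, 1.0, size=hidden)
    Hfeat = act(X @ W1 + b1)
    # Fit only the output layer in closed form (least squares) to keep the
    # sketch short; the paper trains the full network and certifies the bound.
    A = np.hstack([Hfeat, np.ones((n_train, 1))])
    sol, *_ = np.linalg.lstsq(A, f(X), rcond=None)
    W2, b2 = sol[:-1], sol[-1]
    Xc = rng.uniform(-2.0, 2.0, size=(n_check, 2))
    residual = act(Xc @ W1 + b1) @ W2 + b2 - f(Xc)
    return np.max(np.linalg.norm(residual, axis=1))

for name, act in TEMPLATES.items():
    print(f"{name:8s} template: empirical error bound ~ {fit_abstraction(act):.4f}")

Swapping the activation changes the semantics of the resulting neural ODE and
hence which verification machinery applies, which is the trade-off between
precision and analysis effort that the paper studies.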