Use of backpropagation and differential evolution algorithms to training MLPs
Authors
LC Camargo
ATR Pozo
HC Tissot
Publication date
6 January 2014
Publisher
Institute of Electrical and Electronics Engineers (IEEE)
Abstract
Artificial Neural Networks (ANNs) are often trained to find a general solution in problems where a pattern needs to be extracted, such as data classification. The feedforward neural network (FFNN) is one of the ANN architectures, and the multilayer perceptron (MLP) is a type of FFNN. Based on gradient descent, backpropagation (BP) is one of the most widely used algorithms for MLP training. Evolutionary algorithms, including the Differential Evolution (DE) algorithm, can also be used to train MLPs. In this paper, BP and DE are used to train MLPs and the two are compared across four approaches: (a) backpropagation, (b) DE with fixed parameter values, (c) DE with adaptive parameter values and (d) a hybrid alternative using both DE+BP algorithms. © 2013 IEEE
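
The following is a minimal sketch, not the authors' code, of what approach (b) from the abstract looks like: an MLP whose flattened weight vector is evolved with Differential Evolution (DE/rand/1/bin) using fixed control parameters. The network size, the F and CR settings, and the toy XOR dataset are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Toy dataset (assumed for illustration): XOR, 2 inputs -> 1 output.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

N_IN, N_HID, N_OUT = 2, 4, 1
DIM = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # total weights + biases

def unpack(vec):
    # Split a flat parameter vector into the MLP's weight matrices and biases.
    i = 0
    W1 = vec[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = vec[i:i + N_HID]; i += N_HID
    W2 = vec[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = vec[i:i + N_OUT]
    return W1, b1, W2, b2

def forward(vec, X):
    # Feedforward pass of a 2-4-1 MLP with sigmoid activations.
    W1, b1, W2, b2 = unpack(vec)
    h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

def fitness(vec):
    # Mean squared error on the training set (lower is better).
    return float(np.mean((forward(vec, X) - y) ** 2))

# DE with fixed parameter values (approach (b)); settings are assumptions.
NP, F, CR, GENERATIONS = 30, 0.8, 0.9, 500
pop = rng.uniform(-1.0, 1.0, size=(NP, DIM))
fit = np.array([fitness(ind) for ind in pop])

for gen in range(GENERATIONS):
    for i in range(NP):
        # DE/rand/1 mutation: combine three distinct individuals other than i.
        a, b, c = rng.choice([j for j in range(NP) if j != i], 3, replace=False)
        mutant = pop[a] + F * (pop[b] - pop[c])
        # Binomial crossover, forcing at least one gene from the mutant.
        cross = rng.random(DIM) < CR
        cross[rng.integers(DIM)] = True
        trial = np.where(cross, mutant, pop[i])
        # Greedy selection: keep the trial only if it does not worsen fitness.
        f_trial = fitness(trial)
        if f_trial <= fit[i]:
            pop[i], fit[i] = trial, f_trial

best = pop[np.argmin(fit)]
print("best MSE:", fit.min())
print("predictions:", forward(best, X).ravel().round(2))

The paper's other variants would change only the outer loop: approach (c) adapts F and CR during the run, and the hybrid (d) refines candidate solutions with backpropagation steps in addition to the DE search.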