3 research outputs found
Aprendizado supervisionado usando redes neurais construtivas (Supervised learning using constructive neural networks).
Constructive neural learning is a neural learning model that does not assume a fixed
network topology before training begins. The main characteristic of this learning model is the
dynamic construction of the network's hidden layers, which takes place as training proceeds.
This work investigates three topics related to constructive neural learning, namely:
algorithms for training an individual TLU (threshold logic unit), constructive neural algorithms
for two-class problems, and constructive neural algorithms for multiclass problems.
The first research topic is approached by discussing several TLU training algorithms,
namely Perceptron, Pocket, Thermal, Modified Thermal, MinOver, and BCP.
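To give a concrete sense of what training an individual TLU involves, the sketch below implements a minimal Pocket-style procedure: ordinary perceptron updates, plus a "pocket" that keeps the best weight vector found so far (the ratchet variant). It is a sketch under simple assumptions; the function name, parameters, and the use of NumPy are illustrative and not the thesis's actual implementation.

    import numpy as np

    def pocket_train(X, y, n_iter=1000, rng=None):
        """Pocket algorithm sketch for a single TLU.

        X: (n_samples, n_features) inputs; y: labels in {-1, +1}.
        Returns the 'pocket' weights, with the bias folded in as the last weight.
        """
        rng = rng or np.random.default_rng(0)
        Xb = np.hstack([X, np.ones((len(X), 1))])    # fold the bias into the inputs
        w = np.zeros(Xb.shape[1])                    # current perceptron weights
        pocket_w, pocket_acc = w.copy(), 0.0         # best weights seen so far

        for _ in range(n_iter):
            i = rng.integers(len(Xb))                # pick a random training example
            if np.sign(Xb[i] @ w) != y[i]:           # misclassified: perceptron update
                w = w + y[i] * Xb[i]
                acc = np.mean(np.sign(Xb @ w) == y)  # ratchet check: keep w only if better
                if acc > pocket_acc:
                    pocket_w, pocket_acc = w.copy(), acc
        return pocket_w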
This work approaches constructive neural learning for two-class classification tasks by
initially reviewing the Tower, Pyramid, Tiling, and Upstart algorithms, aiming at their multiclass
versions. Next, five constructive neural algorithms, namely Shift, Offset, PTI, Perceptron
Cascade, and Sequential, are investigated, and two hybrid algorithms are proposed: Hybrid
Tiling, which does not restrict TLU training to a single algorithm, and OffTiling, a
collaborative approach based on Tiling and Offset.
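To make the constructive idea itself concrete, below is a minimal sketch in the spirit of the Tower construction: each new TLU receives the original attributes plus the output of the unit below it, and units are added while training accuracy improves. It reuses the hypothetical pocket_train helper sketched above and is only an illustration, not the thesis's implementation of Tower.

    import numpy as np

    def tower_train(X, y, max_units=10, n_iter=1000):
        """Tower-style construction sketch: stack TLUs, each seeing the original
        inputs plus the output of the previous top unit, while accuracy improves."""
        units, inputs, best_acc = [], X, 0.0
        for _ in range(max_units):
            w = pocket_train(inputs, y, n_iter=n_iter)        # train the new top unit
            Xb = np.hstack([inputs, np.ones((len(inputs), 1))])
            out = np.where(Xb @ w >= 0, 1, -1)                # its +-1 output on the data
            acc = np.mean(out == y)
            if acc <= best_acc:                               # stop when no improvement
                break
            units.append(w)
            best_acc = acc
            inputs = np.hstack([X, out.reshape(-1, 1)])       # next unit sees X plus this output
        return units

At prediction time an example is fed through the units in order, each unit also receiving the previous unit's output, and the top unit's response is the network's answer.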
Multiclass constructive neural learning is approached by investigating TLU training
algorithms that handle more than two classes, as well as multiclass versions of the
Tower, Pyramid, Tiling, Upstart, and Perceptron Cascade algorithms.
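One common way for a group of TLUs to handle more than two classes is a winner-take-all arrangement: one weight vector per class, with the predicted class being the one whose unit produces the largest activation. The sketch below shows only that prediction rule, as an assumed illustration rather than the specific multiclass schemes studied in the thesis.

    import numpy as np

    def wta_predict(X, W):
        """Winner-take-all prediction with one TLU (weight vector) per class.

        X: (n_samples, n_features) inputs; W: (n_classes, n_features + 1) weights
        with the bias folded in as the last column. Returns, for each example,
        the index of the class whose unit has the largest activation.
        """
        Xb = np.hstack([X, np.ones((len(X), 1))])
        return np.argmax(Xb @ W.T, axis=1)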
This research work also describes an empirical evaluation of all the investigated
algorithms, conducted over several knowledge domains; results are discussed and analyzed.
Funding: Financiadora de Estudos e Projetos
Constructive neural learning is a neural learning model that does not assume a fixed
network topology defined before training begins. The main characteristic of this learning
model is the dynamic construction of the network's hidden layers, as they become
necessary during training.
This work investigates three research directions related to constructive neural
learning, namely: algorithms for training TLUs, constructive neural algorithms for
two-class problems, and constructive neural algorithms for handling multiclass problems.
Regarding the first research direction, the TLU training algorithms discussed are
Perceptron, Pocket, PMR, Thermal, Modified Thermal, MinOver, and BCP.
In the research direction concerned with constructive neural learning for two classes,
the Tower, Pyramid, Tiling, and Upstart algorithms are reviewed so that their multiclass
versions can later be addressed. The constructive neural algorithms Shift, Offset, PTI,
Perceptron Cascade, and Sequential are investigated, and two hybrid algorithms are
proposed: Hybrid Tiling, which does not restrict TLU training to a single algorithm, and
OffTiling, which combines the Tiling and Offset algorithms.
The direction focusing on multiclass constructive neural learning investigates TLU
training algorithms for problems involving more than two classes, and presents and
discusses the multiclass versions of the Tower, Pyramid, Tiling, Upstart, and Perceptron
Cascade algorithms.
The work also describes an empirical evaluation of the investigated algorithms over
several knowledge domains, and discusses and analyzes the obtained results.
Enhancing classification performance using attribute-oriented functionally expanded data
Funding: Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES); Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
There are many data pre-processing techniques that aim at enhancing the quality of the classifiers induced by machine learning algorithms. Functional expansion (FE) is one such technique, originally proposed to aid neural network based classification. Despite being successfully employed, works reported in the literature use the same functional expansion, with the same expansion size (ES), applied to every attribute that describes the training data. In this paper it is argued that FE and ES can be attribute-oriented and that, by choosing the most suitable FE-ES pair for each attribute, the input data representation improves and, as a consequence, learning algorithms can induce better classifiers. This paper proposes, as a pre-processing step to learning algorithms, a method that uses a genetic algorithm to search for a suitable FE-ES pair for each data attribute, aiming at producing functionally expanded training data. Experimental results using functionally expanded training sets, considering four classification algorithms (KNN, CART, SVM, and RBNN), have confirmed the hypothesis: searching for FE-ES pairs in an attribute-oriented fashion has yielded statistically significantly better results than learning from the original data or from the single best FE-ES pair applied to all attributes. (C) 2017 Elsevier B.V. All rights reserved.
The authors thank CAPES and CNPq for the research grant received.
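As a rough illustration of what an attribute-oriented functional expansion looks like, the sketch below expands each attribute with its own (expansion type, expansion size) pair, the kind of candidate a genetic algorithm chromosome could encode and evaluate by the quality of the classifier induced from the expanded data. The expansion catalogue, basis functions, and function names are illustrative assumptions, not the paper's exact formulation.

    import numpy as np

    # Hypothetical catalogue of functional expansions: each maps a (normalised)
    # attribute value x to its first k basis-function values.
    EXPANSIONS = {
        "trigonometric": lambda x, k: [np.sin((i // 2 + 1) * np.pi * x) if i % 2 == 0
                                       else np.cos((i // 2 + 1) * np.pi * x)
                                       for i in range(k)],
        "power":         lambda x, k: [x ** (i + 1) for i in range(k)],
    }

    def expand_dataset(X, pairs):
        """Attribute-oriented functional expansion sketch.

        X: (n_samples, n_attributes) numeric data, ideally normalised to [0, 1].
        pairs: one (expansion_name, expansion_size) pair per attribute, e.g. the
        candidate encoded by one GA chromosome.
        Returns the functionally expanded data set.
        """
        columns = []
        for j, (name, size) in enumerate(pairs):
            expand = EXPANSIONS[name]
            columns.append(np.array([expand(x, size) for x in X[:, j]]))
        return np.hstack(columns)

    # Usage: attribute 0 gets a trigonometric expansion of size 4,
    # attribute 1 a power expansion of size 2.
    X = np.random.rand(5, 2)
    X_expanded = expand_dataset(X, [("trigonometric", 4), ("power", 2)])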