
Training Multi-layer Perceptrons Using MiniMin Approach

By Liefeng Bo, Ling Wang and Licheng Jiao

Abstract

Multi-layer perceptrons (MLPs) have been widely used in classification and regression tasks, and improving their training speed remains an active field of research. Instead of the classical method, we train MLPs with a MiniMin model, which ensures that the weights of the last layer are optimal at each step. Our method yields a significant improvement in training speed on several large benchmark data sets.
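The core idea stated in the abstract, keeping the last-layer weights optimal at every training step, can be sketched as an alternating scheme: solve a linear least-squares problem for the output weights given the current hidden layer, then take a gradient step on the hidden-layer weights. The following is a minimal illustrative sketch of that idea, not the authors' actual MiniMin algorithm; all names, hyperparameters, and the toy data are assumptions.

```python
import numpy as np

# Hypothetical sketch: at each step the output weights W2 are set to their
# exact least-squares optimum given the current hidden activations, while
# the hidden-layer weights W1, b1 are updated by ordinary gradient descent.
# Hyperparameters and the toy regression problem are illustrative only.

rng = np.random.default_rng(0)

# Toy regression data: y = sin(x)
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(X)

n_hidden = 20
W1 = rng.normal(scale=0.5, size=(1, n_hidden))  # input -> hidden weights
b1 = np.zeros(n_hidden)                         # hidden biases

def hidden(X, W1, b1):
    """Hidden-layer activations of a one-hidden-layer tanh MLP."""
    return np.tanh(X @ W1 + b1)

lr = 0.01
for step in range(500):
    H = hidden(X, W1, b1)
    # Inner minimization: optimal last-layer weights by linear least squares
    W2, *_ = np.linalg.lstsq(H, y, rcond=None)
    err = H @ W2 - y                            # residual at the optimal W2
    # Outer step: backpropagate through tanh and update hidden weights
    dH = (err @ W2.T) * (1.0 - H**2)
    W1 -= lr * (X.T @ dH) / len(X)
    b1 -= lr * dH.mean(axis=0)

# Final fit quality with the optimal output layer
H = hidden(X, W1, b1)
W2, *_ = np.linalg.lstsq(H, y, rcond=None)
mse = float(np.mean((H @ W2 - y) ** 2))
print(f"final MSE: {mse:.4f}")
```

Because the inner least-squares problem is solved exactly, the loss at each step is evaluated at the best possible output layer for the current hidden representation, which is what lets such schemes converge in fewer outer iterations than plain backpropagation over all layers.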

Year: 2011
OAI identifier: oai:CiteSeerX.psu:10.1.1.187.7748
Provided by: CiteSeerX
Download PDF:
Sorry, we are unable to provide the full text but you may find it at the following location(s):
  • http://citeseerx.ist.psu.edu/v... (external link)
  • http://www.cs.washington.edu/h... (external link)

