
Parity Still Isn't a Generalisation Problem

By R.I. Damper

Abstract

Clark and Thornton take issue with my claim that parity is not a generalisation problem, and that nothing can be inferred about back-propagation in particular, or learning in general, from failures of parity-generalisation. They advance arguments to support their contention that generalisation is a relevant issue. In this continuing commentary, I examine generalisation more closely in order to refute these arguments. Different learning algorithms will have different patterns of failure: back-propagation has no special status in this respect. This is not to deny that a particular algorithm might fortuitously happen to produce the 'intended' function in an (oxymoronic) parity-generalisation task. Clark and Thornton (C&T) distinguish between straightforward type-1 problems which are "statistical" and problems of type-2 which are "relational". The former are learnable by an 'uninformed' learning device, they say, while the latter require some sort of recoding to become learnable. C&T cit…
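
For readers unfamiliar with the task under discussion, the sketch below shows one form a "parity-generalisation" experiment can take: a small network is trained by back-propagation on all but one pattern of 3-bit parity, and its output on the held-out pattern is then inspected. This is an illustrative sketch only, not the author's experiment; the implementation (NumPy), network size, learning rate, epoch count and choice of held-out pattern are all assumptions made here for demonstration.

    # Minimal sketch of a "parity-generalisation" task (illustrative only).
    import numpy as np

    rng = np.random.default_rng(0)

    # All 3-bit inputs and their parity (odd number of 1s -> 1).
    X = np.array([[(i >> b) & 1 for b in range(3)] for i in range(8)], dtype=float)
    y = X.sum(axis=1) % 2

    held_out = 5                              # pattern withheld from training (assumption)
    train = np.delete(np.arange(8), held_out)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # One hidden layer of 4 units, trained by plain gradient descent.
    W1 = rng.normal(scale=0.5, size=(3, 4)); b1 = np.zeros(4)
    W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)

    lr = 0.5
    for _ in range(20000):
        h = sigmoid(X[train] @ W1 + b1)       # hidden activations
        out = sigmoid(h @ W2 + b2).ravel()    # network output
        err = out - y[train]                  # output error
        # Back-propagate the error through the two layers.
        d_out = (err * out * (1 - out))[:, None]
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * X[train].T @ d_h; b1 -= lr * d_h.sum(axis=0)

    pred = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2).ravel()
    print("held-out input :", X[held_out], "target parity:", y[held_out])
    print("network output :", round(float(pred[held_out]), 3))
    # Whether this matches the 'intended' parity function depends on the random
    # initialisation and training regime -- which is consistent with the abstract's
    # point that nothing general follows from such failures (or successes).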

Year: 2007
OAI identifier: oai:CiteSeerX.psu:10.1.1.32.4351
Provided by: CiteSeerX
Sorry, we are unable to provide the full text but you may find it at the following location(s):
  • http://citeseerx.ist.psu.edu/v... (external link)
  • http://www.bib.ecs.soton.ac.uk... (external link)

