
Vector Symbolic Architectures answer Jackendoff's challenges for cognitive neuroscience

By Dr Ross W. Gayler


Jackendoff (2002) posed four challenges that linguistic combinatoriality and rules of language present to theories of brain function. The essence of these problems is the question of how to neurally instantiate the rapid construction and transformation of the compositional structures that are typically taken to be the domain of symbolic processing. He contended that typical connectionist approaches fail to meet these challenges and that the dialogue between linguistic theory and cognitive neuroscience will be relatively unproductive until the importance of these problems is widely recognised and the challenges are answered by some technical innovation in connectionist modelling. This paper claims that a little-known family of connectionist models, Vector Symbolic Architectures, is able to meet Jackendoff's challenges.
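The core VSA operations the abstract alludes to (rapid construction and transformation of compositional structures in a fixed-width vector space) can be sketched concretely. The snippet below is an illustrative assumption, not the paper's exact formalism: it uses a multiply-add-style scheme over random bipolar hypervectors, where binding is elementwise multiplication (its own inverse) and bundling is addition, so a role-filler structure can be composed and then queried in constant time.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # high dimensionality makes random vectors quasi-orthogonal

def rand_vec():
    """Random bipolar (+1/-1) hypervector."""
    return rng.choice([-1, 1], size=D)

def sim(a, b):
    """Normalised dot product: near 0 for unrelated vectors, near 1 for identical."""
    return float(a @ b) / D

# Roles and fillers are independent random hypervectors (hypothetical example names).
AGENT, PATIENT = rand_vec(), rand_vec()
mary, john = rand_vec(), rand_vec()

# Compose "Mary chased John": bind each role to its filler by elementwise
# multiplication, then bundle the pairs by addition into a single vector.
sentence = AGENT * mary + PATIENT * john

# Query: multiplication by a bipolar vector is self-inverse, so
# AGENT * sentence = mary + (AGENT * PATIENT * john), i.e. the filler plus noise.
probe = AGENT * sentence
print(sim(probe, mary))   # high (close to 1): filler recovered
print(sim(probe, john))   # near 0: only crosstalk remains
```

Because binding and unbinding are single elementwise passes over fixed-width vectors, construction and transformation of the structure take constant time regardless of how many role-filler pairs are bundled, which is the property at stake in Jackendoff's challenges.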

Topics: Language, Computational Neuroscience, Artificial Intelligence
Year: 2003

Suggested articles


  1. (1997). A common framework for distributed representation schemes for compositional structure. In
  2. (2002). Corpus-based methods in language and speech processing.
  3. (1997). Data-oriented language processing: An overview.
  4. (1977). Distinctive features, categorical perception, and probability learning: Some applications of a neural model.
  5. (1994). Distributed representations and nested compositional structure. Doctoral dissertation.
  6. (2002). Foundations of language: Brain, meaning, grammar, evolution.
  7. (1997). Fully distributed representation.
  8. (1990). Tensor product variable binding and the representation of symbolic structures in connectionist systems.
