3 research outputs found

    Some counter-examples to Page's notion of “localist”

    No full text

    Synaptic noise as a means of implementing weight-perturbation learning

    No full text
    Weight-perturbation (WP) algorithms for supervised and reinforcement learning offer improved biological plausibility over backpropagation because they require less circuitry to realize in neural hardware. All such algorithms rely on some information source, a means of correlating weight changes with changes in output error, to adjust weights. This paper explores the hypothesis that biological synaptic noise might serve as the substrate through which weight perturbation is implemented. We examine the basic synaptic noise hypothesis (BSNH), which embodies the weakest assumptions about the underlying neural circuitry required to implement WP algorithms. The paper identifies relevant biological constraints consistent with the BSNH, taxonomizes existing WP algorithms by their consistency with those constraints, and proposes a new WP algorithm that is fully consistent with them. Simulation studies comparing the learning effectiveness of these algorithms show that all of them can support traditional neural network learning tasks and have similar generalization characteristics, although the results suggest a trade-off between learning efficiency and biological accuracy. This establishes the basic result that biological synaptic noise, coupled with an appropriate reward signal, can be exploited to implement WP algorithms for neural network learning.
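    The abstract describes the core WP loop: perturb the weights with noise, observe the resulting change in output error, and reinforce perturbations that reduce it. The following is a minimal sketch of a generic weight-perturbation update, not the paper's specific algorithm; the Gaussian noise model, squared-error loss, and hyperparameters sigma and eta are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(0)

        def forward(W, x):
            # Single-layer network with tanh output.
            return np.tanh(W @ x)

        def loss(W, x, y):
            # Squared output error (illustrative choice).
            return float(np.sum((forward(W, x) - y) ** 2))

        def wp_step(W, x, y, sigma=0.01, eta=0.1):
            # Perturb every weight with Gaussian "synaptic noise".
            noise = sigma * rng.standard_normal(W.shape)
            # Compare output error before and after the perturbation.
            delta_err = loss(W + noise, x, y) - loss(W, x, y)
            # Move along the perturbation when it lowered the error,
            # against it otherwise: a stochastic gradient estimate.
            return W - eta * (delta_err / sigma**2) * noise

        W = 0.1 * rng.standard_normal((2, 3))
        x, y = rng.standard_normal(3), np.array([0.5, -0.5])
        for _ in range(1000):
            W = wp_step(W, x, y)

    For small sigma, the update direction approximates the error gradient in expectation, which is why such schemes can support the learning tasks the abstract reports while using far less machinery than backpropagation.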