25 research outputs found

    A biologically inspired algorithm to deal with filter-overlap in retinal models

    No full text
    A multi-filter retinal model to simulate parallel processing by a population of retinal ganglion cells was proposed in [1] to test rank-order codes [2], a spike-latency based neural code. Dealing with filter-overlap in this model has been an area of concern [3,4]. This is because data redundancy induced by over-sampling of a point in space affects the quantity of salient information during rapid information transmission [5]. We propose a Filter-overlap Correction algorithm (FoCal) to deal with this problem of over-sampled data. The algorithm is based on the lateral inhibition technique [6] used by sensory neurons to deal with data redundancy [7], so that only salient information is transmitted through the optic-nerve bottleneck for rapid object detection and recognition.
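
    The correction can be pictured as a greedy, lateral-inhibition-style pass over the filter responses. The sketch below is a minimal illustration under assumed structures (a coefficient vector and a matrix of normalised kernel inner products, here called `overlap`); it is not the paper's exact FoCal formulation.

```python
import numpy as np

def focal_correction(coeffs, overlap):
    """Greedy lateral-inhibition correction of over-sampled filter outputs.

    coeffs  : 1-D array of raw filter responses, one per model neuron.
    overlap : square matrix; overlap[i, j] is the normalised inner product
              of the kernels of filters i and j (1.0 on the diagonal).
              Both structures are assumptions of this sketch.

    Returns (index, corrected value) pairs in selection order, i.e. the
    rank order in which the model neurons would fire.
    """
    c = np.asarray(coeffs, dtype=float).copy()
    remaining = set(range(len(c)))
    selected = []
    while remaining:
        i = max(remaining, key=lambda k: abs(c[k]))  # most salient response
        selected.append((i, c[i]))
        remaining.remove(i)
        for j in remaining:
            # Lateral inhibition: remove the part of each remaining response
            # already accounted for by the overlapping selected filter.
            c[j] -= c[i] * overlap[i, j]
    return selected
```

    Selecting the largest response first and discounting its overlapping neighbours mirrors how lateral inhibition suppresses redundant responses to the same point in space.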

    Evaluating rank-order code performance using a biologically-derived retinal model

    No full text
    We propose a model of the primate retinal ganglion cell layout corresponding to the foveal pit, to test rank-order codes as a means of sensory information transmission in primate vision. We use the model for encoding images in rank order. We observe that the model is functional only when the lateral inhibition technique is used to remove redundancy from the sampled data. Further, more than 80% of the input information can be decoded by the time only 10% of the ganglion cells of our model have fired their first spikes.
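
    As an illustration of rank-order encoding and decoding, the sketch below transmits only the firing order and reconstructs with geometrically decaying weights. The modulation factor `mod=0.9` and the full-size per-neuron kernels are assumptions made for brevity, not the paper's parameters.

```python
import numpy as np

def rank_order_encode(responses):
    """Return neuron indices in firing order: strongest response fires first."""
    return np.argsort(-np.abs(responses))

def rank_order_decode(order, kernels, shape, mod=0.9, fraction=1.0):
    """Reconstruct an image from the earliest first-spikes only.

    order    : neuron indices in firing order.
    kernels  : kernels[i] is neuron i's receptive field, stored here as a
               full-size array for simplicity (an assumption of this sketch).
    mod      : modulation factor; each later spike contributes mod times
               less, the usual rank-order decoding assumption.
    fraction : fraction of the neurons allowed to fire before decoding stops.
    """
    image = np.zeros(shape)
    weight = 1.0
    n_spikes = max(1, int(len(order) * fraction))
    for i in order[:n_spikes]:
        image += weight * kernels[i]  # only the rank carries information
        weight *= mod
    return image
```

    Calling `rank_order_decode(order, kernels, shape, fraction=0.1)` reconstructs from only the first 10% of first spikes, the regime in which the abstract reports recovering more than 80% of the input information.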

    Simulating synaptic rewiring on SpiNNaker

    No full text
    Structural synaptic plasticity is an omnipresent mechanism in mammalian brains, involved in learning, memory, and recovery from lesions. Structural plasticity is also a useful computational tool, used to generate connectivity automatically from experimental activity data, to explore network states for Bayesian inference, and to assist more widespread synaptic plasticity rules in achieving better performance.

    The structural organisation of cortical areas is not random; topographic maps are commonplace in sensory processing centres. Topographic organisation enables optimal wiring between neurons, supports multimodal sensory integration, and performs input dimensionality reduction. We have designed an efficient framework for simulating models of structural plasticity on the SpiNNaker neuromorphic system in real time, in conjunction with synaptic plasticity rules such as spike-timing-dependent plasticity (STDP). Using our framework, we implement a model of generic topographic map formation that employs both activity-dependent and activity-independent processes. In agreement with the work of Bamford et al. (2010), we show that structural plasticity in the form of synaptic rewiring refines an initially rough topographic map and embeds input preferences into the network connectivity. It can also generate topographic maps between layers of neurons with minimal initial connectivity, and stabilise projections that would otherwise be unstable.

    Finally, we show that supervised MNIST handwritten digit classification can be performed in the absence of synaptic plasticity rules (i.e. rules that change the weights or efficacies of connections). This is not a state-of-the-art MNIST classification network (it achieves a modest accuracy of 78% and an RMSE of 2.01 with non-filtered inputs; performance drops with filtered inputs to an accuracy of 71% and an RMSE of 2.38), as each input digit class is represented only by its class average, but it serves here to demonstrate that synaptic rewiring can enable a network to learn, unsupervised, the statistics of its inputs. Moreover, with the current network and input configuration, the quality of the classification depends critically on the sampling mechanism employed in the formation of new synapses. Random rewiring, as opposed to preferentially forming connections to neurons that have spiked recently, could achieve accurate classification only when operating in conjunction with STDP or some other mechanism that prevents the pruning of useful synapses.
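
    A minimal sketch of one rewiring step follows, assuming a dense weight matrix in which zero means "no synapse". The probabilities, the pruning threshold, and the activity-dependent bias toward recently spiking afferents are illustrative assumptions, not the SpiNNaker framework's actual parameters or API.

```python
import numpy as np

rng = np.random.default_rng(0)

def rewiring_step(weights, recent_pre_spikes, w_min=0.05,
                  p_form=0.1, p_prune=0.1, w_init=0.5):
    """One illustrative structural-plasticity step.

    weights           : (n_pre, n_post) array; 0.0 means 'no synapse'.
    recent_pre_spikes : boolean vector marking presynaptic neurons that
                        spiked recently (the activity-dependent bias).
    All parameter values here are assumptions of this sketch.
    """
    n_pre, n_post = weights.shape
    for post in range(n_post):
        # Formation: preferentially connect to recently active afferents,
        # the sampling mechanism the abstract contrasts with random rewiring.
        if rng.random() < p_form:
            candidates = np.flatnonzero(recent_pre_spikes & (weights[:, post] == 0))
            if candidates.size == 0:  # fall back to random sampling
                candidates = np.flatnonzero(weights[:, post] == 0)
            if candidates.size:
                weights[rng.choice(candidates), post] = w_init
        # Pruning: remove synapses whose efficacy has decayed below w_min,
        # e.g. because STDP has depressed them.
        if rng.random() < p_prune:
            weak = np.flatnonzero((weights[:, post] > 0) & (weights[:, post] < w_min))
            if weak.size:
                weights[rng.choice(weak), post] = 0.0
    return weights
```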

    Fig. 1

    No full text
    Postsynaptic latency, relative to the pattern start, as a function of the number of discharges. When the neuron discharges outside of the pattern, a latency of 0 is shown. The STDP rule clearly learns the pattern and exhibits similar periods to those observed by Masquelier et al.: 1) when the neuron is nonselective to the pattern and most synaptic weights are being depressed; 2) when the neuron is training to the beginning of the pattern; and 3) when the neuron consistently fires within the pattern. This figure is part of the paper entitled "STDP Pattern Onset Learning Depends on Background Activity"; see link.

    Fig. 6a

    No full text
    When additional noise is added, the response latency distributions are affected (N=100). 6a shows response latency distributions for varying amounts of background noise (from sigma=0.0 to sigma=0.9) for a pattern of length 50 ms. This figure is part of the paper entitled "STDP Pattern Onset Learning Depends on Background Activity"; see link.

    Fig. 5a

    No full text
    Two typical results when Gaussian noise is added to the membrane potential. 5a shows one example. This figure is part of the paper entitled "STDP Pattern Onset Learning Depends on Background Activity"; see link.

    Fig. 3a

    No full text
    Results of a long simulation (3000 s) with an additional injection of constant electrical current at 2000 s. 3a shows the weight distributions for afferents in the pattern. This figure is part of the paper entitled "STDP Pattern Onset Learning Depends on Background Activity"; see link.

    The application of spike-timing-dependent plasticity to competitive pattern learning

    No full text
    Two neural architectures are presented: 1) a competitive neural network that learns repeated spatio-temporal patterns, and 2) a simple (early) visual model with orientation-specific receptive fields (a simple layer). Both networks use STDP-based learning.
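
    For the orientation-specific receptive fields of the simple layer, a Gabor-style kernel is a common choice; the parameterisation below is an assumption for illustration, not necessarily the one used in the model.

```python
import numpy as np

def gabor_kernel(size=11, theta=0.0, wavelength=5.0, sigma=2.5, phase=0.0):
    """An orientation-selective receptive field of the kind commonly used
    for a 'simple layer'. All parameter values are illustrative."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate coordinates into the preferred orientation theta (radians).
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + yr**2) / (2 * sigma**2))  # Gaussian envelope
    carrier = np.cos(2 * np.pi * xr / wavelength + phase)  # oriented grating
    return envelope * carrier
```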

    STDP Pattern Onset Learning Depends on Background Activity

    No full text
    Spike-timing-dependent plasticity is a learning mechanism used extensively in neural modelling. The learning rule has been shown to allow a neuron to find the beginning of a repeated spatio-temporal pattern among its afferents. In this study we show that such learning depends on background activity and is unstable in noisy conditions. We also present insights into the neuron's encoding.
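
    The underlying rule can be sketched as a standard pair-based STDP update; the time constants and learning rates below are illustrative defaults, not the values used in the study.

```python
import numpy as np

def stdp_update(w, dt, a_plus=0.01, a_minus=0.012,
                tau_plus=20.0, tau_minus=20.0, w_max=1.0):
    """Pair-based STDP weight update.

    dt = t_post - t_pre (ms). Pre-before-post (dt > 0) potentiates;
    post-before-pre (dt <= 0) depresses. Parameter values are assumptions.
    """
    if dt > 0:
        w += a_plus * np.exp(-dt / tau_plus)   # causal pairing: potentiate
    else:
        w -= a_minus * np.exp(dt / tau_minus)  # acausal pairing: depress
    return float(np.clip(w, 0.0, w_max))
```

    Under repeated presentations of a pattern, updates of this form concentrate weight on afferents that fire just before the postsynaptic spike, which is what drives the neuron's response toward the pattern onset.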

    Fig. 2c

    No full text
    Results of a longer simulation (3000 s), including synaptic weight distributions and trajectories. 2c shows the synaptic weight trajectories of 50 afferents in the pattern. This figure is part of the paper entitled "STDP Pattern Onset Learning Depends on Background Activity"; see link.