
    Competing synapses with two timescales: a basis for learning and forgetting

    Competitive dynamics are thought to occur in many processes of learning involving synaptic plasticity. Here we show, in a game-theory-inspired model of synaptic interactions, that the competition between synapses in their weak and strong states gives rise to a natural framework for learning, with the prediction of memory inherent in a timescale for 'forgetting' a learned signal. Among our main results is the prediction that memory is optimized if the weak synapses are really weak and the strong synapses are really strong. Our work admits of many extensions and possible experiments to test its validity, and in particular might complement an existing model of reaching, which has strong experimental support. Comment: 7 pages, 3 figures, to appear in Europhysics Letters
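
    A minimal numerical sketch of the two-timescale picture described above (an illustrative toy, not the authors' game-theoretic model): a pattern is written quickly into binary weak/strong synapses and then degrades through slow random switching. The parameters n_syn, p_forget and n_steps are assumptions chosen for illustration.

        import numpy as np

        rng = np.random.default_rng(0)

        n_syn = 2000        # number of binary weak/strong synapses (assumption)
        p_forget = 0.005    # per-step switching probability: the slow 'forgetting' timescale
        n_steps = 600

        pattern = rng.integers(0, 2, n_syn).astype(bool)   # which synapses should end up strong
        state = pattern.copy()                             # fast learning step: pattern stored

        overlap = [1.0]
        for _ in range(n_steps):
            flips = rng.random(n_syn) < p_forget           # slow, random forgetting
            state = np.where(flips, ~state, state)
            overlap.append(float(np.mean(state == pattern)))

        # The overlap relaxes from 1 toward the chance level 0.5 on a timescale of roughly
        # 1 / (2 * p_forget) steps, i.e. the memory is 'forgotten' on the slow timescale.
        print(f"overlap just after learning: {overlap[0]:.2f}; after {n_steps} steps: {overlap[-1]:.2f}")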

    Adherent carbon film deposition by cathodic arc with implantation

    A method of improving the adhesion of carbon thin films deposited using a cathodic vacuum arc by means of implantation at energies up to 20 keV is described. A detailed analysis of carbon films deposited onto silicon in this way, using the complementary techniques of transmission electron microscopy and X-ray photoelectron spectroscopy (XPS), is presented. This analysis shows that an amorphous mixing layer consisting of carbon and silicon is formed between the grown pure carbon film and the crystalline silicon substrate. In the mixing layer, some chemical bonding is shown to occur between carbon and silicon. Damage to the underlying crystalline silicon substrate is observed and is believed to be caused by interstitial implanted carbon atoms, which XPS shows are not bonded to the silicon. The effectiveness of this technique is confirmed by scratch testing and by scanning electron microscopy, which shows that failure of the silicon substrate occurs before delamination of the carbon film.

    Noise Induced Coherence in Neural Networks

    We investigate numerically the dynamics of large networks of $N$ globally pulse-coupled integrate-and-fire neurons in a noise-induced synchronized state. The power spectrum of an individual element within the network is shown to exhibit, in the thermodynamic limit ($N \to \infty$), a broadband peak and an additional delta-function peak that is absent from the power spectrum of an isolated element. The power spectrum of the mean output signal exhibits only the delta-function peak. These results are explained analytically in an exactly soluble oscillator model with global phase coupling. Comment: 4 pages RevTeX and 3 postscript figures
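
    A rough simulation sketch of the kind of setup described above: $N$ leaky integrate-and-fire neurons with all-to-all pulse coupling and independent noise, comparing the power spectrum of a single neuron's spike train with that of the population mean. The Euler discretisation and all parameter values (N, tau, eps, I0, sigma) are assumptions for illustration, not taken from the paper.

        import numpy as np

        rng = np.random.default_rng(1)

        # Network and neuron parameters (illustrative assumptions, not from the paper)
        N, dt, T = 200, 0.1, 2000.0          # neurons, time step (ms), duration (ms)
        tau, v_th, v_reset = 10.0, 1.0, 0.0  # membrane time constant, threshold, reset
        I0, sigma = 0.85, 0.15               # subthreshold drive and noise amplitude
        eps = 0.4                            # total coupling strength, shared over the network

        steps = int(T / dt)
        v = rng.random(N) * v_th
        spikes = np.zeros((steps, N))

        for step in range(steps):
            noise = sigma * np.sqrt(dt) * rng.standard_normal(N)
            v += dt * (I0 - v) / tau + noise
            fired = v >= v_th
            spikes[step] = fired
            v[fired] = v_reset
            v += eps * fired.sum() / N       # global pulse coupling: each spike kicks everyone

        def power_spectrum(x):
            """Periodogram of a signal sampled every dt milliseconds."""
            x = x - x.mean()
            freqs = np.fft.rfftfreq(len(x), d=dt * 1e-3)     # Hz
            power = np.abs(np.fft.rfft(x)) ** 2 / len(x)
            return freqs, power

        freqs, p_single = power_spectrum(spikes[:, 0])       # spectrum of one neuron
        _, p_mean = power_spectrum(spikes.mean(axis=1))      # spectrum of the population mean

        # In a synchronized regime the mean signal concentrates its power near the
        # network frequency (the delta-like peak discussed in the abstract).
        k = np.argmax(p_mean[1:]) + 1                        # skip the zero-frequency bin
        print(f"dominant frequency of the population-mean signal: {freqs[k]:.1f} Hz")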

    Supervised Learning in Multilayer Spiking Neural Networks

    The current article introduces a supervised learning algorithm for multilayer spiking neural networks. The algorithm presented here overcomes some limitations of existing learning algorithms: it can be applied to neurons firing multiple spikes, and it can in principle be applied to any linearisable neuron model. The algorithm is applied successfully to various benchmarks, such as the XOR problem and the Iris data set, as well as to complex classification problems. The simulations also show the flexibility of this supervised learning algorithm, which permits different encodings of the spike timing patterns, including precise spike-train encoding. Comment: 38 pages, 4 figures
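
    The paper's learning rule is not reproduced here, but the point about spike-timing encodings can be made concrete with a small sketch: a latency code mapping the Boolean XOR benchmark onto input spike times and desired output spike times. The specific times (EARLY_MS, LATE_MS, TARGET_TRUE, TARGET_FALSE) are illustrative assumptions.

        # Latency (time-to-first-spike) encoding of the XOR benchmark: logical 1 is an
        # early spike, logical 0 a late spike; the target is a desired output spike time.
        # Illustrative encoding only -- the paper's algorithm itself is not reproduced.
        EARLY_MS, LATE_MS = 0.0, 6.0            # input spike times for 1 / 0 (assumption)
        TARGET_TRUE, TARGET_FALSE = 10.0, 16.0  # desired output spike times (assumption)

        def encode_xor():
            """Return (input spike times, desired output spike time) for each XOR case."""
            dataset = []
            for a in (0, 1):
                for b in (0, 1):
                    inputs = (EARLY_MS if a else LATE_MS, EARLY_MS if b else LATE_MS)
                    target = TARGET_TRUE if (a ^ b) else TARGET_FALSE
                    dataset.append((inputs, target))
            return dataset

        for (t_a, t_b), t_out in encode_xor():
            print(f"input spikes at {t_a:4.1f} ms and {t_b:4.1f} ms -> "
                  f"desired output spike at {t_out:4.1f} ms")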

    Comparison of Langevin and Markov channel noise models for neuronal signal generation

    The stochastic opening and closing of voltage-gated ion channels produces noise in neurons. The effect of this noise on neuronal performance has been modelled using either an approximate Langevin model, based on stochastic differential equations, or an exact model, based on a Markov process description of channel gating. Yet whether the Langevin model accurately reproduces the channel noise produced by the Markov model remains unclear. Here we present a comparison between Langevin and Markov models of channel noise in neurons, using single-compartment Hodgkin-Huxley models containing either Na$^{+}$ and K$^{+}$, or only K$^{+}$, voltage-gated ion channels. The performance of the Langevin and Markov models was quantified over a range of stimulus statistics, membrane areas and channel numbers. We find that, in comparison to the Markov model, the Langevin model underestimates the noise contributed by voltage-gated ion channels, overestimating information rates for both spiking and non-spiking membranes. Even with increasing numbers of channels the difference between the two models persists. This suggests that the Langevin model may not be suitable for accurately simulating channel noise in neurons, even in simulations with large numbers of ion channels.
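
    A stripped-down comparison in the spirit of the abstract: a population of identical two-state channels at a clamped voltage, simulated once as an explicit Markov (binomial transition) process and once with a Langevin diffusion approximation for the open fraction. The rates alpha and beta, the channel count n_ch and the two-state gate are simplifying assumptions; the full Hodgkin-Huxley setting of the paper is not reproduced, and at this channel count the two descriptions are expected to agree closely.

        import numpy as np

        rng = np.random.default_rng(2)

        # Two-state channel at a clamped voltage (assumed rates, per millisecond)
        alpha, beta = 0.5, 0.3        # closed -> open and open -> closed rates
        n_ch = 500                    # number of channels (assumption)
        dt, steps = 0.01, 100_000     # time step (ms) and number of steps

        p_inf = alpha / (alpha + beta)           # steady-state open probability

        # --- Markov description: count channels, binomial numbers of transitions per step ---
        n_open = int(p_inf * n_ch)
        markov = np.empty(steps)
        for s in range(steps):
            opening = rng.binomial(n_ch - n_open, 1.0 - np.exp(-alpha * dt))
            closing = rng.binomial(n_open, 1.0 - np.exp(-beta * dt))
            n_open += opening - closing
            markov[s] = n_open / n_ch

        # --- Langevin description: SDE for the open fraction with state-dependent noise ---
        x = p_inf
        langevin = np.empty(steps)
        for s in range(steps):
            drift = alpha * (1.0 - x) - beta * x
            diff = np.sqrt((alpha * (1.0 - x) + beta * x) / n_ch)
            x += drift * dt + diff * np.sqrt(dt) * rng.standard_normal()
            x = min(max(x, 0.0), 1.0)            # keep the open fraction in [0, 1]
            langevin[s] = x

        print(f"open-fraction std, Markov:   {markov.std():.4f}")
        print(f"open-fraction std, Langevin: {langevin.std():.4f}")
        print(f"binomial prediction:         {np.sqrt(p_inf * (1 - p_inf) / n_ch):.4f}")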

    Particle trajectories in linearized irrotational shallow water flows

    We investigate the particle trajectories in an irrotational shallow water flow over a flat bed as periodic waves propagate on the water's free surface. Within the linear water wave theory, we show that there are no closed orbits for the water particles beneath irrotational shallow water waves. Depending on the strength of the underlying uniform current, we find that some particle trajectories are undulating paths to the right or to the left, some are looping curves with a drift to the right, and others are parabolic curves or curves with only one loop.
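
    For orientation, the standard linearized setting behind such statements (textbook linear water-wave theory, not equations taken from the paper itself): beneath a small-amplitude periodic wave of amplitude $a$, wavenumber $k$ and frequency $\omega$ over a flat bed $z=-h$, and on top of a uniform current $U$, the particle paths $(x(t),z(t))$ solve

        \[
        \frac{dx}{dt} = U + a\omega\,\frac{\cosh k(z+h)}{\sinh kh}\,\cos(kx-\omega t),
        \qquad
        \frac{dz}{dt} = a\omega\,\frac{\sinh k(z+h)}{\sinh kh}\,\sin(kx-\omega t),
        \]

    with the dispersion relation $\omega^{2} = gk\tanh(kh)$. In the shallow-water regime $kh \ll 1$ the horizontal velocity is nearly depth-independent, and the balance between the wave-induced drift and the current $U$ determines whether a trajectory undulates, loops, or degenerates into a single loop or a parabola-like arc.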

    Self-organization without conservation: Are neuronal avalanches generically critical?

    Recent experiments on cortical neural networks have revealed the existence of well-defined avalanches of electrical activity. Such avalanches have been claimed to be generically scale-invariant -- i.e. power-law distributed -- with many exciting implications in Neuroscience. Recently, a self-organized model has been proposed by Levina, Herrmann and Geisel to justify such an empirical finding. Given that (i) neural dynamics is dissipative and (ii) there is a loading mechanism "charging" progressively the background synaptic strength, this model/dynamics is very similar in spirit to forest-fire and earthquake models, archetypical examples of non-conserving self-organization, which have recently been shown to lack true criticality. Here we show that cortical neural networks obeying (i) and (ii) are not generically critical; unless parameters are fine-tuned, their dynamics is either sub- or super-critical, even if the pseudo-critical region is relatively broad. This conclusion seems to be in agreement with the most recent experimental observations. The main implication of our work is that, if future experimental research on cortical networks were to support that truly critical avalanches are the norm and not the exception, then one should look for more elaborate (adaptive/evolutionary) explanations, beyond simple self-organization, to account for this. Comment: 28 pages, 11 figures, regular paper
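
    A toy sketch of the non-conserving, slowly loaded dynamics that the abstract compares to forest-fire and earthquake models: units accumulate load from a slow external drive, topple when they cross a threshold, pass on only a fraction (1 - dissipation) of their load to randomly chosen targets, and avalanche sizes are recorded. The annealed random topology and all parameter values are illustrative assumptions; this is not the Levina-Herrmann-Geisel model analysed in the paper.

        import numpy as np

        rng = np.random.default_rng(3)

        n = 1000             # number of units (assumption)
        k = 4                # targets receiving load from each toppling unit (assumption)
        threshold = 1.0
        dissipation = 0.05   # fraction of the load lost at every toppling (non-conserving)
        n_avalanches = 5_000

        u = rng.random(n) * threshold
        sizes = []

        for _ in range(n_avalanches):
            # Slow loading ("charging"): raise every unit until the most loaded one topples.
            i0 = int(np.argmax(u))
            u += threshold - u[i0]
            u[i0] = threshold
            active = [i0]
            size = 0
            while active:
                i = active.pop()
                if u[i] < threshold:
                    continue                     # this unit was already relaxed earlier
                size += 1
                share = (1.0 - dissipation) * u[i] / k
                u[i] = 0.0
                for j in rng.integers(0, n, k):  # annealed random topology (assumption)
                    u[j] += share
                    if u[j] >= threshold:
                        active.append(int(j))
            sizes.append(size)

        sizes = np.array(sizes)
        # Without fine-tuning of the dissipation, the size distribution is not expected
        # to be a pure power law; this sketch only reproduces the loading/toppling loop.
        print(f"mean avalanche size: {sizes.mean():.1f}, largest avalanche: {sizes.max()}")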

    Population coding by globally coupled phase oscillators

    A system of globally coupled phase oscillators subject to an external input is considered as a simple model of neural circuits coding an external stimulus. The information coding efficiency of the system in its asynchronous state is quantified using Fisher information. The effect of coupling and noise on the information coding efficiency in the stationary state is analyzed. The relaxation process of the system after the presentation of an external input is also studied. It is found that the information coding efficiency exhibits a large transient increase before the system relaxes to the final stationary state. Comment: 7 pages, 9 figures, revised version, new figures added, to appear in JPSJ Vol 75, No.
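
    As a point of reference (a generic form with assumed notation, not necessarily the exact equations of the paper), a population of $N$ globally coupled noisy phase oscillators driven by a stimulus $s$ can be written as

        \[
        \dot\theta_i = \omega + s\,Z(\theta_i) + \frac{K}{N}\sum_{j=1}^{N}\Gamma(\theta_j-\theta_i) + \xi_i(t),
        \qquad
        \langle \xi_i(t)\,\xi_j(t') \rangle = 2D\,\delta_{ij}\,\delta(t-t'),
        \]

    where $Z$ is a phase sensitivity function, $\Gamma$ a coupling function, $K$ the coupling strength and $D$ the noise intensity. The coding efficiency of the population response $r$ is then quantified by the Fisher information

        \[
        J(s) = \left\langle \left( \frac{\partial}{\partial s} \ln p(r \mid s) \right)^{2} \right\rangle_{p(r\mid s)},
        \]

    whose inverse bounds, via the Cramer-Rao inequality, the variance of any unbiased estimate of $s$.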

    Short-term plasticity orchestrates the response of pyramidal cells and interneurons to population bursts

    The synaptic drive from neuronal populations varies considerably over short time scales. Such changes in the presynaptic rate trigger many temporal processes absent under steady-state conditions. This paper examines the differential impact of pyramidal cell population bursts on postsynaptic pyramidal cells receiving depressing synapses, and on a class of interneuron that receives facilitating synapses. In experiments, a significant shift of the order of one hundred milliseconds is seen between the responses of these two cell classes to the same population burst. It is demonstrated here that such a temporal differentiation of the response can be explained by the synaptic and membrane properties, without recourse to elaborate cortical wiring schemes.
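
    A rate-based sketch in the spirit of this result: a Tsodyks-Markram-style depressing synapse and a facilitating synapse are driven by the same smooth population burst, and the facilitating drive peaks later than the depressing one. The time constants, release parameters and burst shape are illustrative assumptions, not the values used in the paper.

        import numpy as np

        dt = 0.5                                   # ms
        t = np.arange(0.0, 2000.0, dt)

        # Presynaptic population rate (spikes/ms): a smooth burst riding on a low baseline.
        rate = (1.0 + 50.0 * np.exp(-((t - 800.0) ** 2) / (2.0 * 150.0 ** 2))) / 1000.0

        # Tsodyks-Markram-style parameters (illustrative values, not from the paper)
        U_dep, tau_rec_dep = 0.5, 800.0                      # depressing synapse
        U_fac, tau_rec_fac, tau_facil = 0.03, 130.0, 530.0   # facilitating synapse

        x_d = 1.0                    # available resources, depressing synapse
        x_f, u_f = 1.0, U_fac        # resources and release probability, facilitating synapse
        drive_d = np.empty_like(t)
        drive_f = np.empty_like(t)

        for i, r in enumerate(rate):
            # Depression only: resources are used at rate U*x*r and recover with tau_rec.
            x_d += dt * ((1.0 - x_d) / tau_rec_dep - U_dep * x_d * r)
            drive_d[i] = U_dep * x_d * r

            # Facilitation: release probability u grows with activity, resources deplete as u*x*r.
            u_f += dt * ((U_fac - u_f) / tau_facil + U_fac * (1.0 - u_f) * r)
            x_f += dt * ((1.0 - x_f) / tau_rec_fac - u_f * x_f * r)
            drive_f[i] = u_f * x_f * r

        t_dep, t_fac = t[np.argmax(drive_d)], t[np.argmax(drive_f)]
        print(f"rate peaks at 800 ms; depressing drive peaks at {t_dep:.0f} ms, "
              f"facilitating drive peaks at {t_fac:.0f} ms (shift ~ {t_fac - t_dep:.0f} ms)")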