Irregularity in the cortical spike code: noise or information?
How random is the discharge pattern of cortical neurons? We examined recordings
from primary visual cortex (V1) and extrastriate cortex (MT) of awake,
behaving macaque monkeys, and compared them to analytical predictions. We
measured two indices of firing variability: the ratio of the variance to the
mean for the number of action potentials evoked by a constant stimulus, and
the rate-normalized Coefficient of Variation (C_v) of the interspike interval distribution.
Firing in virtually all V1 and MT neurons was nearly consistent
with a completely random process (i.e., C_v ≈ 1).
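To make the two variability indices concrete, here is a minimal sketch that computes both for a homogeneous Poisson process, the "completely random" benchmark; the rate, duration, and trial count are illustrative values, not those of the recordings:

```python
import numpy as np

rng = np.random.default_rng(0)

def fano_factor(counts):
    """Variance-to-mean ratio of spike counts across repeated trials."""
    counts = np.asarray(counts, dtype=float)
    return counts.var() / counts.mean()

def cv_isi(spike_times):
    """Coefficient of variation of the interspike-interval distribution."""
    isis = np.diff(np.sort(spike_times))
    return isis.std() / isis.mean()

# A homogeneous Poisson process has Fano factor ~ 1 and C_v ~ 1,
# the benchmark for "completely random" firing.
rate, duration, trials = 50.0, 2.0, 200          # Hz, s, repeats
counts = rng.poisson(rate * duration, size=trials)
isis = rng.exponential(1.0 / rate, size=10_000)  # Poisson ISIs are exponential
spikes = np.cumsum(isis)

print(fano_factor(counts))   # close to 1
print(cv_isi(spikes))        # close to 1
```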
We tried to model this high variability with small, independent, and random EPSPs
converging onto a leaky integrate-and-fire neuron (Knight, 1972). Both
this and related models predicted very low firing variability (C_v ≪ 1) for realistic
EPSP depolarizations and membrane time constants. We also simulated
a biophysically very detailed compartmental model of an anatomically reconstructed
and physiologically characterized layer V cat pyramidal cell with passive
dendrites and active soma. If independent, excitatory synaptic input fired
the model cell at the high rates observed in monkey, the C_v and the variability
in the number of spikes were both very low, in agreement with the integrate-and-
fire models but in strong disagreement with the majority of our monkey
data. The simulated cell only produced highly variable firing when Hodgkin-Huxley-
like currents (I_(Na) and very strong I_(DR)) were placed on the distal basal
dendrites. Now the simulated neuron acted more as a millisecond-resolution
detector of dendritic spike coincidences than as a temporal integrator, thereby
increasing its bandwidth by an order of magnitude above traditional estimates.
This hypothetical submillisecond coincidence detection mainly uses the cell's
capacitive localization of very transient signals in thin dendrites. For millisecond-level
events, different dendrites in the cell are electrically isolated from one
another by dendritic capacitance, so that the cell can contain many independent
computational units. This de-coupling occurs because charge takes time
to equilibrate inside the cell, and can occur even in the presence of long
membrane time constants.
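The integrate-and-fire prediction can be sketched numerically. Ignoring the leak for simplicity (a hypothetical simplification; the leaky case gives the same qualitatively low variability at high rates), an integrator that must sum n_th small independent EPSPs per output spike produces gamma-distributed intervals with C_v ≈ 1/√n_th; all parameter values below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters: each EPSP depolarizes the cell by 1/n_th of
# threshold, so n_th pulses must be integrated per output spike.
n_th = 100              # EPSPs needed to reach threshold
input_rate = 5000.0     # Hz, pooled rate of many independent inputs
n_out = 2000            # number of output spikes to simulate

# Pooled independent Poisson inputs give exponential inter-EPSP intervals.
# A (non-leaky) integrator fires after summing n_th of them, so each
# output ISI is a sum of n_th exponentials, i.e. a gamma interval.
epsp_intervals = rng.exponential(1.0 / input_rate, size=(n_out, n_th))
output_isis = epsp_intervals.sum(axis=1)

cv = output_isis.std() / output_isis.mean()
print(cv)               # close to 1/sqrt(n_th) = 0.1, far below 1
```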
Simple approximations using cellular parameters (e.g., R_m, C_m, R_i, G_(Na), etc.)
can predict many effects of dendritic spiking, as confirmed by detailed compartmental
simulations of the reconstructed pyramidal cell. Such expressions allow
the extension of simulated results to untested parameter regimes. Coincidence-detection
can occur by two methods: (1) fast charge-equilibration inside dendritic
branches creates submillisecond EPSPs in those dendrites, so that individual
branches can spike in response to coincidences among those fast EPSPs;
(2) strong delayed-rectifier currents in dendrites allow the soma to fire only
upon the submillisecond coincidence of two or more dendritic spikes. Such fast
EPSPs and dendritic spikes produce somatic voltages consistent with intracellular
observations. A simple measure of coincidence-detection "effectiveness"
shows that cells containing these hypothetical dendritic spikes are far more
sensitive to coincident EPSPs than to temporally separated ones, and suggests
a conceptual mechanism for fast, parallel, nonlinear computations inside single
cells.
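A hypothetical coincidence detector of this kind can be sketched as a unit that fires only when at least k input events fall within a submillisecond window; the window width and event times below are illustrative, not fitted to the model cell:

```python
import numpy as np

def coincidence_output(spike_times, window=0.5e-3, k=2):
    """Fire whenever at least k input events fall within `window` seconds."""
    t = np.sort(np.asarray(spike_times))
    out = []
    for i in range(len(t) - k + 1):
        # A group of k consecutive events spans at most `window`: fire.
        if t[i + k - 1] - t[i] <= window:
            out.append(t[i + k - 1])
    return np.array(out)

# Two dendritic spikes 0.2 ms apart trigger an output spike;
# the same two events separated by 5 ms do not.
print(len(coincidence_output([0.0100, 0.0102])))  # 1
print(len(coincidence_output([0.0100, 0.0150])))  # 0
```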
If a simplified model neuron acts as a coincidence-detector of single pulses, networks
of such neurons can solve a simple but important perceptual problem, the
"binding problem", more easily and flexibly than traditional neurons can.
In a simple toy model, different classes of coincidence-detecting neurons respond
to different aspects of simple visual stimuli, for example shape and
motion. The task of the population of neurons is to respond to multiple simultaneous
stimuli while still identifying those neurons which respond to a particular
stimulus. Because a coincidence-detecting neuron's output spike train
retains some very precise information about the timing of its input spikes, all
neurons which respond to the same stimulus will produce output spikes with an
above-random chance of coincidence, and hence will be easily distinguished
from neurons responding to a different stimulus. This scheme uses the traditional
average-rate code to represent each stimulus separately, while using
precise single-spike times to multiplex information about the relation of different
aspects of the stimuli to each other. In this manner the model's highly
irregular spiking actually reflects information rather than noise.
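The multiplexing idea can be illustrated with a toy computation (the event counts and jitters are invented for this sketch): neurons locked to the same stimulus inherit its precise event times and therefore show far more spike coincidences than chance:

```python
import numpy as np

rng = np.random.default_rng(2)

def coincidences(a, b, window=1e-3):
    """Count spikes in `a` that have a partner in `b` within `window` s."""
    return sum(int(np.any(np.abs(b - t) <= window)) for t in a)

# Hypothetical stimulus-locked event times: neurons driven by the same
# stimulus inherit them (with small jitter); a neuron driven by a
# different stimulus fires at unrelated times.
events = np.sort(rng.uniform(0.0, 1.0, 40))
same_a = events + rng.normal(0.0, 2e-4, events.size)
same_b = events + rng.normal(0.0, 2e-4, events.size)
other = np.sort(rng.uniform(0.0, 1.0, 40))

print(coincidences(same_a, same_b))  # close to 40
print(coincidences(same_a, other))   # chance level, only a few
```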
Stochastic resonance and finite resolution in a network of leaky integrate-and-fire neurons.
This thesis is a study of stochastic resonance (SR) in a discrete implementation of a leaky integrate-and-fire (LIF) neuron network. The aim was to determine if SR can be realised in limited precision discrete systems implemented on digital hardware.
How neuronal modelling connects with SR is discussed. Analysis techniques for noisy spike trains are described, ranging from rate coding and statistical measures to signal-processing measures such as the power spectrum and signal-to-noise ratio (SNR). The main problem in computing spike-train power spectra is obtaining equi-spaced sample amplitudes, given the short duration of spikes relative to their frequency. Three different methods of computing the SNR of a spike train from its power spectrum are described. The difficulty here is separating the power at the frequencies of interest from the noise power, since the spike train encodes both the noise and the signal of interest.
Two models of the LIF neuron were developed, one continuous and one discrete, and the results compared. The discrete model allowed variation of the precision of the simulation values allowing investigation of the effect of precision limitation on SR. The main difference between the two models lies in the evolution of the membrane potential. When both models are allowed to decay from a high start value in the absence of input, the discrete model does not completely discharge while the continuous model discharges to almost zero.
The results of simulating the discrete model on an FPGA and the continuous model on a PC showed that SR can be realised in discrete low-resolution digital systems. SR was found to be sensitive to the precision of the values in the simulations. For a single neuron, we find that SR strengthens as resolution increases from 10 bits to 12 bits, after which it saturates. For a feed-forward network with multiple input neurons and one output neuron, SR is stronger with more than 6 input neurons and saturates at a higher resolution. We conclude that stochastic resonance can manifest in discrete systems, though to a lesser extent than in continuous systems.
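The basic effect the thesis builds on can be reproduced in a few lines: a subthreshold periodic drive evokes no spikes from a noiseless LIF neuron, while added noise lets the signal cross threshold. All parameter values here are illustrative, not those of the thesis models:

```python
import numpy as np

rng = np.random.default_rng(3)

def lif_spikes(noise_sd, dt=1e-4, t_end=5.0, tau=0.02,
               v_th=1.0, i_dc=0.9, i_ac=0.05, f=5.0):
    """Euler-integrated LIF driven by a subthreshold sinusoid plus noise.

    Returns the number of output spikes. Because i_dc + i_ac < v_th, the
    deterministic model never fires; noise can push it over threshold.
    """
    n = int(t_end / dt)
    t = np.arange(n) * dt
    drive = i_dc + i_ac * np.sin(2 * np.pi * f * t)
    v, spikes = 0.0, 0
    for k in range(n):
        noise = noise_sd * np.sqrt(dt) * rng.standard_normal()
        v += dt * (-v + drive[k]) / tau + noise
        if v >= v_th:          # threshold crossing: emit spike, reset
            v, spikes = 0.0, spikes + 1
    return spikes

print(lif_spikes(0.0))   # 0: the signal alone is subthreshold
print(lif_spikes(0.5))   # > 0: noise lifts the signal over threshold
```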
Dynamics and precursor signs for phase transitions in neural systems
This thesis investigates neural state transitions associated with sleep, seizure and anaesthesia. The aim is to address the question: How does a brain traverse the critical threshold between distinct cortical states, both healthy and pathological? Specifically we are interested in sub-threshold neural behaviour immediately prior to state transition. We use theoretical neural modelling (single spiking neurons, a network of these, and a mean-field continuum limit) and in vitro experiments to address this question.
Dynamically realistic equations of motion for thalamic relay, reticular nucleus, cortical pyramidal and cortical interneuron populations in different vigilance states are developed, based on the Izhikevich spiking neuron model. A network of cortical neurons is assembled to examine the behaviour of the gamma-producing cortical network and its transition to lower frequencies under the effect of anaesthesia. Then a three-neuron model of the thalamocortical loop for sleep spindles is presented. Numerical simulations of these networks confirm spiking consistent with reported in vivo measurements, and provide supporting evidence for precursor indicators of imminent phase transition at the occurrence of individual spindles.
To complement the spiking neuron networks, we study the Wilson–Cowan neural mass equations describing homogeneous cortical columns and a 1D spatial cluster of such columns. The abstract representation of cortical tissue by a pair of coupled integro-differential equations permits thorough linear stability, phase plane and bifurcation analyses. This model shows a rich set of spatial and temporal bifurcations marking the boundary to state transitions: saddle-node, Hopf, Turing, and mixed Hopf–Turing. Close to state transition, white-noise-induced subthreshold fluctuations show clear signs of critical slowing down with prolongation and strengthening of autocorrelations, both in time and space, irrespective of bifurcation type.
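As a rough sketch of the kind of model analysed here, the two-population Wilson–Cowan equations can be integrated directly; the coupling weights, sigmoid parameters, and drives below are illustrative choices, not the thesis's fitted values:

```python
import numpy as np

def sigmoid(x, a=1.2, theta=2.8):
    """Sigmoidal population firing-rate function."""
    return 1.0 / (1.0 + np.exp(-a * (x - theta)))

def wilson_cowan(E0=0.1, I0=0.1, P=1.25, Q=0.0, dt=1e-3, t_end=100.0,
                 wEE=12.0, wEI=4.0, wIE=13.0, wII=11.0, tauE=1.0, tauI=1.0):
    """Euler integration of the two-population Wilson-Cowan equations.

    E and I are mean activity fractions of the excitatory and inhibitory
    populations; depending on parameters the system settles to a fixed
    point or a limit cycle.
    """
    E, I = E0, I0
    traj = []
    for _ in range(int(t_end / dt)):
        E += dt / tauE * (-E + sigmoid(wEE * E - wEI * I + P))
        I += dt / tauI * (-I + sigmoid(wIE * E - wII * I + Q))
        traj.append((E, I))
    return np.array(traj)

traj = wilson_cowan()
print(traj[-1])                  # late-time (E, I) state
print(traj.min(), traj.max())    # activities stay inside [0, 1]
```

Because the model is only a pair of ODEs, phase-plane and bifurcation analyses of exactly this system are tractable with standard tools.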
Attempts at in vitro capture of these predicted leading indicators form the last part of the thesis. We recorded local field potentials (LFPs) from cortical and hippocampal slices of mouse brain. State transition is marked by the emergence and cessation of spontaneous seizure-like events (SLEs) induced by bathing the slices in an artificial cerebral spinal fluid containing no magnesium ions. Phase-plane analysis of the LFP time-series suggests that distinct bifurcation classes can be responsible for state change to seizure. Increased variance and growth of spectral power at low frequencies (f < 15 Hz) was observed in LFP recordings prior to initiation of some SLEs. In addition we demonstrated prolongation of electrically evoked potentials in cortical tissue, while forwarding the slice to a seizing regime. The results offer the possibility of capturing leading temporal indicators prior to seizure generation, with potential consequences for understanding epileptogenesis.
Guided by dynamical systems theory, this thesis captures evidence for precursor signs of phase transitions in neural systems using mathematical and computer-based modelling as well as in vitro experiments.
Fractals in the Nervous System: Conceptual Implications for Theoretical Neuroscience
This essay is presented with two principal objectives in mind: first, to
document the prevalence of fractals at all levels of the nervous system, giving
credence to the notion of their functional relevance; and second, to draw
attention to the as yet still unresolved issues of the detailed relationships
among power law scaling, self-similarity, and self-organized criticality. As
regards criticality, I will document that it has become a pivotal reference
point in Neurodynamics. Furthermore, I will emphasize the not yet fully
appreciated significance of allometric control processes. For dynamic fractals,
I will assemble reasons for attributing to them the capacity to adapt task
execution to contextual changes across a range of scales. The final Section
consists of general reflections on the implications of the reviewed data, and
identifies what appear to be issues of fundamental importance for future
research in the rapidly evolving topic of this review.
Aspects of Signal Processing in Noisy Neurons
In recent years the insight has taken hold that statistical influences, often called noise, need not hinder the processing of signals but can actually support it. This effect has become known as stochastic resonance. It is natural to suppose that evolution has found ways to exploit this phenomenon to optimise information processing in the nervous system. Taking the spike-generating leaky integrate-and-fire neuron as its example, this dissertation investigates whether the encoding of periodic signals in neurons is improved by the noise that is present in the nervous system anyway. The investigation uses the methods of the theory of point processes. The distribution of the intervals between any two successive spikes emitted by the neuron is determined numerically from an integral-equation approach, and the temporal ordering of the spike trains relative to the periodic signal is described as a Markov chain. In addition, several approximate models of the interspike-interval distribution, which permit further analytical treatment, are presented and their reliability is assessed. As the central result it is shown that two kinds of noise-induced resonance occur in the model neuron: first, classical stochastic resonance, i.e. an optimal signal-to-noise ratio of the evoked spike train at a particular amplitude of the input noise; second, a resonance with respect to the frequency of the input signal or stimulus. Stimuli in a certain frequency range are encoded in spike trains that are clearly structured in time, whereas stimuli outside the preferred frequency band evoke temporally more homogeneous spike trains. For this twofold resonance the term stochastic double resonance is introduced. The effect is traced back to elementary mechanisms, and its dependence on the properties of the stimulus is investigated comprehensively.
It turns out that the neuron's stimulus response obeys simple scaling laws. In particular, the optimal scaled noise amplitude is a universal parameter of the model that appears to be independent of the stimulus. The optimal stimulus frequency, by contrast, depends linearly on the scaled stimulus amplitude, with the constant of proportionality determined by the DC component of the stimulus (the base current). While large base currents almost decouple frequency and amplitude, so that stimuli of arbitrary amplitude are encoded in temporally well-structured spike trains, small base currents make it possible to select the optimal frequency band by varying the stimulus amplitude.
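The degree of temporal structure in an evoked spike train can be quantified by the vector strength of the spike phases relative to the periodic stimulus; the spike trains below are synthetic illustrations, not outputs of the thesis's model:

```python
import numpy as np

def vector_strength(spike_times, freq):
    """Phase locking of spikes to a periodic stimulus of frequency `freq`.

    1 means every spike lands at the same stimulus phase; values near 0
    mean the phases are uniform (temporally homogeneous firing).
    """
    phases = 2 * np.pi * freq * np.asarray(spike_times)
    return np.abs(np.mean(np.exp(1j * phases)))

rng = np.random.default_rng(4)
f = 10.0                                                  # stimulus, Hz
locked = np.arange(100) / f + rng.normal(0, 2e-3, 100)    # phase-locked
homog = np.sort(rng.uniform(0.0, 10.0, 100))              # homogeneous

print(vector_strength(locked, f))  # close to 1
print(vector_strength(homog, f))   # close to 0
```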
Toward a further understanding of object feature binding: a cognitive neuroscience perspective.
The aim of this thesis is to lead to a further understanding of the neural mechanisms underlying object feature binding in the human brain. The focus is on information processing and integration in the visual system and visual short-term memory. From a review of the literature it is clear that there are three major competing binding theories; however, none of these individually solves the binding problem satisfactorily. Thus the aim of this research is to conduct behavioural experimentation into object feature binding, paying particular attention to visual short-term memory.
The behavioural experiment was designed and conducted using a within-subjects delayed-response task comprising a battery of sixty-four composite objects, each with three features and four dimensions, in each of three conditions (spatial, temporal and spatio-temporal). Findings from the experiment, which focus on spatial and temporal aspects of object feature binding and the effect of feature proximity on binding errors, support the spatial theories of object feature binding; in addition we propose that temporal theories and convergence, through hierarchical feature analysis, are also involved. Because spatial properties have a dedicated processing neural stream, while temporal properties rely on limited-capacity memory systems, memories for sequential information are likely to be more difficult to recall accurately. Our study supports other studies suggesting that both spatial and temporal coherence, to differing degrees, may be involved in
object feature binding. Traditionally, these theories have purported to provide individual solutions, but this thesis proposes a novel unified theory of object feature binding in which hierarchical feature analysis, spatial attention and temporal synchrony each plays a role. It is further proposed that binding takes place in visual short-term memory through concerted and integrated information
processing in distributed cortical areas. A cognitive model detailing this integrated proposal is given. Next, the cognitive model is used to inform the design and suggested implementation of a computational model which would be
able to test the theory put forward in this thesis. In order to verify the model, future work is needed to implement the computational model. Thus it is argued
that this doctoral thesis provides valuable experimental evidence concerning spatio-temporal aspects of the binding problem and as such is an additional building block in the quest for a solution to the object feature binding problem.