Mechanism, dynamics, and biological existence of multistability in a large class of bursting neurons
Multistability, the coexistence of multiple attractors in a dynamical system,
is explored in bursting nerve cells. A modeling study is performed to show that
a large class of bursting systems, as defined by a shared topology when
represented as dynamical systems, is inherently suited to support
multistability. We derive the bifurcation structure and parametric trends
leading to multistability in these systems. Evidence for the existence of
multirhythmic behavior in neurons of the aquatic mollusc Aplysia californica
that is consistent with our proposed mechanism is presented. Although these
experimental results are preliminary, they indicate that single neurons may be
capable of dynamically storing information for longer time scales than
typically attributed to nonsynaptic mechanisms. (24 pages, 8 figures)
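The idea of coexisting attractors is easiest to see in a toy gradient system, not the bursting models of the paper: the sketch below integrates dx/dt = x - x^3, whose two stable fixed points at x = ±1 coexist, so which attractor is reached depends only on the initial condition.

```python
def f(x):
    # Gradient system dx/dt = x - x**3: stable fixed points at x = -1 and
    # x = +1 coexist, separated by the unstable fixed point at x = 0.
    return x - x**3

def settle(x0, dt=0.01, steps=2000):
    # Forward-Euler integration until the trajectory has settled.
    x = x0
    for _ in range(steps):
        x += dt * f(x)
    return x

# Initial conditions on either side of the separatrix reach different
# attractors: the system "remembers" which basin it started in.
print(settle(0.3))   # close to +1
print(settle(-0.3))  # close to -1
```

This basin-dependence is the minimal form of the dynamical information storage the abstract attributes to multistable bursters.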
Modeling of Spiking-Bursting Neural Behavior Using Two-Dimensional Map
A simple model that replicates the dynamics of spiking and spiking-bursting
activity of real biological neurons is proposed. The model is a two-dimensional
map which contains one fast and one slow variable. The mechanisms behind
generation of spikes, bursts of spikes, and restructuring of the map behavior
are explained using phase portrait analysis. The dynamics of two coupled maps
which model the behavior of two electrically coupled neurons is discussed.
Synchronization regimes for spiking and bursting activity of these maps are
studied as a function of coupling strength. It is demonstrated that the results
of this model are in agreement with the synchronization of chaotic
spiking-bursting behavior experimentally found in real biological neurons. (9 pages, 12 figures)
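The fast-slow map described here is the Rulkov map; the sketch below iterates one common published form of it (the slow-variable update differs slightly between papers, and the parameter values are illustrative, chosen in the chaotic spiking-bursting regime alpha > 4).

```python
import numpy as np

def rulkov_step(x, y, alpha=4.5, mu=0.001, sigma=0.0):
    # Fast variable x (membrane-potential analogue), slow variable y.
    # One common published form of the two-dimensional map.
    x_new = alpha / (1.0 + x * x) + y
    y_new = y - mu * (x + 1.0) + mu * sigma
    return x_new, y_new

def run(n=20000, x=-1.0, y=-3.0):
    xs = np.empty(n)
    for i in range(n):
        x, y = rulkov_step(x, y)
        xs[i] = x
    return xs

xs = run()
# The slow drift of y switches the fast map between a quiescent fixed
# point (x around -2) and chaotic spiking (x excursions above 0),
# producing the spiking-bursting waveform described in the abstract.
```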
Shared inputs, entrainment, and desynchrony in elliptic bursters: from slow passage to discontinuous circle maps
What input signals will lead to synchrony vs. desynchrony in a group of
biological oscillators? This question connects with both classical dynamical
systems analyses of entrainment and phase locking and with emerging studies of
stimulation patterns for controlling neural network activity. Here, we focus on
the response of a population of uncoupled, elliptically bursting neurons to a
common pulsatile input. We extend a phase reduction from the literature to
capture inputs of varied strength, leading to a circle map with discontinuities
of various orders. In a combined analytical and numerical approach, we apply
our results to both a normal form model for elliptic bursting and to a
biophysically-based neuron model from the basal ganglia. We find that,
depending on the period and amplitude of inputs, the response can either appear
chaotic (with provably positive Lyapunov exponent for the associated circle
maps), or periodic with a broad range of phase-locked periods. Throughout, we
discuss the critical underlying mechanisms, including slow-passage effects
through Hopf bifurcation, the role and origin of discontinuities, and the
impact of noise. (40 pages, 17 figures)
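The flavor of the circle-map analysis can be reproduced with a toy example. The sketch below uses a made-up piecewise-expanding circle map with a jump discontinuity (not the paper's input-driven map) and estimates the Lyapunov exponent as the time average of log|f'|, which is provably positive here because both branches are expanding.

```python
import math

def f(theta):
    # Toy discontinuous circle map (made up for illustration): two
    # expanding branches with slopes 2 and 3 and a jump at theta = 0.5.
    if theta < 0.5:
        return (2.0 * theta + 0.37) % 1.0
    return (3.0 * theta + 0.11) % 1.0

def fprime(theta):
    return 2.0 if theta < 0.5 else 3.0

def lyapunov(theta0=0.123, n=100000, burn=1000):
    # Lyapunov exponent of a 1D map: time average of log|f'| along an orbit.
    theta = theta0
    for _ in range(burn):
        theta = f(theta)
    acc = 0.0
    for _ in range(n):
        acc += math.log(fprime(theta))
        theta = f(theta)
    return acc / n

lam = lyapunov()
# Both branches have |f'| > 1, so lam lies between ln 2 and ln 3 and is
# provably positive: nearby phases separate exponentially under the map.
```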
Probing the dynamics of identified neurons with a data-driven modeling approach
In controlling animal behavior, the nervous system has to perform within the operational limits set by the requirements of each specific behavior. The implications for the corresponding range of suitable network, single-neuron, and ion-channel properties have remained elusive. In this article we ask how tightly constrained the properties of neuronal systems are at the single-neuron level. We used large data sets of the activity of isolated, identified invertebrate cells and built an accurate conductance-based model for this cell type using customized automated parameter estimation techniques. By direct inspection of the data we found that the variability of the neurons is larger when they are isolated from the circuit than when they are embedded in the intact system. Furthermore, the responses of the neurons to perturbations appear to be more consistent than their autonomous behavior under stationary conditions. In the developed model, the constraints on different parameters that enforce appropriate model dynamics vary widely, from some very tightly controlled parameters to others that are almost arbitrary. The model also allows us to predict the effect of blocking selected ionic currents and to prove that the origin of irregular dynamics in the neuron model is proper chaoticity, and that this chaoticity is typical in an appropriate sense. Our results indicate that data-driven models are useful tools for the in-depth analysis of neuronal dynamics. The better consistency of responses to perturbations, in the real neurons as well as in the model, suggests a paradigm shift away from measuring autonomous dynamics alone and towards protocols of controlled perturbations. Our predictions for the impact of channel blockers on the neuronal dynamics and the proof of chaoticity underscore the wide scope of our approach.
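One ingredient of such automated parameter estimation can be sketched concretely: given the gating-variable trajectories, the total membrane current of a conductance-based model is linear in the maximal conductances, so those can be fit by ordinary least squares. The regressor traces and conductance values below are synthetic stand-ins for illustration, not the paper's data or method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Given the gating trajectories, the membrane current of a conductance-based
# model is LINEAR in the maximal conductances g_i, so the g_i can be fit by
# ordinary least squares.  The regressors r_i(t) = p_open_i(t) * (V(t) - E_i)
# and the conductance values are synthetic stand-ins for this demo.
T = 500
regressors = rng.standard_normal((T, 3))   # columns: e.g. Na, K, leak
g_true = np.array([120.0, 36.0, 0.3])      # "true" maximal conductances
current = regressors @ g_true + 0.1 * rng.standard_normal(T)  # noisy data

g_fit, *_ = np.linalg.lstsq(regressors, current, rcond=None)
print(g_fit)  # close to [120.0, 36.0, 0.3]
```

Nonlinear parameters (kinetics, half-activation voltages) require the iterative search methods the abstract alludes to; the linear step above is only the easiest piece.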
Extreme phase sensitivity in systems with fractal isochrons
Sensitivity to initial conditions is usually associated with chaotic dynamics
and strange attractors. However, even systems with (quasi)periodic dynamics can
exhibit it. In this context we report on the fractal properties of the
isochrons of some continuous-time asymptotically periodic systems. We define a
global measure of phase sensitivity that we call the phase sensitivity
coefficient and show that it is an invariant of the system related to the
capacity dimension of the isochrons. Similar results are also obtained with
discrete-time systems. As an illustration of the framework, we compute the
phase sensitivity coefficient for popular models of bursting neurons,
suggesting that some elliptic bursting neurons are characterized by isochrons
of high fractal dimensions and exhibit a very sensitive (unreliable) phase
response. (32 pages)
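The notion of asymptotic phase and isochron that the paper builds on can be made concrete with a textbook planar oscillator whose isochrons are known in closed form (smooth spirals here, in contrast to the fractal isochrons reported above).

```python
import math

# Textbook lambda-omega oscillator in polar coordinates:
#   dr/dt = r(1 - r^2),  dtheta/dt = 1 + c(1 - r^2).
# Its asymptotic phase is known in closed form, Theta = theta - c*ln(r):
# Theta advances at unit rate along every trajectory, so its level sets
# are the isochrons (logarithmic spirals for c != 0).
c = 2.0

def step(r, theta, dt=1e-4):
    # forward Euler with a small step
    return r + dt * r * (1.0 - r * r), theta + dt * (1.0 + c * (1.0 - r * r))

def asymptotic_phase(r, theta):
    return theta - c * math.log(r)

r, theta = 0.2, 0.0                  # start well off the limit cycle r = 1
phase0 = asymptotic_phase(r, theta)
for _ in range(50000):               # integrate for total time t = 5
    r, theta = step(r, theta)
print(asymptotic_phase(r, theta) - phase0)   # approximately 5.0 (= elapsed t)
```

The shear parameter c controls how tightly the isochrons wind; large phase sensitivity corresponds to isochrons packed closely together, which is the effect the paper quantifies with its phase sensitivity coefficient.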
Spike-Train Responses of a Pair of Hodgkin-Huxley Neurons with Time-Delayed Couplings
Model calculations have been performed on the spike-train response of a pair
of Hodgkin-Huxley (HH) neurons coupled by recurrent excitatory-excitatory
couplings with time delay. The coupled, excitable HH neurons are assumed to
receive two kinds of spike-train inputs: a transient input consisting of a finite number of impulses, and a sequential input with a constant interspike interval (ISI). The distribution of the output ISI shows a rich variety depending on the coupling strength and the time delay. A comparison is made between the dependence of the output ISI for the transient inputs and that for the sequential inputs. (19 pages, 4 figures)
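The ingredients of such a simulation, two excitable units with reciprocal excitatory coupling and a transmission delay, can be sketched with integrate-and-fire units standing in for the full Hodgkin-Huxley equations; all parameter values below are illustrative, not the paper's.

```python
import numpy as np

dt = 0.1                        # time step, ms
tau_m = 10.0                    # membrane time constant, ms
delay = int(5.0 / dt)           # 5 ms transmission delay, in steps
g_c = 0.3                       # coupling strength (illustrative value)
drive = np.array([1.2, 0.8])    # unit 0 fires on its own; unit 1 is excitable

v = np.zeros(2)
buf = np.zeros((delay, 2), dtype=bool)   # ring buffer of spikes in transit
spikes = [[], []]

for step in range(20000):                # 2000 ms of simulated time
    slot = step % delay
    arriving = buf[slot].copy()          # spikes emitted `delay` steps ago
    buf[slot] = False
    v += dt / tau_m * (drive - v)        # leaky integration
    v += g_c * arriving[::-1]            # delayed kick from the OTHER unit
    fired = v >= 1.0
    v[fired] = 0.0                       # reset after a spike
    buf[slot] = fired                    # schedule delivery after the delay
    for i in np.flatnonzero(fired):
        spikes[i].append(step * dt)

# Unit 1 has subthreshold drive, so every one of its spikes is caused by a
# delayed excitatory kick from unit 0; the output ISI distribution of both
# units depends on the coupling strength and the delay, as in the abstract.
```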
Dynamical principles in neuroscience
Dynamical modeling of neural systems and brain functions has a history of success over the last half century. This includes, for example, the explanation and prediction of some features of neural rhythmic behaviors. Many interesting dynamical models of learning and memory based on physiological experiments have been suggested over the last two decades. Dynamical models even of consciousness now exist. Usually these models and results are based on traditional approaches and paradigms of nonlinear dynamics including dynamical chaos. Neural systems are, however, an unusual subject for nonlinear dynamics for several reasons: (i) Even the simplest neural network, with only a few neurons and synaptic connections, has an enormous number of variables and control parameters. These make neural systems adaptive and flexible, and are critical to their biological function. (ii) In contrast to traditional physical systems described by well-known basic principles, first principles governing the dynamics of neural systems are unknown. (iii) Many different neural systems exhibit similar dynamics despite having different architectures and different levels of complexity. (iv) The network architecture and connection strengths are usually not known in detail and therefore the dynamical analysis must, in some sense, be probabilistic. (v) Since nervous systems are able to organize behavior based on sensory inputs, the dynamical modeling of these systems has to explain the transformation of temporal information into combinatorial or combinatorial-temporal codes, and vice versa, for memory and recognition. In this review these problems are discussed in the context of addressing the stimulating questions: What can neuroscience learn from nonlinear dynamics, and what can nonlinear dynamics learn from neuroscience? (This work was supported by NSF Grants No. NSF/EIA-0130708 and No. PHY-0414174; NIH Grants No. 1 R01 NS50945 and No. NS40110; MEC BFI2003-07276; and Fundación BBVA.)
Transmission of Information in Active Networks
Shannon's Capacity Theorem is the central concept of the Theory of
Communication. It states that if the amount of information contained in a
signal is smaller than the channel capacity of a physical medium of
communication, the signal can be transmitted with arbitrarily small
probability of error. This theorem usually applies to ideal channels of
communication, in which the information to be transmitted does not alter the
passive characteristics of the channel, which basically tries to reproduce
the source of information. For an {\it active channel}, a network formed by
elements that are dynamical systems (such as neurons, or chaotic or periodic
oscillators), it is unclear whether the theorem applies, since an active
channel can adapt to the input of a signal, altering its capacity. To shed
light on this matter, we show, among other results, how to calculate the
information capacity of an active channel of communication. We then show that
the {\it channel capacity} depends on whether the active channel is
self-excitable or not and that, contrary to a current belief,
desynchronization can provide an environment in which large amounts of
information can be transmitted in a channel that is self-excitable. An
interesting case of a self-excitable active channel is a network of
electrically connected Hindmarsh-Rose chaotic neurons. (15 pages, 5 figures;
to appear in Phys. Rev.)
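For contrast with the active channels studied here, Shannon's result is easy to state for the simplest passive channel: the capacity of a binary symmetric channel with crossover probability eps is C = 1 - H2(eps) bits per use, where H2 is the binary entropy.

```python
import math

def h2(p):
    # binary entropy in bits
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def bsc_capacity(eps):
    # Capacity of a binary symmetric channel with crossover probability eps:
    # C = 1 - H2(eps) bits per channel use.
    return 1.0 - h2(eps)

print(bsc_capacity(0.0))              # 1.0  (noiseless channel)
print(bsc_capacity(0.5))              # 0.0  (output independent of input)
print(round(bsc_capacity(0.11), 3))   # about 0.5 bit per use
```

In the passive case the capacity is fixed by the noise level alone; the point of the abstract is that for an active channel the capacity depends on the channel's own dynamical response to the signal.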