
    Electrotonic signal processing in AII amacrine cells: compartmental models and passive membrane properties for a gap junction-coupled retinal neuron

    Amacrine cells are critical for processing of visual signals, but little is known about their electrotonic structure and passive membrane properties. AII amacrine cells are multifunctional interneurons in the mammalian retina and essential for both rod- and cone-mediated vision. Their dendrites are the site of both input and output chemical synapses and gap junctions that form electrically coupled networks. This electrical coupling is a challenge for developing realistic computer models of single neurons. Here, we combined multiphoton microscopy and electrophysiological recording from dye-filled AII amacrine cells in rat retinal slices to develop morphologically accurate compartmental models. Passive cable properties were estimated by directly fitting the current responses of the models evoked by voltage pulses to the physiologically recorded responses, obtained after blocking electrical coupling. The average best-fit parameters (obtained at −60 mV and ~25 °C) were 0.91 ”F cm−2 for specific membrane capacitance, 198 Ω cm for cytoplasmic resistivity, and 30 kΩ cm2 for specific membrane resistance. We examined the passive signal transmission between the cell body and the dendrites by the electrotonic transform and quantified the frequency-dependent voltage attenuation in response to sinusoidal current stimuli. There was significant frequency-dependent attenuation, most pronounced for signals generated at the arboreal dendrites and propagating towards the soma and lobular dendrites. In addition, we explored the consequences of the electrotonic structure for interpreting currents in somatic, whole-cell voltage-clamp recordings. The results indicate that AII amacrines cannot be characterized as electrotonically compact and suggest that their morphology and passive properties can contribute significantly to signal integration and processing.
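The reported best-fit passive parameters directly fix two classical cable quantities. A minimal sketch of that arithmetic (the 1 ”m dendrite diameter is an illustrative assumption, not a value from the paper):

```python
import math

# Best-fit passive parameters reported for AII amacrine cells (~-60 mV, ~25 degC)
C_m = 0.91e-6   # specific membrane capacitance, F/cm^2
R_i = 198.0     # cytoplasmic (axial) resistivity, Ohm*cm
R_m = 30e3      # specific membrane resistance, Ohm*cm^2

# Membrane time constant: tau_m = R_m * C_m
tau_ms = R_m * C_m * 1e3
print(f"tau_m = {tau_ms:.1f} ms")   # ~27.3 ms

# Steady-state length constant of a cylindrical dendrite of diameter d:
# lambda = sqrt(R_m * d / (4 * R_i)); d = 1 um here is purely illustrative
d = 1e-4  # cm
lam_um = math.sqrt(R_m * d / (4 * R_i)) * 1e4
print(f"lambda = {lam_um:.0f} um")
```

A length constant of roughly 0.6 mm may look generous, but the frequency-dependent attenuation reported in the abstract means transient synaptic signals decay far more steeply than this DC figure suggests.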

    Imperfect Space Clamp Permits Electrotonic Interactions between Inhibitory and Excitatory Synaptic Conductances, Distorting Voltage Clamp Recordings

    The voltage clamp technique is frequently used to examine the strength and composition of synaptic input to neurons. Even accounting for imperfect voltage control of the entire cell membrane (“space clamp”), it is often assumed that currents measured at the soma are a proportional indicator of the postsynaptic conductance. Here, using NEURON simulation software to model somatic recordings from morphologically realistic neurons, we show that excitatory conductances recorded in voltage clamp mode are distorted significantly by neighboring inhibitory conductances, even when the postsynaptic membrane potential starts at the reversal potential of the inhibitory conductance. Analogous effects are observed when inhibitory postsynaptic currents are recorded at the reversal potential of the excitatory conductance. Escape potentials in poorly clamped dendrites reduce the amplitude of excitatory or inhibitory postsynaptic currents recorded at the reversal potential of the other conductance. In addition, unclamped postsynaptic inhibitory conductances linearize the recorded current-voltage relationship of excitatory inputs comprising AMPAR- and NMDAR-mediated components, leading to significant underestimation of the relative contribution by NMDARs, which are particularly sensitive to small perturbations in membrane potential. Voltage clamp accuracy varies substantially between neurons and dendritic arbors of different morphology; as expected, more reliable recordings are obtained from dendrites near the soma, but up to 80% of the synaptic signal on thin, distant dendrites may be lost when postsynaptic interactions are present. These limitations of the voltage clamp technique may explain how postsynaptic effects on synaptic transmission could, in some cases, be attributed incorrectly to presynaptic mechanisms.
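The core effect can be reproduced with a two-compartment steady-state calculation, far simpler than the paper's NEURON models: a dendritic compartment coupled to a perfectly clamped soma through an axial conductance. All conductance and potential values below are illustrative, not taken from the paper:

```python
# Conductances in nS, potentials in mV, currents in pA (illustrative values).

def dendrite_v(g_L, g_E, g_I, g_ax, E_L, E_E, E_I, V_clamp):
    """Steady-state potential of a single passive dendritic compartment."""
    num = g_L * E_L + g_E * E_E + g_I * E_I + g_ax * V_clamp
    return num / (g_L + g_E + g_I + g_ax)

def measured_epsc(g_I):
    """Somatic clamp current with the soma held at the inhibitory reversal E_I."""
    g_L, g_E, g_ax = 5.0, 10.0, 20.0
    E_L = E_I = V_clamp = -70.0
    E_E = 0.0
    V_d = dendrite_v(g_L, g_E, g_I, g_ax, E_L, E_E, E_I, V_clamp)
    return g_ax * (V_d - V_clamp)  # axial current escaping into the soma

no_inhibition = measured_epsc(g_I=0.0)     # 400 pA
with_inhibition = measured_epsc(g_I=20.0)  # ~255 pA
print(no_inhibition, with_inhibition)
```

Even though the soma sits exactly at the inhibitory reversal potential, the dendrite escapes to a depolarized potential where the inhibitory conductance passes outward current, shunting the escape and shrinking the recorded "excitatory" current, which is the distortion the abstract describes.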

    Using a virtual cortical module implementing a neural field model to modulate brain rhythms in Parkinson’s disease

    We propose a new method for selective modulation of cortical rhythms based on neural field theory, in which the activity of a cortical area is extensively monitored using a two-dimensional microelectrode array. The example of Parkinson’s disease illustrates the proposed method, in which a neural field model is assumed to accurately describe experimentally recorded activity. In addition, we propose a new closed-loop stimulation signal that is both space- and time-dependent. This method is especially designed to modulate a targeted brain rhythm specifically, without interfering with other rhythms. A new class of neuroprosthetic devices is also proposed, in which the multielectrode array is seen as an artificial neural network interacting with biological tissue. Such a bio-inspired approach may provide a solution to optimize interactions between the stimulation device and the cortex, so as to attenuate or augment specific cortical rhythms. The next step will be to validate this new approach experimentally in patients with Parkinson’s disease.
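The closed-loop principle, stripped of the neural field machinery, can be sketched with a single driven oscillator standing in for a targeted beta-band rhythm: feedback proportional to the rhythm's rate of change adds damping only at that rhythm. The 20 Hz frequency and gain value are assumptions for illustration:

```python
import math

def simulate(gain, f=20.0, zeta=0.02, dt=1e-4, t_end=2.0):
    """Driven damped oscillator with velocity-feedback stimulation u = -gain * v."""
    w = 2 * math.pi * f
    x, v = 0.0, 0.0
    steady_amps = []
    steps = int(t_end / dt)
    for i in range(steps):
        drive = math.cos(w * i * dt)   # exogenous drive at the target rhythm
        u = -gain * v                   # closed-loop stimulation: extra damping
        a = drive + u - 2 * zeta * w * v - w * w * x
        v += a * dt                     # semi-implicit Euler (stable for oscillators)
        x += v * dt
        if i > steps // 2:              # record amplitude after transients decay
            steady_amps.append(abs(x))
    return max(steady_amps)

open_loop = simulate(gain=0.0)
closed_loop = simulate(gain=50.0)
print(open_loop, closed_loop)  # feedback attenuates the targeted rhythm
```

In the proposed method the feedback is additionally space-dependent, computed from the full two-dimensional array so that only the targeted spatiotemporal mode is damped; this toy collapses that to one temporal mode.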

    Sequential estimation of intrinsic activity and synaptic input in single neurons by particle filtering with optimal importance density

    This paper deals with the problem of inferring the signals and parameters that cause neural activity to occur. While the ultimate challenge is to unveil the brain’s connectivity, here we focus on a microscopic vision of the problem, where single neurons (potentially connected to a network of peers) are at the core of our study. The sole observations available are noisy, sampled voltage traces obtained from intracellular recordings. We design algorithms and inference methods using the tools provided by stochastic filtering that allow a probabilistic interpretation and treatment of the problem. Using particle filtering, we are able to reconstruct traces of voltages and estimate the time course of auxiliary variables. By extending the algorithm through PMCMC methodology, we are able to estimate hidden physiological parameters as well, such as intrinsic conductances or reversal potentials. Last, but not least, the method is applied to estimate synaptic conductances arriving at a target cell, thus reconstructing the synaptic excitatory/inhibitory input traces. Notably, the performance of these estimations achieves the theoretical lower bounds even in spiking regimes.
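The filtering machinery is easiest to see on a toy model. The paper uses an optimal importance density and biophysical neuron models; the sketch below substitutes the simplest bootstrap proposal and a linear-Gaussian AR(1) latent state, so it illustrates only the generic particle-filter loop (propagate, weight, resample):

```python
import math, random

random.seed(0)

A, Q, R = 0.95, 0.1, 0.5    # dynamics, process noise var, observation noise var
T, N = 200, 500             # time steps, number of particles

# Simulate ground-truth latent state and noisy observations
x_true, ys = [0.0], []
for t in range(T):
    x_true.append(A * x_true[-1] + random.gauss(0, math.sqrt(Q)))
    ys.append(x_true[-1] + random.gauss(0, math.sqrt(R)))

particles = [0.0] * N
estimates = []
for y in ys:
    # Propagate each particle with the prior dynamics (bootstrap proposal)
    particles = [A * p + random.gauss(0, math.sqrt(Q)) for p in particles]
    # Weight by the Gaussian likelihood of the observation
    ws = [math.exp(-(y - p) ** 2 / (2 * R)) for p in particles]
    s = sum(ws)
    ws = [w / s for w in ws]
    estimates.append(sum(w * p for w, p in zip(ws, particles)))
    # Multinomial resampling to fight weight degeneracy
    particles = random.choices(particles, weights=ws, k=N)

rmse = math.sqrt(sum((e - x) ** 2 for e, x in zip(estimates, x_true[1:])) / T)
print(f"RMSE = {rmse:.3f}")  # well below the observation noise std (~0.71)
```

In the paper's setting the latent state is the vector of voltage and auxiliary gating variables, and the optimal importance density replaces the prior proposal to keep the particle cloud efficient in spiking regimes.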

    Lognormal firing rate distribution reveals prominent fluctuation-driven regime in spinal motor networks

    When spinal circuits generate rhythmic movements it is important that the neuronal activity remains within stable bounds to avoid saturation and to preserve responsiveness. Here, we simultaneously record from hundreds of neurons in lumbar spinal circuits of turtles and establish the neuronal fraction that operates within either a ‘mean-driven’ or a ‘fluctuation-driven’ regime. Fluctuation-driven neurons have a ‘supralinear’ input-output curve, which enhances sensitivity, whereas the mean-driven regime reduces sensitivity. We find a rich diversity of firing rates across the neuronal population as reflected in a lognormal distribution and demonstrate that half of the neurons spend at least 50% of the time in the ‘fluctuation-driven’ regime regardless of behavior. Because of the disparity in input-output properties for these two regimes, this fraction may reflect a fine trade-off between stability and sensitivity in order to maintain flexibility across behaviors. DOI: http://dx.doi.org/10.7554/eLife.18805.00
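The signature of a lognormal rate distribution is heavy right skew: the mean exceeds the median, and a small minority of neurons accounts for a large share of the spikes. A sketch with arbitrary parameters (not fitted to the turtle data):

```python
import random

random.seed(1)

mu, sigma = 1.0, 1.0          # parameters of log(rate); illustrative only
rates = [random.lognormvariate(mu, sigma) for _ in range(10000)]

rates.sort()
median = rates[len(rates) // 2]
mean = sum(rates) / len(rates)
top_10pct_share = sum(rates[-1000:]) / sum(rates)

print(f"mean = {mean:.2f} Hz, median = {median:.2f} Hz")
print(f"top 10% of neurons carry {100 * top_10pct_share:.0f}% of the spikes")
```

For sigma = 1 the top decile carries roughly 40% of the total rate, which is why population averages can be dominated by a few mean-driven, high-rate neurons while most of the network sits in the sparse fluctuation-driven regime.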

    Statistical Physics and Representations in Real and Artificial Neural Networks

    This document presents the material of two lectures on statistical physics and neural representations, delivered by one of us (R.M.) at the Fundamental Problems in Statistical Physics XIV summer school in July 2017. In the first part, we consider the neural representations of space (maps) in the hippocampus. We introduce an extension of the Hopfield model, able to store multiple spatial maps as continuous, finite-dimensional attractors. The phase diagram and dynamical properties of the model are analyzed. We then show how spatial representations can be dynamically decoded using an effective Ising model capturing the correlation structure in the neural data, and compare applications to data obtained from hippocampal multi-electrode recordings and by (sub)sampling our attractor model. In the second part, we focus on the problem of learning data representations in machine learning, in particular with artificial neural networks. We start by introducing data representations through some illustrations. We then analyze two important algorithms, Principal Component Analysis and Restricted Boltzmann Machines, with tools from statistical physics.
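The starting point of the first lecture, the classical Hopfield model with point attractors (before the extension to continuous spatial maps), fits in a few lines: Hebbian couplings store patterns, and zero-temperature asynchronous dynamics clean up a corrupted cue:

```python
import random

random.seed(2)

N, P = 100, 3  # neurons and stored patterns; load P/N well below capacity ~0.14
patterns = [[random.choice((-1, 1)) for _ in range(N)] for _ in range(P)]

# Hebbian couplings J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, with J_ii = 0
J = [[0.0] * N for _ in range(N)]
for i in range(N):
    for j in range(N):
        if i != j:
            J[i][j] = sum(p[i] * p[j] for p in patterns) / N

def recall(state, sweeps=5):
    """Asynchronous zero-temperature dynamics: align each spin with its local field."""
    s = state[:]
    for _ in range(sweeps):
        for i in random.sample(range(N), N):
            h = sum(J[i][j] * s[j] for j in range(N))
            s[i] = 1 if h >= 0 else -1
    return s

# Corrupt 15% of the first pattern, then let the dynamics retrieve it
probe = patterns[0][:]
for i in random.sample(range(N), 15):
    probe[i] = -probe[i]
overlap = sum(a * b for a, b in zip(recall(probe), patterns[0])) / N
print(f"overlap with stored pattern: {overlap:.2f}")  # close to 1.0
```

The lectures' extension replaces these discrete point attractors with continuous, finite-dimensional attractors so that each stored "pattern" becomes an entire spatial map over which a bump of activity can move.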

    Information efficacy of a dynamic synapse

