
    Computational Properties of Cerebellar Nucleus Neurons: Effects of Stochastic Ion Channel Gating and Input Location

    The function of the nervous system is shaped by the refined integration of synaptic inputs that takes place at the level of single neurons. Gain modulation is a computational principle used widely across the brain, in which the response of a neuronal unit to one set of inputs is scaled multiplicatively by a second set of inputs, without any effect on its selectivity. The arithmetic operations performed by pyramidal cells in cortical brain areas have been well characterised, along with the underlying mechanisms at the level of networks and cells, for instance background synaptic noise and dendritic saturation. However, despite the vast amount of research on the cerebellum and its function, little is known about the neuronal computations carried out by its cellular components. A particular area of interest is the cerebellar nuclei, the main output gate of the cerebellum to the brain stem and cortical areas. The aim of this thesis is to contribute to an understanding of the arithmetic operations performed by neurons in the cerebellar nuclei. Focus is placed on two putative determinants: the location of the synaptic input and the presence of channel noise. To analyse the effect of channel noise, the known voltage-gated ion channels of a cerebellar nucleus neuron model are translated into stochastic Markov formalisms and their electrophysiological behaviour is compared with that of their deterministic Hodgkin-Huxley counterparts. The findings demonstrate that in most cases the behaviour of the stochastic channels matches that of the reference deterministic models, with the notable exception of voltage-gated channels with fast kinetics. Two potential explanations are suggested for this discrepancy. Firstly, channels with fast kinetics are strongly affected by the artefactual loss of gating events caused by the use of a finite-length simulation time step; this effect can be mitigated, in part, by using very small time steps. Secondly, simulation artefacts arise from the rectification of the open-channel distribution when the channel kinetics permit a window current whose time-averaged equilibrium is close to zero. Further, stochastic gating is implemented in a realistic cerebellar nucleus neuronal model. The resulting stochastic model exhibits probabilistic spiking and an output rate similar to that of the corresponding deterministic cerebellar nucleus neuronal model. However, the outcomes of this thesis indicate that the computational properties of the cerebellar nucleus neuronal model are independent of the presence of ion channel noise. The main result of this thesis is that the location of the synaptic input determines the computational properties of the single neuron, in both the cerebellar nucleus and layer Vb pyramidal neuronal models. The extent of multiplication increases systematically with distance from the soma in the cerebellar nucleus neuron, but not in the layer Vb pyramidal neuron, where it is smaller than would be expected for the distance from the soma. For both neurons, the underlying mechanism is related to the combined effect of the nonlinearities introduced by dendritic saturation and the noise of the synaptic input. However, while excitatory inputs to perisomatic areas of the cerebellar nucleus neuron undergo additive operations and inputs to distal areas undergo multiplicative ones, in the layer Vb pyramidal neuron the integration of the excitatory driving input is always multiplicative. In addition, the change in gain is sensitive to the synchronicity of the excitatory synaptic input in the layer Vb pyramidal neuron, but not in the cerebellar nucleus neuron. These observations indicate that the same gain control mechanism may be utilised in distinct ways, in different computational contexts and across different brain areas, depending on the neuronal type and its function.
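    The comparison between stochastic Markov channels and their deterministic Hodgkin-Huxley counterparts can be illustrated with a minimal sketch. The rate functions, channel count, and time step below are invented placeholders, not the thesis's channel models; the sketch only shows the style of comparison, binomial sampling of per-step transitions against the deterministic open-fraction ODE, under voltage clamp.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed (illustrative) voltage-dependent rate functions, in 1/ms.
alpha = lambda v: 0.10 * np.exp(v / 20.0)    # opening rate
beta  = lambda v: 0.05 * np.exp(-v / 20.0)   # closing rate

N_CH = 1000      # channels in the population
dt   = 0.01      # ms; fast kinetics need small dt to avoid losing gating events
v    = -40.0     # clamped membrane potential, mV
a, b = alpha(v), beta(v)

n_open = 0       # stochastic open-channel count
p_open = 0.0     # deterministic Hodgkin-Huxley open fraction
for _ in range(2000):
    # Stochastic Markov step: sample how many channels switch state this step.
    opened = rng.binomial(N_CH - n_open, 1.0 - np.exp(-a * dt))
    closed = rng.binomial(n_open, 1.0 - np.exp(-b * dt))
    n_open += opened - closed
    # Deterministic counterpart: dp/dt = a * (1 - p) - b * p
    p_open += dt * (a * (1.0 - p_open) - b * p_open)

print(f"stochastic open fraction: {n_open / N_CH:.3f}")
print(f"deterministic open fraction: {p_open:.3f} (steady state {a / (a + b):.3f})")
```

    Shrinking dt tightens the match; with fast rate functions and a coarse dt, the per-step sampling misses round-trip transitions within a step and the stochastic occupancy drifts away from the deterministic value, a small-scale analogue of the lost-gating-events artefact described above.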

    Artificial Dendritic Neuron: A Model of Computation and Learning Algorithm

    Dendrites are root-like extensions from the neuron cell body and have long been thought to serve as the predominant input structures of neurons. Since the early twentieth century, neuroscience research has attempted to define the dendrite's contribution to neural computation and signal integration. This body of experimental and modeling research strongly indicates that dendrites are not just input structures but are crucial to neural processing. Dendritic processing consists of both active and passive elements that utilize the spatial, electrical and connective properties of the dendritic tree. This work presents a neuron model based on the structure and properties of dendrites. This research assesses the computational benefits and requirements of adding dendrites to a spiking artificial neuron model. A list of the computational properties of actual dendrites that have shaped this work is given. An algorithm capable of generating and training a network of dendritic neurons is created as an investigative tool through which computational challenges and attributes are explored. This work assumes that dendrites provide a necessary and beneficial function to biological intelligence (BI) and that their translation into the artificial intelligence (AI) realm would broaden the capabilities and improve the realism of artificial neural network (ANN) research. To date there have been only a few instances in which neural network-based AI research has ventured beyond the point neuron; therefore, the work presented here should be viewed as exploratory. The contribution to AI made by this work is an implementation of the artificial dendritic (AD) neuron model and an algorithm for training AD neurons on spatially distributed inputs with dendrite-like connectivity.
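    The abstract does not spell out the AD neuron's equations, but the general shape of such a model, spatially grouped inputs feeding separate branch nonlinearities whose outputs are summed and thresholded at the soma, can be sketched as follows. Everything concrete here (sigmoid branches, weights, thresholds) is an illustrative assumption, not the model from the work itself.

```python
import numpy as np

def branch_out(x, w, theta=0.5):
    """One dendritic branch: weighted local sum through a sigmoidal nonlinearity."""
    return 1.0 / (1.0 + np.exp(-(w @ x - theta)))

def ad_neuron(x_by_branch, w_by_branch, soma_theta=1.0):
    """Spike (1) if the branch outputs, summed at the soma, cross threshold."""
    drive = sum(branch_out(x, w) for x, w in zip(x_by_branch, w_by_branch))
    return int(drive > soma_theta)

rng = np.random.default_rng(1)
weights = rng.normal(size=(3, 4))   # 3 branches, 4 synapses each (placeholder)
inputs = rng.random((3, 4))         # spatially grouped input activity
print(ad_neuron(inputs, weights))
```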

    A synaptic learning rule for exploiting nonlinear dendritic computation

    Information processing in the brain depends on the integration of synaptic input distributed throughout neuronal dendrites. Dendritic integration is a hierarchical process, proposed to be equivalent to integration by a multilayer network, potentially endowing single neurons with substantial computational power. However, whether neurons can learn to harness dendritic properties to realize this potential is unknown. Here, we develop a learning rule from dendritic cable theory and use it to investigate the processing capacity of a detailed pyramidal neuron model. We show that computations using spatial or temporal features of synaptic input patterns can be learned, and even synergistically combined, to solve a canonical nonlinear feature-binding problem. The voltage dependence of the learning rule drives coactive synapses to engage dendritic nonlinearities, whereas spike-timing dependence shapes the time course of subthreshold potentials. Dendritic input-output relationships can therefore be flexibly tuned through synaptic plasticity, allowing optimal implementation of nonlinear functions by single neurons.
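    As a rough illustration of the idea that plasticity can recruit dendritic nonlinearities, the toy below trains a small two-layer "branch" model on an XOR-like feature-binding task. The update is ordinary gradient descent whose branch term is gated by the local sigmoid slope, loosely echoing the paper's voltage dependence; it is not the cable-theory rule itself, and all sizes and rates are invented.

```python
import numpy as np

rng = np.random.default_rng(2)
sig = lambda u: 1.0 / (1.0 + np.exp(-u))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)  # two binary features
y = np.array([0.0, 1.0, 1.0, 0.0])                      # XOR: nonlinear binding

W = rng.normal(size=(3, 2))   # synapses onto 3 dendritic branches
bW = np.zeros(3)              # branch offsets
c = rng.normal(size=3)        # branch-to-soma couplings
bc, eta = 0.0, 1.0

for _ in range(10000):
    b = sig(X @ W.T + bW)             # branch "voltages" through the nonlinearity
    out = sig(b @ c + bc)             # somatic output
    g = (out - y) * out * (1 - out)   # output error signal
    # Synaptic update is gated by the local branch slope b*(1-b): coactive
    # synapses on branches in the nonlinear range change the most.
    d_branch = (g[:, None] * c) * b * (1 - b)
    W -= eta * d_branch.T @ X
    bW -= eta * d_branch.sum(axis=0)
    c -= eta * b.T @ g
    bc -= eta * g.sum()

# Should approach [0, 1, 1, 0]; exact values depend on the random seed.
print(np.round(sig(sig(X @ W.T + bW) @ c + bc), 2))
```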

    Global and Multiplexed Dendritic Computations under In Vivo-like Conditions.

    Dendrites integrate inputs nonlinearly, but it is unclear how these nonlinearities contribute to the overall input-output transformation of single neurons. We developed statistically principled methods using a hierarchical cascade of linear-nonlinear subunits (hLN) to model the dynamically evolving somatic response of neurons receiving complex, in vivo-like spatiotemporal synaptic input patterns. We used the hLN to predict the somatic membrane potential of an in vivo-validated detailed biophysical model of a L2/3 pyramidal cell. Linear input integration with a single global dendritic nonlinearity achieved above 90% prediction accuracy. A novel hLN motif, input multiplexing into parallel processing channels, could improve predictions as much as conventionally used additional layers of local nonlinearities. We obtained similar results in two other cell types. This approach provides a data-driven characterization of a key component of cortical circuit computations: the input-output transformation of neurons under in vivo-like conditions.
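    The hLN architecture itself is easy to caricature: each subunit linearly filters its weighted inputs in time, applies a nonlinearity, and feeds its parent. The sketch below builds a minimal two-subunit tree with exponential kernels and sigmoids; the kernels, weights, and tree shape are placeholders rather than the fitted models from the paper.

```python
import numpy as np

def exp_kernel(tau, length=100):
    """Normalized causal exponential filter kernel."""
    k = np.exp(-np.arange(length) / tau)
    return k / k.sum()

def subunit(inputs, weights, tau, gain=4.0):
    """Linear-nonlinear subunit: weighted sum, temporal filter, sigmoid."""
    drive = weights @ inputs                                   # (time,)
    filtered = np.convolve(drive, exp_kernel(tau))[: inputs.shape[1]]
    return 1.0 / (1.0 + np.exp(-gain * (filtered - 0.5)))

rng = np.random.default_rng(3)
spikes = rng.binomial(1, 0.05, size=(20, 1000))  # 20 synapses x 1000 time bins

d1 = subunit(spikes[:10], rng.random(10), tau=10.0)   # two dendritic subunits
d2 = subunit(spikes[10:], rng.random(10), tau=10.0)
soma = subunit(np.vstack([d1, d2]), np.ones(2), tau=5.0)  # parent integrates children
print(soma[:5])
```

    Collapsing the tree to a single subunit gives the "global dendritic nonlinearity" baseline mentioned in the abstract; multiplexing would route the same inputs through parallel subunits with different kernels.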

    Statistical physics of neural systems with non-additive dendritic coupling

    How neurons process their inputs crucially determines the dynamics of biological and artificial neural networks. In such neural and neural-like systems, synaptic input is typically considered to be merely transmitted linearly or sublinearly by the dendritic compartments. Yet, single-neuron experiments report pronounced supralinear dendritic summation of sufficiently synchronous and spatially nearby inputs. Here, we provide a statistical physics approach to study the impact of such non-additive dendritic processing on single neuron responses and the performance of associative memory tasks in artificial neural networks. First, we compute the effect of random input to a neuron incorporating nonlinear dendrites. This approach is independent of the details of the neuronal dynamics. Second, we use those results to study the impact of dendritic nonlinearities on the network dynamics in a paradigmatic model for associative memory, both numerically and analytically. We find that dendritic nonlinearities maintain network convergence and increase the robustness of memory performance against noise. Interestingly, an intermediate number of dendritic branches is optimal for memory functionality.
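    A minimal sketch of what non-additive dendritic coupling can mean in an associative-memory setting: a standard Hebbian Hopfield network whose synapses are partitioned onto branches, with each branch's summed input amplified supralinearly before reaching the soma. The branch count and the 1.5-power supralinearity are illustrative choices, not the paper's parametrisation.

```python
import numpy as np

rng = np.random.default_rng(4)
N, P, B = 100, 5, 10                       # neurons, stored patterns, branches

patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns) / N            # Hebbian outer-product weights
np.fill_diagonal(W, 0.0)

def dendritic_field(w_row, s):
    """Sum synaptic input per branch, amplify supralinearly, then sum branches."""
    per_branch = (w_row * s).reshape(B, -1).sum(axis=1)
    return (np.sign(per_branch) * np.abs(per_branch) ** 1.5).sum()

s = patterns[0].copy()
s[:20] *= -1                               # corrupt 20% of the stored pattern
for _ in range(10):                        # recovery sweeps, random update order
    for i in rng.permutation(N):
        s[i] = 1 if dendritic_field(W[i], s) >= 0 else -1

print("overlap with stored pattern:", s @ patterns[0] / N)
```

    With a linear branch function (exponent 1.0) the update reduces to the classical Hopfield rule; raising the exponent emphasises branches that receive coherent input, which is one way to read the robustness result summarised above.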