
    Optimization and parallelization of tensor and ODE/PDE computations on GPU

    We propose a multi-level GPU-based parallelization algorithm to solve the multi-compartment Hodgkin-Huxley (HH) model equation, which requires solving the Hines matrix. We use a ‘parallel-in-time’ algorithm (like the Parareal strategy) to obtain outer-level parallelism, and an Exact Domain Decomposition (EDD) algorithm with fine decomposition for inner-level parallelism. We show that our technique can also be applied to any differential equation, like the heat equation, that induces tridiagonal systems. Typically, a solution of the HH equation runs for hundreds to tens of thousands of time-steps, solving a Hines matrix at each time step. Previous solutions to this problem, by Michael Mascagni et al. (1991) and Hines et al. (2008), tackled only solving the Hines matrix in parallel. Our approach uses the dynamic parallelism of CUDA to achieve multi-level parallelism on GPUs. Our solution outperforms the sequential time-stepping method on standard neuron morphologies by up to 2.5x. We also show that the iterative part of the Parareal method converges in 5-7 iterations on average, with an accuracy of 10⁻⁶. Finally, we propose a GPU optimization for the Higher Order Tensor Renormalization Group (HOTRG) problem, where the tensor contraction operations inside HOTRG are optimized by a multi-GPU implementation using the cuBLASXt API.
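
    As a rough, hedged illustration of the parallel-in-time idea the outer level builds on, the sketch below implements a minimal Parareal iteration in Python for a toy scalar ODE. The solver choices, the test equation and the names (`coarse`, `fine`, `parareal`) are illustrative assumptions, not the authors' CUDA implementation; the point is that the fine solves on each time slice are independent of one another and are the step that can run in parallel.

```python
import numpy as np

# Toy ODE du/dt = f(u); Parareal alternates a cheap serial coarse sweep with
# fine solves that are independent across time slices (the parallelizable step).
def f(u):
    return -u  # illustrative linear decay; stands in for the HH right-hand side

def coarse(u0, t0, t1):
    # one forward-Euler step: the cheap serial propagator G
    return u0 + (t1 - t0) * f(u0)

def fine(u0, t0, t1, substeps=100):
    # many small Euler steps: the expensive propagator F, run per slice in parallel
    dt = (t1 - t0) / substeps
    u = u0
    for _ in range(substeps):
        u = u + dt * f(u)
    return u

def parareal(u0, T=1.0, slices=10, iters=7, tol=1e-6):
    t = np.linspace(0.0, T, slices + 1)
    U = np.empty(slices + 1); U[0] = u0
    for n in range(slices):                      # initial coarse sweep
        U[n + 1] = coarse(U[n], t[n], t[n + 1])
    for k in range(iters):
        # fine solves: each slice depends only on the previous iterate
        F = np.array([fine(U[n], t[n], t[n + 1]) for n in range(slices)])
        U_new = np.empty_like(U); U_new[0] = u0
        for n in range(slices):                  # serial correction sweep
            U_new[n + 1] = (coarse(U_new[n], t[n], t[n + 1])
                            + F[n] - coarse(U[n], t[n], t[n + 1]))
        converged = np.max(np.abs(U_new - U)) < tol   # cf. the 5-7 iterations
        U = U_new                                      # reported above
        if converged:
            break
    return t, U

t, U = parareal(1.0)
print(np.max(np.abs(U - np.exp(-t))))  # error against the exact solution e^{-t}
```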

    Simulation of networks of spiking neurons: A review of tools and strategies

    We review different aspects of the simulation of spiking neural networks. We start by reviewing the different types of simulation strategies and algorithms that are currently implemented. We next review the precision of those simulation strategies, in particular in cases where plasticity depends on the exact timing of the spikes. We overview different simulators and simulation environments presently available (restricted to those that are freely available, open source and documented). For each simulation tool, its advantages and pitfalls are reviewed, with the aim of allowing the reader to identify which simulator is appropriate for a given task. Finally, we provide a series of benchmark simulations of different types of networks of spiking neurons, including Hodgkin-Huxley-type and integrate-and-fire models, interacting through current-based or conductance-based synapses, using clock-driven or event-driven integration strategies. The same set of models is implemented on the different simulators, and the codes are made available. The ultimate goal of this review is to provide a resource to facilitate identifying the appropriate integration strategy and simulation tool to use for a given modeling problem related to spiking neural networks. (49 pages, 24 figures, 1 table; review article, Journal of Computational Neuroscience, in press, 2007.)
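
    To make the distinction between the integration strategies benchmarked here concrete, the following is a minimal Python sketch of a leaky integrate-and-fire neuron advanced both clock-driven (fixed time step, spike times aligned to a grid) and event-driven (exact jumps between input events). Parameter values and function names are illustrative assumptions, not taken from the review or from any of the simulators it covers.

```python
import numpy as np

TAU, V_TH, V_RESET, W = 20.0, 1.0, 0.0, 0.55  # ms; threshold, reset, synaptic weight

def clock_driven(spike_times, dt=0.1, t_end=100.0):
    # advance V on a fixed grid; input spikes are aligned to the grid,
    # so output spike times carry an error of order dt
    v, out = 0.0, []
    spikes = set(np.round(np.array(spike_times) / dt).astype(int))
    for i in range(int(t_end / dt)):
        v += dt * (-v / TAU)        # leak
        if i in spikes:
            v += W                  # delta-current synapse
        if v >= V_TH:
            out.append(i * dt); v = V_RESET
    return out

def event_driven(spike_times):
    # jump analytically from one input event to the next: between events V
    # only decays exponentially, so no grid is needed and the result is exact
    v, t, out = 0.0, 0.0, []
    for ts in spike_times:
        v *= np.exp(-(ts - t) / TAU)   # exact inter-event decay
        t = ts
        v += W
        if v >= V_TH:
            out.append(t); v = V_RESET
    return out

inputs = [5.0, 6.0, 7.0, 40.0, 41.0, 42.0]
print(clock_driven(inputs))
print(event_driven(inputs))
```

    For this simple model the event-driven scheme is exact, while the clock-driven spike times are only accurate to within the time step — precisely the precision issue the review examines for timing-dependent plasticity.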

    Computational convergence of the path integral for real dendritic morphologies

    Neurons are characterised by a morphological structure unique amongst biological cells, the core of which is the dendritic tree. The vast number of dendritic geometries, combined with heterogeneous properties of the cell membrane, continue to challenge scientists in predicting neuronal input-output relationships, even in the case of sub-threshold dendritic currents. The Green’s function obtained for a given dendritic geometry provides this functional relationship for passive or quasi-active dendrites, and can be constructed by a sum-over-trips approach based on a path integral formalism. In this paper, we introduce a number of efficient algorithms for realisation of the sum-over-trips framework and investigate the convergence of these algorithms on different dendritic geometries. We demonstrate that the convergence of the trip-sampling methods strongly depends on dendritic morphology as well as the biophysical properties of the cell membrane. For real morphologies, the number of trips required to guarantee a small convergence error can become very large and strongly affect computational efficiency. As an alternative, we introduce a highly efficient matrix method which can be applied to arbitrary branching structures.
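
    As a minimal, hedged illustration of the sum-over-trips construction, the Python sketch below handles the simplest possible geometry: a single unbranched cable with sealed ends, where every reflected trip carries coefficient +1 (the method of images). Dimensionless units, parameter values and the truncation level are illustrative assumptions; the paper's algorithms address full branched morphologies, where the trip coefficients are non-trivial.

```python
import numpy as np

D, TAU = 1.0, 1.0  # dimensionless diffusion coefficient and membrane time constant

def g_inf(x, t):
    # Green's function of the infinite passive cable
    return np.exp(-t / TAU) * np.exp(-x**2 / (4 * D * t)) / np.sqrt(4 * np.pi * D * t)

def g_trips(x, y, t, L=1.0, n_trips=50):
    # Sum-over-trips for one cable with sealed ends: the trips are the
    # reflections of the source at y, each with coefficient +1 (Neumann images)
    g = 0.0
    for n in range(-n_trips, n_trips + 1):
        g += g_inf(x - (2 * n * L + y), t)   # images of the source at +y
        g += g_inf(x - (2 * n * L - y), t)   # images of the source at -y
    return g

# Convergence of the truncated trip sum at one space-time point
x, y, t = 0.3, 0.7, 0.5
for n in (1, 2, 5, 50):
    print(n, g_trips(x, y, t, n_trips=n))
```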

    Statistical physics of neural systems with non-additive dendritic coupling

    How neurons process their inputs crucially determines the dynamics of biological and artificial neural networks. In such neural and neural-like systems, synaptic input is typically considered to be merely transmitted linearly or sublinearly by the dendritic compartments. Yet, single-neuron experiments report pronounced supralinear dendritic summation of sufficiently synchronous and spatially close-by inputs. Here, we provide a statistical physics approach to study the impact of such non-additive dendritic processing on single-neuron responses and the performance of associative memory tasks in artificial neural networks. First, we compute the effect of random input to a neuron incorporating nonlinear dendrites. This approach is independent of the details of the neuronal dynamics. Second, we use those results to study the impact of dendritic nonlinearities on the network dynamics in a paradigmatic model for associative memory, both numerically and analytically. We find that dendritic nonlinearities maintain network convergence and increase the robustness of memory performance against noise. Interestingly, an intermediate number of dendritic branches is optimal for memory functionality.
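
    To make "non-additive dendritic processing" concrete, here is a hedged Python sketch of a Hopfield-style associative memory in which each neuron's synapses are grouped onto dendritic branches, and each branch output passes through a supralinear nonlinearity before somatic summation. The form of the branch nonlinearity, its threshold and the network sizes are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P, B = 200, 10, 10            # neurons, stored patterns, branches per neuron
patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns) / N              # Hebbian couplings
np.fill_diagonal(W, 0.0)
branch_of = rng.integers(0, B, size=(N, N))  # which branch each synapse lands on

def branch_nl(h, theta=0.1, gain=2.0):
    # supralinear branch output: amplify input once it clears a local threshold
    return np.where(np.abs(h) > theta, gain * h, h)

def update(s):
    # per neuron: sum synaptic currents within each branch, apply the branch
    # nonlinearity, then sum branch outputs at the soma and threshold the result
    s_new = np.empty_like(s)
    for i in range(N):
        contrib = W[i] * s                   # synaptic currents onto neuron i
        h_branch = np.bincount(branch_of[i], weights=contrib, minlength=B)
        s_new[i] = np.sign(branch_nl(h_branch).sum()) or 1   # ties broken to +1
    return s_new

# Recall a stored pattern from a cue with 20% of the bits flipped
cue = patterns[0].copy()
cue[rng.choice(N, size=N // 5, replace=False)] *= -1
s = cue
for _ in range(10):
    s = update(s)
print("overlap with stored pattern:", (s @ patterns[0]) / N)
```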

    The effect of noise in models of spiny dendrites

    The dendritic tree provides the surface area for synaptic connections between the 100 billion neurons in the brain. 90% of excitatory synapses are made onto dendritic spines, which are constantly changing shape and strength. This adaptation is believed to be an important factor in learning, memory and computations within the dendritic tree. The environment in which the neuron sits is inherently noisy, due to the activity in nearby neurons and the stochastic nature of synaptic gating. Therefore, the effect of noise is an important aspect of any realistic model. This work provides a comprehensive study of two spiny-dendrite models driven by different forms of noise, in the spine dynamics or in the membrane voltage. We investigate the effect of the noise on signal propagation along the dendrite and how any correlation in the noise may affect this behaviour. We discover a difference in the results of the two models, which suggests that the form of spine connectivity is important. We also show that both models have the capacity to act as a robust filter and that a branched structure can perform logic computations.
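
    As a minimal sketch of the kind of noisy dendrite model studied here, the following Python code integrates a passive cable, discretized in space, with additive noise injected into the membrane voltage (an Euler-Maruyama scheme). The spine dynamics, the specific noise correlations and all parameter values of the paper's two models are beyond this sketch; everything below is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(1)
N, L = 50, 1.0                  # compartments, cable length (dimensionless units)
dx = L / N
D, TAU, SIGMA = 1.0, 1.0, 0.05  # diffusion coefficient, leak time, noise amplitude
dt = 0.2 * dx**2 / D            # stable explicit step for the diffusion term

v = np.zeros(N)
v[:5] = 1.0                     # initial depolarization at the proximal end

def step(v):
    # discrete Laplacian with sealed (no-flux) ends
    lap = np.empty_like(v)
    lap[1:-1] = v[2:] - 2 * v[1:-1] + v[:-2]
    lap[0] = v[1] - v[0]
    lap[-1] = v[-2] - v[-1]
    drift = D * lap / dx**2 - v / TAU
    # spatially uncorrelated voltage noise; the paper also studies correlated noise
    noise = SIGMA * np.sqrt(dt) * rng.standard_normal(v.shape)
    return v + dt * drift + noise

for _ in range(25000):          # integrate to t = 2 (dimensionless)
    v = step(v)
print("distal-end voltage:", v[-1])
```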

    Dynamics of spatially extended dendrites

    Dendrites are the most visually striking parts of neurons. Even so, many neuron models are of point type and have no representation of space. In this thesis we look at a range of neuronal models with the common property that we always include spatially extended dendrites. First we generalise Abbott’s “sum-over-trips” framework to include resonant currents. We also look at piece-wise linear (PWL) models and extend them to incorporate spatial structure in the form of dendrites. We look at the analytical construction of orbits for PWL models, and by using both analytical and numerical Lyapunov exponent methods we explore phase space, in particular mode-locked solutions. We then construct the phase response curve (PRC) for a PWL system with compartmentally modelled dendrites. This prepares us to look at the effect of multiple PWL systems that are weakly coupled through gap junctions. We also attach a continuous dendrite to a PWL soma and investigate how the position of the gap junction influences network properties. After this we present a short overview of neuronal plasticity, with a special focus on spatial effects. We also discuss attenuation of distal synaptic input and how this can be countered by dendritic democracy, as this becomes an integral part of our learning mechanisms. We examine a number of different learning approaches, including the tempotron and spike-time-dependent plasticity. Finally, we consider Poisson’s equation around a neural membrane. The membrane we focus on has Hodgkin-Huxley dynamics, so we can study action potential propagation on the membrane. We present the Green’s function for the case of a one-dimensional membrane in a two-dimensional space, which allows us to examine action potential initiation and propagation in a multi-dimensional axon.
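
    Since much of the thesis rests on piece-wise linear modelling, here is a small Python sketch of one classic PWL system, a McKean-type caricature of the FitzHugh-Nagumo model, integrated with forward Euler. Within each linear piece the flow is exactly linear, which is what makes the analytical orbit construction and Lyapunov exponent calculations mentioned above tractable; the parameter values are illustrative assumptions.

```python
A, B, GAMMA, EPS, I_EXT = 0.25, 1.0, 0.5, 0.1, 0.5  # illustrative parameters

def f(v):
    # McKean's piece-wise linear caricature of the cubic FitzHugh-Nagumo nullcline
    if v < A / 2:
        return -v
    elif v <= (1 + A) / 2:
        return v - A
    else:
        return 1.0 - v

def step(v, w, dt=0.01):
    # forward-Euler step; within each piece the dynamics are exactly linear
    dv = f(v) - w + I_EXT
    dw = EPS * (B * v - GAMMA * w)   # slow linear recovery variable
    return v + dt * dv, w + dt * dw

v, w, vs = 0.0, 0.0, []
for _ in range(20000):
    v, w = step(v, w)
    vs.append(v)
print("v range over the run:", min(vs), max(vs))  # sustained oscillation
```

    With these parameters the only fixed point lies on the unstable middle branch, so the trajectory settles onto a relaxation-type limit cycle, the kind of orbit the thesis constructs analytically piece by piece.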

    Asymmetric ephaptic inhibition between compartmentalized olfactory receptor neurons.

    In the Drosophila antenna, different subtypes of olfactory receptor neurons (ORNs) housed in the same sensory hair (sensillum) can inhibit each other non-synaptically. However, the mechanisms underlying this underexplored form of lateral inhibition remain unclear. Here we use recordings from pairs of sensilla impaled by the same tungsten electrode to demonstrate that direct electrical ("ephaptic") interactions mediate lateral inhibition between ORNs. Intriguingly, within individual sensilla, we find that ephaptic lateral inhibition is asymmetric, such that one ORN exerts greater influence onto its neighbor. Serial block-face scanning electron microscopy of genetically identified ORNs and circuit modeling indicate that asymmetric lateral inhibition reflects a surprisingly simple mechanism: the physically larger ORN in a pair corresponds to the dominant neuron in ephaptic interactions. Thus, morphometric differences between compartmentalized ORNs account for highly specialized inhibitory interactions that govern information processing at the earliest stages of olfactory coding.
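
    As a hedged sketch of the circuit-modeling logic, the following Python snippet treats the two ORNs of a sensillum as passive resistances sharing a common lymph node, driven through an access resistance by the transepithelial potential. Activating one neuron (modeled as a drop in its input resistance) pulls the shared node potential down and reduces the current through its neighbor; the physically larger neuron, having the lower input resistance, pulls harder. All values and names are illustrative assumptions, not the paper's fitted circuit.

```python
# Passive-circuit sketch: two ORNs share the sensillum lymph, modeled as a
# common node fed through an access resistance R_S by the transepithelial
# potential E; each ORN is a resistance from the node to ground.
E = 40.0                        # mV, transepithelial potential (illustrative)
R_S = 50.0                      # MOhm, shared lymph/epithelium resistance
R_BIG, R_SMALL = 100.0, 300.0   # MOhm: larger neuron -> lower input resistance

def currents(r_a, r_b):
    # lymph-node voltage from the resistive divider, then per-neuron current
    r_par = 1.0 / (1.0 / r_a + 1.0 / r_b)
    v = E * r_par / (R_S + r_par)
    return v / r_a, v / r_b

def activate(r):
    return r / 4.0   # odor response modeled as a drop in input resistance

# Baseline, then activate each neuron in turn and see what its neighbor loses
i_big0, i_small0 = currents(R_BIG, R_SMALL)
_, i_small1 = currents(activate(R_BIG), R_SMALL)   # big neuron active
i_big1, _ = currents(R_BIG, activate(R_SMALL))     # small neuron active

print("small neuron suppressed by big:", 1 - i_small1 / i_small0)
print("big neuron suppressed by small:", 1 - i_big1 / i_big0)
```

    Running this gives roughly 47% suppression of the small neuron by the big one, but only about 23% in the reverse direction: the asymmetry falls out of nothing more than the morphometric difference in input resistance, in line with the abstract's conclusion.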