2 research outputs found

    Parallel scalable simulations of biological neural networks using TensorFlow: A beginner's guide

    Biological neural networks are often modeled as systems of coupled, nonlinear, ordinary or partial differential equations. The number of differential equations used to model a network increases with the size of the network and the level of detail used to model individual neurons and synapses. As one scales up the size of the simulation, it becomes essential to utilize powerful computing platforms. While many tools exist that solve these equations numerically, they are often platform-specific. Further, there is a high barrier of entry to developing flexible, platform-independent, general-purpose code that supports hardware acceleration on modern computing architectures such as GPUs/TPUs and distributed platforms. TensorFlow is a Python-based open-source package designed for machine learning algorithms. However, it is also a scalable environment for a variety of computations, including solving differential equations using iterative algorithms such as Runge-Kutta methods. In this article and the accompanying tutorials, we present a simple exposition of numerical methods to solve ordinary differential equations using Python and TensorFlow. The tutorials consist of a series of Python notebooks that, over the course of five sessions, will lead novice programmers from writing programs to integrate simple one-dimensional ordinary differential equations using Python to solving a large system (thousands of differential equations) of coupled conductance-based neurons using a highly parallelized and scalable framework. Embedded in the tutorial is a physiologically realistic implementation of a network in the insect olfactory system. This system, consisting of multiple neuron and synapse types, can serve as a template to simulate other networks.
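    The flavor of the tutorials can be conveyed with a minimal sketch of one classical Runge-Kutta (RK4) step written with TensorFlow tensors. The test equation dy/dt = −y with y(0) = 1 (exact solution e^−t) is an illustrative choice for this sketch, not a problem taken from the tutorial notebooks themselves.

    ```python
    # Minimal RK4 integrator sketch using TensorFlow tensors.
    # Test problem (illustrative): dy/dt = -y, y(0) = 1, exact solution exp(-t).
    import tensorflow as tf

    def rk4_step(f, y, t, dt):
        """Advance state y from t to t + dt with one classical RK4 step."""
        k1 = f(t, y)
        k2 = f(t + dt / 2.0, y + dt * k1 / 2.0)
        k3 = f(t + dt / 2.0, y + dt * k2 / 2.0)
        k4 = f(t + dt, y + dt * k3)
        return y + dt * (k1 + 2.0 * k2 + 2.0 * k3 + k4) / 6.0

    def f(t, y):
        return -y  # right-hand side of dy/dt = -y

    y = tf.constant(1.0)
    dt = 0.01
    for i in range(100):          # integrate from t = 0 to t = 1
        y = rk4_step(f, y, i * dt, dt)
    ```

    Because the state `y` can just as well be a tensor holding thousands of membrane potentials, the same loop structure scales from one equation to a large coupled network, which is the point the tutorials build toward.
    
    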

    Invariant neural representations of fluctuating odor inputs

    Steady odor streams are typically encoded as robust spatiotemporal spike trains by olfactory networks. This suggests a one-to-one mapping between the stimulus (an odor mixed in a steady stream of air) and its representation (a spatiotemporal pattern of spikes in a population of neurons) in the brain. Such a one-to-one mapping between an odor and a spatiotemporal pattern is unlikely to be accurate since natural odor stimuli change unpredictably over time. Odors arrive riding upon chaotically pulsed plumes of air and show unpredictable variations in concentration and in the composition of odorant molecules. These temporal changes often vary over time scales that are similar to the time scales of neural events thought to play a role in odor recognition. In the absence of such temporal variations, animals are known to inject intermittency while sampling the odor, suggesting that intermittent inputs might be a ‘feature’, not a ‘bug’. Here, we attempt to find the neural invariants of stable olfactory percepts using a computational model of the locust antennal lobe, the insect equivalent of the olfactory bulb in mammals. We show that when time-varying odor inputs intermittently perturb subsets of neurons in the antennal lobe network, the activity of the network reverberates in a manner that depends on both the nature of the inputs it receives and the structure of the neuronal sub-network that these inputs stimulate. We demonstrate that it is possible to decipher the structure of the perturbed sub-network by examining transient synchrony in the activity of the neurons. The ability to reconstruct the sub-network structure is vastly improved when odor inputs arrive or are sampled in an intermittent manner. Thus, the structure of the stimulated sub-network itself serves as a unique invariant code that represents the odor. 
Recent studies have shown that the response of individual projection neurons in the antennal lobe to a particular odor can be approximated using an odor-specific response kernel convolved with the temporal profile of the odor input. The parameters defining this kernel remain invariant to temporal changes in the input profile. Our simulations show that this invariance is inherited from the network structure.
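The kernel description above can be illustrated with a short numerical sketch: the predicted projection-neuron response is the convolution of a fixed, odor-specific kernel with the stimulus time course. The exponential kernel shape, its time constant, and the intermittent square-pulse input are illustrative assumptions for this sketch, not parameters from the model in the paper.

```python
# Sketch: projection-neuron response as kernel (*) stimulus.
# Kernel shape and stimulus profile are hypothetical placeholders.
import numpy as np

dt = 0.001                       # time step (s)
t = np.arange(0.0, 1.0, dt)      # 1 s of simulated time

# Hypothetical odor-specific response kernel: decaying exponential.
tau = 0.05                       # kernel time constant (s)
kernel = np.exp(-t / tau)

# Time-varying odor input: two brief intermittent pulses.
stimulus = np.zeros_like(t)
stimulus[100:200] = 1.0          # pulse from 0.1 s to 0.2 s
stimulus[500:600] = 1.0          # pulse from 0.5 s to 0.6 s

# Predicted response: same kernel, whatever the input's temporal profile.
response = np.convolve(stimulus, kernel)[: len(t)] * dt
```

Changing the pulse timing changes `response` but not `kernel`, which is the invariance the abstract attributes to the network structure.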