GeNN: a code generation framework for accelerated brain simulations
Large-scale numerical simulations of detailed brain circuit models are important for developing hypotheses about brain function and testing their consistency and plausibility. An ongoing challenge for simulating realistic models, however, is computational speed. In this paper, we present the GeNN (GPU-enhanced Neuronal Networks) framework, which aims to facilitate the use of graphics accelerators for computational models of large-scale neuronal networks to address this challenge. GeNN is an open-source library that generates code to accelerate the execution of network simulations on NVIDIA GPUs through a flexible and extensible interface that does not require in-depth technical knowledge from its users. We present performance benchmarks showing that a 200-fold speedup over a single CPU core can be achieved for a network of one million conductance-based Hodgkin-Huxley neurons, although the speedup differs for other models.
GeNN is available for Linux, Mac OS X and Windows platforms. The source code, user manual, tutorials,
Wiki, in-depth example projects and all other related information can be found on the project website http://genn-team.github.io/genn/
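To illustrate the kind of model GeNN accelerates, here is a minimal pure-Python forward-Euler integration of a single Hodgkin-Huxley neuron with the standard squid-axon parameters. This is our own sketch of the underlying model, not GeNN's API (GeNN itself generates CUDA/C++ code from user-supplied model definitions):

```python
import math

def simulate_hh(i_ext=10.0, t_max=50.0, dt=0.01):
    """Forward-Euler integration of one Hodgkin-Huxley neuron.
    Standard squid-axon parameters; i_ext in uA/cm^2, times in ms.
    Returns the spike count (upward crossings of 0 mV)."""
    c_m = 1.0                       # membrane capacitance, uF/cm^2
    g_na, e_na = 120.0, 50.0        # sodium conductance (mS/cm^2) and reversal (mV)
    g_k, e_k = 36.0, -77.0          # potassium
    g_l, e_l = 0.3, -54.387         # leak
    v, m, h, n = -65.0, 0.05, 0.6, 0.32   # resting state
    spikes, above = 0, False
    for _ in range(int(t_max / dt)):
        # Voltage-dependent opening/closing rates of the gating variables.
        am = 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
        bm = 4.0 * math.exp(-(v + 65.0) / 18.0)
        ah = 0.07 * math.exp(-(v + 65.0) / 20.0)
        bh = 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
        an = 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
        bn = 0.125 * math.exp(-(v + 65.0) / 80.0)
        i_ion = (g_na * m**3 * h * (v - e_na)
                 + g_k * n**4 * (v - e_k)
                 + g_l * (v - e_l))
        v += dt * (i_ext - i_ion) / c_m
        m += dt * (am * (1.0 - m) - bm * m)
        h += dt * (ah * (1.0 - h) - bh * h)
        n += dt * (an * (1.0 - n) - bn * n)
        if v > 0.0 and not above:   # count threshold crossings as spikes
            spikes += 1
        above = v > 0.0
    return spikes
```

A GPU framework like GeNN parallelises exactly this kind of per-neuron state update across many thousands of neurons, which is where the reported speedups come from.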
Simulating quantum statistics with entangled photons: a continuous transition from bosons to fermions
In contrast to classical physics, quantum mechanics divides particles into two classes, bosons and fermions, whose exchange statistics dictate the dynamics of systems at a fundamental level. In two dimensions, quasi-particles known as 'anyons' exhibit fractional exchange statistics intermediate between these two classes. The ability to simulate and observe behaviour associated with fundamentally different quantum particles is important for simulating complex quantum systems. Here we use the symmetry and quantum correlations of entangled photons subjected to multiple copies of a quantum process to directly simulate quantum interference of fermions, bosons, and a continuum of fractional behaviour exhibited by anyons. We observe an average similarity of 93.6 ± 0.2% between an ideal model and experimental observation. The approach generalises to an arbitrary number of particles and is independent of the statistics of the particles used, indicating applicability to other quantum systems and at large scale.
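The boson-to-fermion continuum can be made concrete in the simplest two-particle setting: for an exchange phase φ, the coincidence probability behind a balanced beamsplitter interpolates between bosonic bunching (φ = 0) and fermionic anti-bunching (φ = π). The sketch below is our own illustration of this textbook relation, not the paper's analysis:

```python
import math

def coincidence_probability(phi: float) -> float:
    """Two-particle coincidence probability at a 50:50 beamsplitter for
    exchange phase phi. The two coincidence amplitudes are t^2 and
    e^{i phi} r^2 with t = 1/sqrt(2), r = i/sqrt(2), giving
    |1/2 - e^{i phi}/2|^2 = (1 - cos(phi)) / 2."""
    return (1.0 - math.cos(phi)) / 2.0

# Bosons bunch (P = 0), fermions anti-bunch (P = 1), anyons lie in between.
for phi, label in [(0.0, "bosons"), (math.pi / 2, "anyons"), (math.pi, "fermions")]:
    print(f"{label:8s} phi={phi:.2f}  P_coincidence={coincidence_probability(phi):.2f}")
```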
The Eyes Have It: Sex and Sexual Orientation Differences in Pupil Dilation Patterns
Recent research suggests profound sex and sexual orientation differences in sexual response. These results, however, are based on measures of genital arousal, which have potential limitations such as volunteer bias and differential measures for the sexes. The present study introduces a measure less affected by these limitations. We assessed the pupil dilation of 325 men and women of various sexual orientations to male and female erotic stimuli. Results supported hypotheses. In general, self-reported sexual orientation corresponded with pupil dilation to men and women. Among men, substantial dilation to both sexes was most common in bisexual-identified men. In contrast, among women, substantial dilation to both sexes was most common in heterosexual-identified women. Possible reasons for these differences are discussed. Because the measure of pupil dilation is less invasive than previous measures of sexual response, it allows for studying diverse age and cultural populations that are usually not included in sexuality research.
Genetic and Environmental Influences on Female Sexual Orientation, Childhood Gender Typicality and Adult Gender Identity
Background: Human sexual orientation is influenced by genetic and non-shared environmental factors, as are two important psychological correlates: childhood gender typicality (CGT) and adult gender identity (AGI). However, researchers have been unable to resolve the genetic and non-genetic components that contribute to the covariation between these traits, particularly in women. Methodology/Principal Findings: Here we performed a multivariate genetic analysis in a large sample of British female twins (N = 4,426) who completed a questionnaire assessing sexual attraction, CGT and AGI. Univariate genetic models indicated modest genetic influences on sexual attraction (25%), AGI (11%) and CGT (31%). For the multivariate analyses, a common pathway model best fitted the data. Conclusions/Significance: This indicated that a single latent variable, influenced by a genetic component and a common non-shared environmental component, explained the association between the three traits, but there was substantial measurement error. These findings highlight common developmental factors affecting differences in sexual orientation.
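As a back-of-envelope illustration of how twin data yield such variance estimates, the classical Falconer formulas decompose trait variance from MZ and DZ twin correlations. The study itself fitted full structural equation models (a common pathway model), so the sketch below, with hypothetical correlations chosen to reproduce the 25% figure for sexual attraction, is only a simplified stand-in:

```python
def falconer_ace(r_mz: float, r_dz: float) -> dict:
    """Classical twin-study variance decomposition (Falconer's formulas):
    A = additive genetic, C = shared environment,
    E = non-shared environment plus measurement error."""
    a2 = 2.0 * (r_mz - r_dz)   # heritability estimate
    c2 = r_mz - a2             # equivalently 2*r_dz - r_mz
    e2 = 1.0 - r_mz            # whatever MZ twins do not share
    return {"A": a2, "C": c2, "E": e2}

# Hypothetical twin correlations (NOT the study's actual values),
# chosen so that A matches the reported 25% for sexual attraction.
est = falconer_ace(r_mz=0.30, r_dz=0.175)
```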
Intrinsic gain modulation and adaptive neural coding
In many cases, the computation of a neural system can be reduced to a receptive field, or a set of linear filters, and a thresholding function, or gain curve, which determines the firing probability; this is known as a linear/nonlinear model. In some forms of sensory adaptation, these linear filters and gain curve adjust very rapidly to changes in the variance of a randomly varying driving input. An apparently similar but previously unrelated issue is the observation of gain control by background noise in cortical neurons: the slope of the firing rate vs. current (f-I) curve changes with the variance of background random input. Here, we show a direct correspondence between these two observations by relating variance-dependent changes in the gain of f-I curves to characteristics of the changing empirical linear/nonlinear model obtained by sampling. In the case that the underlying system is fixed, we derive expressions relating the change of the gain, with respect to both mean and variance, to the receptive fields obtained from reverse correlation on a white noise stimulus. Using two conductance-based model neurons that display distinct gain modulation properties through a simple change in parameters, we show that the coding properties of both models quantitatively satisfy the predicted relationships. Our results describe how both variance-dependent gain modulation and adaptive neural computation result from intrinsic nonlinearity.
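The reverse-correlation procedure the abstract refers to can be sketched compactly: drive a synthetic linear/nonlinear neuron with Gaussian white noise, and recover its linear filter from the spike-triggered average (STA). The filter shape and gain-curve parameters below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# True linear filter of a hypothetical LN neuron (unit-norm biphasic kernel).
L = 20
t = np.arange(L)
f_true = np.exp(-t / 4.0) * np.sin(t / 2.0)
f_true /= np.linalg.norm(f_true)

# White-noise stimulus through the LN cascade: g[t] = sum_k f[k] * s[t-k],
# with a logistic gain curve setting the per-bin spike probability.
N = 100_000
s = rng.standard_normal(N)
g = np.convolve(s, f_true)[:N]
p = 1.0 / (1.0 + np.exp(-(g - 1.5) / 0.5))
spikes = np.flatnonzero(rng.random(N) < p)
spikes = spikes[spikes >= L]   # need a full stimulus window before each spike

# For Gaussian white noise, the STA recovers the filter up to a scale factor.
sta = np.array([s[spikes - k].mean() for k in range(L)])
sta /= np.linalg.norm(sta)

similarity = float(np.dot(sta, f_true))   # cosine similarity with true filter
```

Sampling the empirical nonlinearity (firing probability as a function of the filtered stimulus) alongside the STA gives exactly the "changing empirical linear/nonlinear model" the paper relates to f-I gain changes.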
A Comprehensive Workflow for General-Purpose Neural Modeling with Highly Configurable Neuromorphic Hardware Systems
In this paper we present a methodological framework that meets novel requirements emerging from upcoming types of accelerated and highly configurable neuromorphic hardware systems. We describe in detail a device with 45 million programmable and dynamic synapses that is currently under development, and we sketch the conceptual challenges that arise from bringing this platform into operation. More specifically, we aim to establish this neuromorphic system as a flexible and neuroscientifically valuable modeling tool that can be used by non-hardware-experts. We consider various functional aspects to be crucial for this purpose, and we introduce a consistent workflow with detailed descriptions of all involved modules that implement the suggested steps: the integration of the hardware interface into the simulator-independent model description language PyNN; a fully automated translation between the PyNN domain and appropriate hardware configurations; an executable specification of the future neuromorphic system that can be seamlessly integrated into this biology-to-hardware mapping process as a test bench for all software layers and possible hardware design modifications; and an evaluation scheme that deploys models from a dedicated benchmark library, compares the results generated by virtual or prototype hardware devices with reference software simulations, and analyzes the differences. The integration of these components into one hardware-software workflow provides an ecosystem for ongoing preparative studies that support the hardware design process, and lays the basis for maturing the model-to-hardware mapping software. The functionality and flexibility of the latter are demonstrated with a variety of experimental results.
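One concrete step in such a biology-to-hardware mapping is translating continuous model-level parameters into the discrete, range-limited codes a hardware DAC can realize. The sketch below is hypothetical (parameter, range, and bit-width values invented for illustration; the actual translation layer described in the paper is considerably more involved):

```python
def map_to_hardware(value: float, lo: float, hi: float, bits: int = 10) -> int:
    """Clip a biological parameter to the hardware range [lo, hi] and
    quantize it onto a `bits`-bit DAC code."""
    clipped = min(max(value, lo), hi)
    levels = (1 << bits) - 1
    return round((clipped - lo) / (hi - lo) * levels)

def hardware_to_model(code: int, lo: float, hi: float, bits: int = 10) -> float:
    """Inverse mapping: DAC code back to the model-domain value,
    showing the quantization error the workflow's evaluation scheme must track."""
    levels = (1 << bits) - 1
    return lo + code / levels * (hi - lo)

# Example: a leak conductance of 12.3 nS mapped onto a hypothetical 0-100 nS range.
code = map_to_hardware(12.3, 0.0, 100.0)
realized = hardware_to_model(code, 0.0, 100.0)
```

Comparing `realized` against the requested value, across a benchmark library of models, is the kind of check the evaluation scheme in the workflow automates.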
NeuroML: A Language for Describing Data Driven Models of Neurons and Networks with a High Degree of Biological Detail
Biologically detailed single neuron and network models are important for understanding how ion channels, synapses and anatomical connectivity underlie the complex electrical behavior of the brain. While neuronal simulators such as NEURON, GENESIS, MOOSE, NEST, and PSICS facilitate the development of these data-driven neuronal models, the specialized languages they employ are generally not interoperable, limiting model accessibility and preventing reuse of model components and cross-simulator validation. To overcome these problems we have used an Open Source software approach to develop NeuroML, a neuronal model description language based on XML (Extensible Markup Language). This enables these detailed models and their components to be defined in a standalone form, allowing them to be used across multiple simulators and archived in a standardized format. Here we describe the structure of NeuroML and demonstrate its scope by converting into NeuroML models of a number of different voltage- and ligand-gated conductances, models of electrical coupling, synaptic transmission and short-term plasticity, together with morphologically detailed models of individual neurons. We have also used these NeuroML-based components to develop a highly detailed cortical network model. NeuroML-based model descriptions were validated by demonstrating similar model behavior across five independently developed simulators. Although our results confirm that simulations run on different simulators converge, they reveal limits to model interoperability, by showing that for some models convergence only occurs at high levels of spatial and temporal discretisation, when the computational overhead is high. Our development of NeuroML as a common description language for biophysically detailed neuronal and network models enables interoperability across multiple simulation environments, thereby improving model transparency, accessibility and reuse in computational neuroscience.
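A NeuroML2-style channel description looks roughly like the fragment below (the element and attribute names follow the published NeuroML2 conventions, but treat the details as illustrative rather than a validated model file); because NeuroML is plain XML, Python's standard library is enough to read it:

```python
import xml.etree.ElementTree as ET

# Minimal NeuroML2-style fragment: a Hodgkin-Huxley-type sodium channel with
# three activation (m) gates and one inactivation (h) gate per channel.
DOC = """<neuroml xmlns="http://www.neuroml.org/schema/neuroml2">
  <ionChannelHH id="na" species="na" conductance="10pS">
    <gateHHrates id="m" instances="3"/>
    <gateHHrates id="h" instances="1"/>
  </ionChannelHH>
</neuroml>"""

NS = {"nml": "http://www.neuroml.org/schema/neuroml2"}
root = ET.fromstring(DOC)
channel = root.find("nml:ionChannelHH", NS)
# Map each gating variable to its exponent (e.g. m^3 h in the HH current).
gates = {g.get("id"): int(g.get("instances"))
         for g in channel.findall("nml:gateHHrates", NS)}
```

It is this simulator-neutral, machine-readable form that lets the same channel definition be loaded by NEURON, NEST, and the other simulators the paper validates against.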
Role of mid-gap states in charge transport and photoconductivity in semiconductor nanocrystal films
Colloidal semiconductor nanocrystals have attracted significant interest for applications in solution-processable devices such as light-emitting diodes and solar cells. However, a poor understanding of charge transport in nanocrystal assemblies, specifically the relation between electrical conductance in the dark and under light illumination, hinders their technological applicability. Here we simultaneously address the issues of 'dark' transport and photoconductivity in films of PbS nanocrystals, by incorporating them into optical field-effect transistors in which the channel conductance is controlled by both gate voltage and incident radiation. Spectrally resolved photoresponses of these devices reveal a weakly conductive mid-gap band that is responsible for charge transport in the dark. The mechanism for conductance, however, changes under illumination, when it becomes dominated by band-edge quantized states. In this case, the mid-gap band still has an important role, as its occupancy (tuned by the gate voltage) controls the dynamics of band-edge charges.
The what and where of adding channel noise to the Hodgkin-Huxley equations
One of the most celebrated successes in computational biology is the Hodgkin-Huxley framework for modeling electrically active cells. This framework, expressed through a set of differential equations, synthesizes the impact of ionic currents on a cell's voltage, and the highly nonlinear impact of that voltage back on the currents themselves, into the rapid push and pull of the action potential. Later studies confirmed that these cellular dynamics are orchestrated by individual ion channels, whose conformational changes regulate the conductance of each ionic current. Thus, kinetic equations familiar from physical chemistry are the natural setting for describing conductances; for small-to-moderate numbers of channels, these predict fluctuations in conductances and stochasticity in the resulting action potentials. At first glance, the kinetic equations provide a far more complex (and higher-dimensional) description than the original Hodgkin-Huxley equations. This has prompted more than a decade of efforts to capture channel fluctuations with noise terms added to the Hodgkin-Huxley equations. Many of these approaches, while intuitively appealing, produce quantitative errors when compared to kinetic equations; others, as only very recently demonstrated, are both accurate and relatively simple. We review what works, what doesn't, and why, seeking to build a bridge to well-established results for the deterministic Hodgkin-Huxley equations. As such, we hope that this review will speed emerging studies of how channel noise modulates electrophysiological dynamics and function. We supply user-friendly Matlab simulation code for these stochastic versions of the Hodgkin-Huxley equations on the ModelDB website (accession number 138950) and at http://www.amath.washington.edu/~etsb/tutorials.html.
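To make the channel-noise idea concrete, here is a minimal sketch (our own, not the review's Matlab code on ModelDB) of a population of two-state channels at fixed voltage: each closed channel opens with rate α and each open channel closes with rate β, so the open fraction fluctuates around α/(α+β), with fluctuations that shrink as the channel count grows:

```python
import numpy as np

def simulate_two_state_channels(n_channels=1000, alpha=0.1, beta=0.2,
                                dt=0.01, t_max=300.0, seed=0):
    """Binomial (tau-leaping) update of n two-state ion channels at a fixed
    voltage. Rates alpha/beta are per ms; returns the open-fraction trace."""
    rng = np.random.default_rng(seed)
    steps = int(t_max / dt)
    n_open = 0
    trace = np.empty(steps)
    for i in range(steps):
        # Each closed channel opens with prob alpha*dt, each open one
        # closes with prob beta*dt; binomial draws give the channel noise.
        opened = rng.binomial(n_channels - n_open, alpha * dt)
        closed = rng.binomial(n_open, beta * dt)
        n_open += opened - closed
        trace[i] = n_open / n_channels
    return trace

trace = simulate_two_state_channels()
steady = trace[len(trace) // 2:]   # discard the relaxation transient
# Mean open fraction approaches alpha / (alpha + beta) = 1/3; the
# fluctuations around it are the channel noise the review discusses.
```

Coupling such binomial channel updates to the voltage equation, or approximating them with additive noise terms, is precisely the modeling choice whose accuracy the review evaluates.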
Central synapses release a resource-efficient amount of glutamate
Why synapses release a certain amount of neurotransmitter is poorly understood. We combined patch-clamp electrophysiology with computer simulations to estimate how much glutamate is discharged at two distinct central synapses of the rat. We found that, despite some uncertainty about the synaptic microenvironment, synapses generate the maximal current per released glutamate molecule while maximizing the information content of the signal. Our result suggests that synapses operate on a principle of resource optimization.