Dopamine-modulated dynamic cell assemblies generated by the GABAergic striatal microcircuit
The striatum, the principal input structure of the basal ganglia, is crucial to both motor control and learning. It receives convergent input from all over the neocortex, hippocampal formation, amygdala and thalamus, and is the primary recipient of dopamine in the brain. Within the striatum is a GABAergic microcircuit that acts upon these inputs, formed by the dominant medium-spiny projection neurons (MSNs) and fast-spiking interneurons (FSIs). Progress in understanding the computations this microcircuit performs has been hampered by its non-laminar structure, which prevents identification of a repeating canonical microcircuit. Here we begin the identification of potential dynamically-defined computational elements within the striatum. We construct a new three-dimensional model of the striatal microcircuit's connectivity, and instantiate this with our dopamine-modulated neuron models of the MSNs and FSIs. A new model of gap junctions between the FSIs is introduced and tuned to experimental data. We introduce a novel multiple spike-train analysis method, and apply this to the outputs of the model to find groups of synchronised neurons at multiple time-scales. We find that, with realistic in vivo background input, small assemblies of synchronised MSNs spontaneously appear, consistent with experimental observations, and that the number of assemblies and the time-scale of synchronisation are strongly dependent on the simulated concentration of dopamine. We also show that feed-forward inhibition from the FSIs counter-intuitively increases the firing rate of the MSNs. Such small cell assemblies forming spontaneously only in the absence of dopamine may contribute to the motor control problems seen in humans and animals following a loss of dopamine cells. (C) 2009 Elsevier Ltd. All rights reserved.
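The abstract does not detail its multiple spike-train analysis method. Purely as an illustration of the underlying idea, finding groups of neurons synchronised at a chosen time-scale, here is a minimal sketch: bin each spike train at that time-scale, correlate the binned trains pairwise, and greedily group neurons whose correlation exceeds a threshold. The binning scheme, threshold, and greedy grouping are assumptions for illustration, not the paper's method.

```python
import numpy as np

def bin_spikes(spike_times, t_max, bin_width):
    """Convert a list of spike-time arrays into a (neurons x bins) count matrix."""
    n_bins = int(np.ceil(t_max / bin_width))
    counts = np.zeros((len(spike_times), n_bins))
    for i, times in enumerate(spike_times):
        idx = np.minimum((np.asarray(times) / bin_width).astype(int), n_bins - 1)
        np.add.at(counts[i], idx, 1)
    return counts

def synchronised_groups(spike_times, t_max, bin_width, threshold=0.5):
    """Greedily group neurons whose binned spike trains correlate above threshold.

    The bin width sets the time-scale at which synchrony is assessed.
    """
    counts = bin_spikes(spike_times, t_max, bin_width)
    corr = np.corrcoef(counts)
    unassigned = set(range(len(spike_times)))
    groups = []
    while unassigned:
        seed = unassigned.pop()
        group = {seed}
        for j in list(unassigned):
            if corr[seed, j] > threshold:
                group.add(j)
                unassigned.remove(j)
        groups.append(sorted(group))
    return groups
```

Sweeping `bin_width` over several values would give a crude multi-time-scale view: assemblies that appear only at coarse bins are loosely synchronised, while those surviving fine bins are tightly locked.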
Decoupling of brain function from structure reveals regional behavioral specialization in humans
The brain is an assembly of neuronal populations interconnected by structural
pathways. Brain activity is expressed on and constrained by this substrate.
Therefore, statistical dependencies between functional signals in directly
connected areas can be expected to be higher. However, the degree to which brain
function is bound by the underlying wiring diagram remains a complex question
that has been only partially answered. Here, we introduce the
structural-decoupling index to quantify the coupling strength between structure
and function, and we reveal a macroscale gradient from brain regions more
strongly coupled, to regions more strongly decoupled, than expected by
realistic surrogate data. This gradient spans behavioral domains from
lower-level sensory function to high-level cognitive ones and shows for the
first time that the strength of structure-function coupling is spatially
varying in line with evidence derived from other modalities, such as functional
connectivity, gene expression, microstructural properties and temporal
hierarchy.
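The abstract leaves the structural-decoupling index undefined here. One common way to build such an index is to project the functional signals onto the eigenmodes of the structural connectome's graph Laplacian and compare high- versus low-graph-frequency energy per region. The sketch below follows that idea; the split point `n_low` and the energy-ratio form are illustrative assumptions, not necessarily the paper's exact definition.

```python
import numpy as np

def structural_decoupling_index(A, X, n_low):
    """Sketch of a structure-function decoupling index.

    A     : (n, n) structural connectivity matrix (symmetric, non-negative)
    X     : (n, t) functional signals, one time series per region
    n_low : number of low-frequency graph eigenmodes treated as 'coupled'
    """
    # Graph Laplacian of the structural connectome
    L = np.diag(A.sum(axis=1)) - A
    # Eigenmodes ordered by graph frequency (ascending eigenvalue)
    w, V = np.linalg.eigh(L)
    # Graph Fourier transform of the functional signals
    Xhat = V.T @ X
    # Reconstruct coupled (low-frequency) and decoupled (high-frequency) parts
    X_coupled = V[:, :n_low] @ Xhat[:n_low]
    X_decoupled = V[:, n_low:] @ Xhat[n_low:]
    # Per-region index: energy of the decoupled part relative to the coupled part
    return np.linalg.norm(X_decoupled, axis=1) / np.linalg.norm(X_coupled, axis=1)
```

A signal that varies smoothly along the structural graph yields an index near zero (strong coupling); a signal uncorrelated with the wiring yields a large index (decoupling).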
Balanced Quantization: An Effective and Efficient Approach to Quantized Neural Networks
Quantized Neural Networks (QNNs), which use low bitwidth numbers for
representing parameters and performing computations, have been proposed to
reduce the computation complexity, storage size and memory usage. In QNNs,
parameters and activations are uniformly quantized, such that the
multiplications and additions can be accelerated by bitwise operations.
However, distributions of parameters in Neural Networks are often imbalanced,
such that the uniform quantization determined from extremal values may under
utilize available bitwidth. In this paper, we propose a novel quantization
method that can ensure the balance of distributions of quantized values. Our
method first recursively partitions the parameters by percentiles into balanced
bins, and then applies uniform quantization. We also introduce computationally
cheaper approximations of percentiles to reduce the computation overhead
introduced. Overall, our method improves the prediction accuracies of QNNs
without introducing extra computation during inference, has negligible impact
on training speed, and is applicable to both Convolutional Neural Networks and
Recurrent Neural Networks. Experiments on standard datasets including ImageNet
and Penn Treebank confirm the effectiveness of our method. On ImageNet, the
top-5 error rate of our 4-bit quantized GoogLeNet model is 12.7%, which is
superior to the state of the art for QNNs.
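The balancing step described above can be sketched as follows: percentile boundaries split the parameter distribution into equally populated bins, and each bin is mapped to one quantized level, so every level is used equally often. The paper's recursive median partition produces such equally populated bins; taking evenly spaced percentile boundaries directly, and using each bin's median as its representative value, are simplifying assumptions for illustration.

```python
import numpy as np

def balanced_quantize(x, bitwidth):
    """Sketch of percentile-balanced quantization.

    Partitions the values of x into 2**bitwidth equally populated bins
    via percentiles, then represents each bin by the median of its
    members, so all quantized levels carry a similar share of the data.
    """
    n_levels = 2 ** bitwidth
    flat = x.ravel()
    # Percentile boundaries split the data into equally populated bins
    edges = np.percentile(flat, np.linspace(0, 100, n_levels + 1))
    # Assign each value to a bin (clip so the maximum falls in the last bin)
    bins = np.clip(np.searchsorted(edges, flat, side='right') - 1, 0, n_levels - 1)
    # Representative value per bin: the median of its members
    reps = np.array([np.median(flat[bins == b]) if np.any(bins == b) else 0.0
                     for b in range(n_levels)])
    return reps[bins].reshape(x.shape), bins.reshape(x.shape)
```

Contrast with uniform quantization from extremal values: a heavy-tailed weight distribution would then crowd most weights into a few central levels, wasting bitwidth, whereas the percentile partition spreads them evenly.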
Graph Convolutional Networks (GCNs) for Molecular Property Prediction in Drug Development
Molecular property prediction is key to drug development. The rise of deep learning techniques provides new possibilities for learning molecular properties directly from chemical data. In particular, graph convolutional networks have been introduced into the field and have achieved significant improvements over traditional methods. The first part of this paper serves as a study exploring and evaluating this emerging method, while the second part demonstrates that graph convolutional networks can be further improved by incorporating an attention mechanism, another influential deep learning idea.
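The abstract does not specify the architecture, so as an illustration of the general approach, here is a minimal Kipf-Welling-style graph convolution applied to a molecular graph (an adjacency matrix over atoms plus per-atom feature vectors), followed by a mean-pooling readout. The layer form, the `predict_property` readout, and all names are assumptions for illustration, not the paper's model.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph convolution: H' = ReLU(D^-1/2 A_hat D^-1/2 H W),
    where A_hat = A + I adds self-loops and D is A_hat's degree matrix."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)

def predict_property(A, H, weights):
    """Stack GCN layers, then mean-pool atom features into one
    graph-level vector usable as a molecular property score."""
    for W in weights:
        H = gcn_layer(A, H, W)
    return H.mean(axis=0)  # graph-level readout over atoms
```

Each layer mixes every atom's features with those of its bonded neighbours, so stacking k layers lets information propagate k bonds away, which is why depth matters for properties driven by larger substructures.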
Activity Recognition and Prediction in Real Homes
In this paper, we present work in progress on activity recognition and
prediction in real homes using either binary sensor data or depth video data.
We present our field trial and set-up for collecting and storing the data, our
methods, and our current results. We compare the accuracy of predicting the
next binary sensor event using probabilistic methods and Long Short-Term Memory
(LSTM) networks, include the time information to improve prediction accuracy,
as well as predict both the next sensor event and its mean time of occurrence
using one LSTM model. We investigate transfer learning between apartments and
show that it is possible to pre-train the model with data from other apartments
and achieve good accuracy in a new apartment straight away. In addition, we
present preliminary results from activity recognition using low-resolution
depth video data from seven apartments, and classify four activities - no
movement, standing up, sitting down, and TV interaction - by using a relatively
simple processing method where we apply an Infinite Impulse Response (IIR)
filter to extract movements from the frames prior to feeding them to a
convolutional LSTM network for the classification.
Comment: 12 pages, Symposium of the Norwegian AI Society NAIS 201
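As an illustration of the movement-extraction idea, the sketch below uses a first-order IIR filter as a slowly adapting per-pixel background model and takes each frame's absolute deviation from it as the movement signal. The filter order, the coefficient `alpha`, and the background-subtraction form are assumptions for illustration, not the authors' exact pipeline.

```python
import numpy as np

def extract_movement(frames, alpha=0.95):
    """Sketch of IIR-based movement extraction from a depth-video clip.

    A first-order IIR filter tracks a slowly adapting per-pixel
    background; the movement signal is each frame's absolute
    deviation from that background.

    frames : (t, h, w) array of depth frames
    alpha  : smoothing factor (closer to 1 = slower background adaptation)
    """
    background = frames[0].astype(float)
    movement = np.zeros_like(frames, dtype=float)
    for t in range(1, len(frames)):
        # Exponentially weighted background update
        background = alpha * background + (1 - alpha) * frames[t]
        # Pixels that deviate from the background are moving
        movement[t] = np.abs(frames[t] - background)
    return movement
```

The resulting movement maps, rather than raw depth frames, would then be fed to the convolutional LSTM, which keeps the classifier focused on motion instead of static scene geometry.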