The complexity of dynamics in small neural circuits
Mean-field theory is a powerful tool for studying large neural networks.
However, when the system is composed of a few neurons, macroscopic differences
between the mean-field approximation and the real behavior of the network can
arise. Here we introduce a study of the dynamics of a small firing-rate network
with excitatory and inhibitory populations, in terms of local and global
bifurcations of the neural activity. Our approach is analytically tractable in
many respects, and sheds new light on the finite-size effects of the system. In
particular, we focus on the formation of multiple branching solutions of the
neural equations through spontaneous symmetry breaking, since this phenomenon
considerably increases the complexity of the dynamical behavior of the network.
For these reasons, branching points may reveal important mechanisms through
which neurons interact and process information, which are not accounted for by
the mean-field approximation.

Comment: 34 pages, 11 figures. Supplementary materials added, colors of
figures 8 and 9 fixed, results unchanged
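The kind of two-population excitatory-inhibitory firing-rate model described above can be sketched in a few lines. All weights, inputs, and the sigmoidal activation below are illustrative choices, not parameters from the paper:

```python
import numpy as np

def simulate_ei_network(w_ee=10.0, w_ei=-8.0, w_ie=12.0, w_ii=-3.0,
                        i_e=1.0, i_i=0.5, tau=1.0, dt=0.01, steps=5000):
    """Euler-integrate a two-population Wilson-Cowan-style firing-rate model.

    Parameter values are illustrative, not taken from the paper; the point
    is only the structure: excitatory rate r_e and inhibitory rate r_i
    relax toward a sigmoid of their weighted recurrent input.
    """
    phi = lambda x: 1.0 / (1.0 + np.exp(-x))  # sigmoidal activation
    r_e, r_i = 0.1, 0.1                       # initial firing rates
    for _ in range(steps):
        dr_e = (-r_e + phi(w_ee * r_e + w_ei * r_i + i_e)) / tau
        dr_i = (-r_i + phi(w_ie * r_e + w_ii * r_i + i_i)) / tau
        r_e += dt * dr_e
        r_i += dt * dr_i
    return r_e, r_i

r_e, r_i = simulate_ei_network()
```

Sweeping a parameter such as `i_e` and recording the fixed points reached from different initial conditions is the standard numerical way to map out the local bifurcations the abstract refers to.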
Efficient Learning of a One-dimensional Density Functional Theory
Density functional theory underlies the most successful and widely used
numerical methods for electronic structure prediction of solids. However, it
has the fundamental shortcoming that the universal density functional is
unknown. In addition, the computational result---energy and charge density
distribution of the ground state---is useful for electronic properties of
solids mostly when reduced to a band structure interpretation based on the
Kohn-Sham approach. Here, we demonstrate how machine learning algorithms can
help to free density functional theory from these limitations. We study a
theory of spinless fermions on a one-dimensional lattice. The density
functional is implicitly represented by a neural network, which predicts,
besides the ground-state energy and density distribution, density-density
correlation functions. At no point do we require a band structure
interpretation. The training data, obtained via exact diagonalization, feeds
into a learning scheme inspired by active learning, which minimizes the
computational costs for data generation. We show that the network results are
of high quantitative accuracy and, despite learning on random potentials,
capture both symmetry-breaking and topological phase transitions correctly.

Comment: 5 pages, 3 figures; 4+ pages appendix
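For a non-interacting variant of such a lattice model, the exact-diagonalization step that generates training data reduces to diagonalizing the single-particle Hamiltonian and filling the lowest orbitals. The sketch below (an illustrative `ground_state_energy` helper, not the paper's code or model) produces (potential, energy) pairs from random on-site potentials, the kind of data a learned functional would be trained on:

```python
import numpy as np

def ground_state_energy(v, n_particles, t=1.0):
    """Ground-state energy of NON-interacting spinless fermions on an open
    1D chain with on-site potential v and hopping t: diagonalize the
    single-particle Hamiltonian and fill the lowest orbitals.
    Illustrative only; the paper's model and neural-network functional
    are not reproduced here.
    """
    n = len(v)
    h = np.diag(np.asarray(v, dtype=float))
    for i in range(n - 1):
        h[i, i + 1] = h[i + 1, i] = -t  # nearest-neighbour hopping
    return np.linalg.eigvalsh(h)[:n_particles].sum()

# Random potentials stand in for the data-generation step of an
# active-learning loop.
rng = np.random.default_rng(0)
potentials = rng.uniform(-1.0, 1.0, size=(100, 8))
training_set = [(v, ground_state_energy(v, n_particles=4)) for v in potentials]
```

An active-learning scheme would then preferentially request new exact diagonalizations only where the network's predictions are least certain, keeping the cost of data generation low.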
A novel plasticity rule can explain the development of sensorimotor intelligence
Grounding autonomous behavior in the nervous system is a fundamental
challenge for neuroscience. In particular, self-organized behavioral
development raises more questions than it answers. Are there special functional
units for curiosity, motivation, and creativity? This paper argues that these
features can be grounded in synaptic plasticity itself, without requiring any
higher level constructs. We propose differential extrinsic plasticity (DEP) as
a new synaptic rule for self-learning systems and apply it to a number of
complex robotic systems as a test case. Without any purpose or goal being
specified, the systems develop seemingly purposeful and adaptive behavior,
displaying a certain level of sensorimotor intelligence. These surprising
results require no system-specific modifications of the DEP rule but arise
rather from the underlying
mechanism of spontaneous symmetry breaking due to the tight
brain-body-environment coupling. The new synaptic rule is biologically
plausible and would be an interesting target for neurobiological
investigation. We also argue that this neuronal mechanism may have been a
catalyst in natural evolution.

Comment: 18 pages, 5 figures, 7 videos
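The flavor of such a rule can be conveyed with a schematic differential-Hebbian update, in which the weight change couples successive finite-difference estimates of the sensor derivatives. This is a toy illustration of the general idea, not the paper's exact DEP rule:

```python
import numpy as np

def dep_step(C, x, x_prev, x_prev2, eta=0.1):
    """Schematic differential-Hebbian update: the weight change is the
    outer product of two successive sensor-derivative estimates.

    The function name, signature, and exact functional form are
    illustrative assumptions, not the DEP rule from the paper.
    """
    x_dot = x - x_prev            # current sensor derivative (finite difference)
    x_dot_prev = x_prev - x_prev2  # previous sensor derivative
    return C + eta * np.outer(x_dot, x_dot_prev)

# Three successive 2-dimensional sensor readings drive one weight update.
C = np.zeros((2, 2))
xs = [np.array([0.0, 0.0]), np.array([1.0, 0.0]), np.array([1.0, 1.0])]
C = dep_step(C, xs[2], xs[1], xs[0])
```

Because the update is driven by changes in the sensor stream rather than by an external reward, behavior can self-organize purely from the closed brain-body-environment loop.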