What is the functional role of adult neurogenesis in the hippocampus?
The dentate gyrus is part of the hippocampal memory system and special in
that it generates new neurons throughout life. Here we discuss the
question of what the functional role of these new neurons might be. Our
hypothesis is that they help the dentate gyrus to avoid the problem of
catastrophic interference when adapting to new environments. We assume
that old neurons are rather stable and preserve an optimal encoding
learned for known environments while new neurons are plastic to adapt to
those features that are qualitatively new in a new environment. A simple
network simulation demonstrates that adding new plastic neurons is indeed
a successful strategy for adaptation without catastrophic interference.
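The abstract's hypothesis can be illustrated with a toy simulation (the paper's actual network is not specified here; this is a minimal sketch under assumed details: random tanh feature units, a least-squares readout, and the modeling assumption that new units respond only to the new environment). Retraining a shared readout on environment B overwrites the encoding for A, whereas freezing the old weights and fitting only the readout of newly added plastic units preserves it:

```python
import numpy as np

rng = np.random.default_rng(0)

def features(x, proj):
    """Nonlinear responses of a population of dentate-gyrus-like units."""
    return np.tanh(x @ proj)

# Environment A: inputs and a target encoding to be learned.
xA = rng.normal(size=(200, 5))
yA = np.sin(xA[:, 0])
proj_old = rng.normal(size=(5, 30))          # 30 "old", stable neurons
FA_old = features(xA, proj_old)
w_old, *_ = np.linalg.lstsq(FA_old, yA, rcond=None)
err_A = np.mean((FA_old @ w_old - yA) ** 2)

# Environment B: a qualitatively different target.
xB = rng.normal(size=(200, 5))
yB = np.cos(xB[:, 1])
FB_old = features(xB, proj_old)

# Naive strategy: retrain the shared readout on B alone.
# The encoding learned for A is overwritten -- catastrophic interference.
w_naive, *_ = np.linalg.lstsq(FB_old, yB, rcond=None)
err_A_naive = np.mean((FA_old @ w_naive - yA) ** 2)

# Neurogenesis strategy: keep w_old frozen and add 30 new plastic
# neurons whose readout is fitted only to what is new in environment B.
# (Assumption: the new units respond to features specific to B and
# stay silent in the familiar environment A, so A is untouched.)
proj_new = rng.normal(size=(5, 30))
FB_new = features(xB, proj_new)
w_new, *_ = np.linalg.lstsq(FB_new, yB - FB_old @ w_old, rcond=None)

err_A_neuro = err_A                               # A's encoding preserved
err_B_neuro = np.mean((FB_old @ w_old + FB_new @ w_new - yB) ** 2)
```

With this setup, `err_A_neuro` stays at the original fit error on A while `err_A_naive` degrades badly, which is the qualitative point of the hypothesis.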
Rhythmic Representations: Learning Periodic Patterns for Scalable Place Recognition at a Sub-Linear Storage Cost
Robotic and animal mapping systems share many challenges and characteristics:
they must function in a wide variety of environmental conditions, enable the
robot or animal to navigate effectively to find food or shelter, and be
computationally tractable from both a speed and storage perspective. With
regards to map storage, the mammalian brain appears to take a diametrically
opposed approach to all current robotic mapping systems. Where robotic mapping
systems attempt to solve the data association problem to minimise
representational aliasing, neurons in the brain intentionally break data
association by encoding large (potentially unlimited) numbers of places with a
single neuron. In this paper, we propose a novel method based on supervised
learning techniques that seeks out regularly repeating visual patterns in the
environment with mutually complementary co-prime frequencies, and an encoding
scheme that enables storage requirements to grow sub-linearly with the size of
the environment being mapped. To improve robustness in challenging real-world
environments while maintaining storage growth sub-linearity, we incorporate
both multi-exemplar learning and data augmentation techniques. Using large
benchmark robotic mapping datasets, we demonstrate the combined system
achieving high-performance place recognition with sub-linear storage
requirements, and characterize the performance-storage growth trade-off curve.
The work serves as the first robotic mapping system with sub-linear storage
scaling properties, as well as the first large-scale demonstration in
real-world environments of one of the proposed memory benefits of these
neurons.
Comment: Pre-print of an article to appear in IEEE Robotics and Automation Letters.
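The storage argument can be made concrete with a number-theoretic sketch (the paper's encoder is learned from visual patterns; the periods and the Chinese-Remainder-style decoding below are only an illustrative assumption). Detectors with mutually co-prime periods need storage proportional to the *sum* of the periods, yet their joint phase pattern distinguishes a number of places equal to their *product*:

```python
from math import prod

periods = (3, 5, 7, 11)    # mutually co-prime "rhythmic" detector periods
capacity = prod(periods)   # 1155 distinct places from only 3+5+7+11 = 26 templates

def encode(place):
    """Phase of each periodic detector at this place index."""
    return tuple(place % p for p in periods)

def decode(phases):
    """Recover the unique place index in [0, capacity) via the CRT."""
    x = 0
    for p, r in zip(periods, phases):
        m = capacity // p
        x += r * m * pow(m, -1, p)   # modular inverse (Python 3.8+)
    return x % capacity
```

Doubling the mapped area only requires adding one more modest co-prime period, which is the sense in which storage can grow sub-linearly in the size of the environment.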
Memristors -- from In-memory computing, Deep Learning Acceleration, Spiking Neural Networks, to the Future of Neuromorphic and Bio-inspired Computing
Machine learning, particularly in the form of deep learning, has driven most
of the recent fundamental developments in artificial intelligence. Deep
learning is based on computational models that are, to a certain extent,
bio-inspired, as they rely on networks of connected simple computing units
operating in parallel. Deep learning has been successfully applied in areas
such as object/pattern recognition, speech and natural language processing,
self-driving vehicles, intelligent self-diagnostics tools, autonomous robots,
knowledgeable personal assistants, and monitoring. These successes have been
mostly supported by three factors: availability of vast amounts of data,
continuous growth in computing power, and algorithmic innovations. The
approaching demise of Moore's law, and the consequent expected modest
improvements in computing power that can be achieved by scaling, raise the
question of whether the described progress will be slowed or halted due to
hardware limitations. This paper reviews the case for a novel beyond CMOS
hardware technology, memristors, as a potential solution for the implementation
of power-efficient in-memory computing, deep learning accelerators, and spiking
neural networks. Central themes are the reliance on non-von-Neumann computing
architectures and the need for developing tailored learning and inference
algorithms. To argue that lessons from biology can be useful in providing
directions for further progress in artificial intelligence, we briefly discuss
an example based on reservoir computing. We conclude the review by speculating on
the big picture view of future neuromorphic and brain-inspired computing
systems.
Comment: Keywords: memristor, neuromorphic, AI, deep learning, spiking neural
networks, in-memory computing.
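The core appeal of memristive in-memory computing is easy to state as a sketch (device values below are made up for illustration): each crosspoint stores a weight as a conductance, so by Ohm's law a crosspoint contributes current G*V, and by Kirchhoff's current law those contributions sum along each column. A full matrix-vector multiply thus happens in one analog step, exactly where the weights are stored, with no von-Neumann data movement:

```python
import numpy as np

def crossbar_mvm(G, v):
    """Analog matrix-vector product of a memristor crossbar.

    I_j = sum_i G[i, j] * v[i]: Ohm's law per crosspoint, Kirchhoff's
    current law per column, so the multiply-accumulate is "in memory".
    """
    return G.T @ v

# Illustrative 4x3 crossbar: conductances encode a small weight matrix.
G = np.array([[1.0, 0.5, 0.2],
              [0.3, 1.2, 0.7],
              [0.9, 0.1, 0.4],
              [0.2, 0.8, 1.1]]) * 1e-3   # conductances in the mS range (assumed)
v = np.array([0.1, 0.2, -0.1, 0.3])      # input voltages (V)
I = crossbar_mvm(G, v)                   # output column currents (A)
```

In a deep learning accelerator, each network layer's weight matrix maps to one such crossbar, which is why the review pairs memristors with non-von-Neumann architectures and tailored training algorithms.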
Real-time encoding and compression of neuronal spikes by metal-oxide memristors
Advanced brain-chip interfaces with numerous recording sites bear great potential for the investigation of neuroprosthetic applications. The bottleneck towards achieving an efficient bio-electronic link is the real-time processing of neuronal signals, which imposes excessive requirements on bandwidth, energy and computation capacity. Here we present a unique concept where the intrinsic properties of memristive devices are exploited to compress information on neural spikes in real-time. We demonstrate that the inherent voltage thresholds of metal-oxide memristors can be used to discriminate recorded spiking events from background activity without resorting to computationally heavy off-line processing. We prove that information on spike amplitude and frequency can be transduced and stored in single devices as non-volatile resistive state transitions. Finally, we show that a memristive device array allows for efficient data compression of signals recorded by a multi-electrode array, demonstrating the technology’s potential for building scalable, yet energy-efficient on-node processors for brain-chip interfaces.
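The compression principle can be sketched in software (the threshold value, noise level, and spike amplitudes below are assumed for illustration, not taken from the paper): a sub-threshold sample causes no resistive state change and is simply discarded, so only spike-like excursions are recorded as events.

```python
import numpy as np

def threshold_compress(trace, v_th=0.5):
    """Keep only samples whose magnitude crosses the device threshold.

    Models the memristor's inherent voltage threshold: background
    activity below v_th triggers no state transition and is dropped;
    supra-threshold excursions are kept as (sample index, amplitude).
    """
    return [(i, v) for i, v in enumerate(trace) if abs(v) > v_th]

rng = np.random.default_rng(1)
trace = rng.normal(0.0, 0.1, size=1000)      # background activity (noise)
trace[[100, 400, 800]] = [0.9, -0.8, 1.1]    # three injected spikes
events = threshold_compress(trace)
compression = len(trace) / max(len(events), 1)
```

Because only the sparse events need to be stored or transmitted, the compression ratio scales with the spike rate rather than the sampling rate, which is what makes on-node processing attractive for bandwidth-limited brain-chip interfaces.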