    Memristors for the Curious Outsiders

    Full text link
    We present both an overview of and a perspective on recent experimental advances and proposed new approaches to performing computation using memristors. A memristor is a 2-terminal passive component with a dynamic resistance that depends on an internal parameter. We provide a brief historical introduction, as well as an overview of the physical mechanisms that lead to memristive behavior. This review is meant to guide non-practitioners in the field of memristive circuits and their connection to machine learning and neural computation. Comment: Perspective paper for MDPI Technologies; 43 pages
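
    The abstract's one-line definition can be made concrete with a toy simulation. Below is a minimal sketch of a memristor as a state-dependent resistor, assuming the standard linear ion-drift (HP-style) model rather than anything specific to this paper; all parameter values are illustrative.

```python
# Minimal sketch of a memristor: a resistor whose value is set by an internal
# state variable that drifts with the current. Linear ion-drift model assumed;
# parameter values are illustrative, not taken from the paper.
import numpy as np

R_ON, R_OFF = 100.0, 16e3     # resistance bounds (ohms), illustrative
MU = 1e-14                    # ion-mobility-like constant, illustrative
D = 10e-9                     # device thickness (m), illustrative

def simulate(t, v):
    """Drive the memristor with voltage v(t); return current and state traces."""
    w = 0.5                   # internal state in [0, 1]
    dt = t[1] - t[0]
    i_out, w_out = [], []
    for vk in v:
        R = R_ON * w + R_OFF * (1.0 - w)      # resistance set by the state
        i = vk / R
        w += dt * MU * R_ON / D**2 * i        # state drifts with the current
        w = min(max(w, 0.0), 1.0)             # hard clamp as a crude window
        i_out.append(i)
        w_out.append(w)
    return np.array(i_out), np.array(w_out)

t = np.linspace(0.0, 2.0, 20000)
v = 1.2 * np.sin(2 * np.pi * 1.0 * t)         # sinusoidal drive
i, w = simulate(t, v)
# Plotting i against v traces the pinched hysteresis loop characteristic
# of memristive behavior.
```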

    The Neural Particle Filter

    Get PDF
    The robust estimation of dynamically changing features, such as the position of prey, is one of the hallmarks of perception. On an abstract, algorithmic level, nonlinear Bayesian filtering, i.e. the estimation of temporally changing signals based on the history of observations, provides a mathematical framework for dynamic perception in real time. Since the general, nonlinear filtering problem is analytically intractable, particle filters are considered among the most powerful approaches to approximating the solution numerically. Yet, these algorithms predominantly rely on importance weights, and thus it remains an unresolved question how the brain could implement such an inference strategy with a neuronal population. Here, we propose the Neural Particle Filter (NPF), a weight-less particle filter that can be interpreted as the neuronal dynamics of a recurrently connected neural network that receives feed-forward input from sensory neurons and represents the posterior probability distribution in terms of samples. Specifically, this algorithm bridges the gap between the computational task of online state estimation and an implementation that allows networks of neurons in the brain to perform nonlinear Bayesian filtering. The model captures not only the properties of temporal and multisensory integration according to Bayesian statistics, but also allows online learning with a maximum likelihood approach. With an example from multisensory integration, we demonstrate that the numerical performance of the model is adequate to account for both filtering and identification problems. Due to the weight-less approach, our algorithm alleviates the 'curse of dimensionality' and thus outperforms conventional, weighted particle filters in higher dimensions for a limited number of particles.
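
    The core idea, a particle filter without importance weights, can be sketched as follows. This is a discretized scalar toy version written for illustration only: the drift f, the observation function g, the noise levels, and the fixed feedback gain W are all assumptions, whereas in the paper the gain is adapted online by maximum likelihood.

```python
# Toy weight-less particle filter: every particle follows the prior dynamics
# plus a gain-weighted prediction error; no importance weights, no resampling.
import numpy as np

rng = np.random.default_rng(0)

def f(x):                      # prior drift of the hidden state (assumed)
    return -x + np.sin(x)

def g(x):                      # observation function (assumed)
    return np.tanh(x)

dt, steps, n_particles = 1e-2, 2000, 100
sigma_x, sigma_y, W = 0.5, 0.2, 2.0   # process noise, obs. noise, gain (assumed)

x_true = 0.0
particles = rng.normal(0.0, 1.0, n_particles)
estimate = np.empty(steps)

for k in range(steps):
    # simulate the hidden state and a noisy observation of it
    x_true += dt * f(x_true) + sigma_x * np.sqrt(dt) * rng.normal()
    y = g(x_true) + sigma_y * rng.normal()

    # weight-less update: prior dynamics plus gain times the innovation
    particles += dt * (f(particles) + W * (y - g(particles))) \
                 + sigma_x * np.sqrt(dt) * rng.normal(size=n_particles)

    estimate[k] = particles.mean()   # posterior mean read out from the samples
```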

    Statistical Physics and Representations in Real and Artificial Neural Networks

    Full text link
    This document presents the material of two lectures on statistical physics and neural representations, delivered by one of us (R.M.) at the Fundamental Problems in Statistical Physics XIV summer school in July 2017. In the first part, we consider the neural representations of space (maps) in the hippocampus. We introduce an extension of the Hopfield model that is able to store multiple spatial maps as continuous, finite-dimensional attractors. The phase diagram and dynamical properties of the model are analyzed. We then show how spatial representations can be dynamically decoded using an effective Ising model capturing the correlation structure in the neural data, and compare applications to data obtained from hippocampal multi-electrode recordings and by (sub)sampling our attractor model. In the second part, we focus on the problem of learning data representations in machine learning, in particular with artificial neural networks. We start by introducing data representations through some illustrations. We then analyze two important algorithms, Principal Component Analysis and Restricted Boltzmann Machines, with tools from statistical physics.
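
    As a reminder of what "representation" means for one of the two algorithms named above, here is plain PCA: the representation is the projection of the centered data onto the leading eigenvectors of the empirical covariance. This is the textbook procedure, not the statistical-physics analysis carried out in the lectures, and the data below are synthetic.

```python
# Standard PCA: center the data, diagonalize the empirical covariance,
# and project onto the top principal directions.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 50)) @ rng.normal(size=(50, 50))   # correlated toy data

Xc = X - X.mean(axis=0)                      # center the data
C = Xc.T @ Xc / (len(Xc) - 1)                # empirical covariance matrix
eigvals, eigvecs = np.linalg.eigh(C)         # eigenvalues in ascending order
top = eigvecs[:, ::-1][:, :5]                # top-5 principal directions
representation = Xc @ top                    # low-dimensional representation
```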

    Isochronal synchrony and bidirectional communication with delay-coupled nonlinear oscillators

    Full text link
    We propose a basic mechanism for isochronal synchrony and communication with mutually delay-coupled chaotic systems. We show that two Ikeda ring oscillators (IROs), mutually coupled with a propagation delay, synchronize isochronally when both are symmetrically driven by a third Ikeda oscillator. This synchronous operation, unstable in the two delay-coupled oscillators alone, facilitates simultaneous, bidirectional communication of messages with chaotic carrier waveforms. This approach, which combines bidirectional and unidirectional coupling, represents an application of generalized synchronization using a mediating drive signal for a spatially distributed and internally synchronized multi-component system.
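
    The setup can be illustrated with a crude discretized simulation: two Ikeda-type delay oscillators coupled with a propagation delay and driven symmetrically by a third. The specific equations, the coupling scheme, and all parameter values below are assumptions made for illustration, not the exact model of the paper.

```python
# Two delay-coupled Ikeda-type oscillators (x1, x2) with a common drive (x3),
# integrated by forward Euler with history buffers for the delay.
import numpy as np

beta, phi = 4.0, 0.3          # feedback strength / phase offset (assumed)
kappa = 0.4                   # weight of the common drive (assumed)
dt, tau, T = 1e-2, 2.0, 60.0
d = int(tau / dt)             # delay expressed in integration steps
n = int(T / dt)

rng = np.random.default_rng(1)
x1 = rng.uniform(0, 1, n + d)     # history buffers double as solution arrays
x2 = rng.uniform(0, 1, n + d)
x3 = rng.uniform(0, 1, n + d)     # the common drive oscillator

ikeda = lambda u: beta * np.sin(u + phi) ** 2

for k in range(d, n + d - 1):
    drive = ikeda(x3[k - d])                      # delayed common drive signal
    x3[k + 1] = x3[k] + dt * (-x3[k] + ikeda(x3[k - d]))
    x1[k + 1] = x1[k] + dt * (-x1[k] + (1 - kappa) * ikeda(x2[k - d]) + kappa * drive)
    x2[k + 1] = x2[k] + dt * (-x2[k] + (1 - kappa) * ikeda(x1[k - d]) + kappa * drive)

# Monitoring x1 - x2 checks for isochronal (zero-lag) synchrony despite the
# coupling delay; with the drive switched off (kappa = 0) this state is
# expected to be unstable, as the abstract describes.
```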

    Oscillatory dynamics with applications to cognitive tasks

    Get PDF
    Oscillations are ubiquitous in the brain and robustly correlate with distinct cognitive tasks. A specific type of oscillatory signal allows robust switching between states in networks involved in memory tasks. In particular, slow oscillations lead to an activation of the neuronal populations, whereas oscillations in the beta range are effective in clearing the memory states. In this master thesis, previous works are revisited in order to provide a detailed analysis of the mechanisms underlying the switching between states and their dependence on the network parameters. The model studied is a macroscopic description of the network, recently derived thanks to advances in mean-field theory. The role of spiking synchrony in the switching off of the active states is identified by means of bifurcation analysis and the study of the fixed points under the stroboscopic map. Finally, we propose an application of the effect of oscillations in the context of working memory.
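
    Below is a sketch of the kind of macroscopic model the thesis plausibly refers to, assuming it is the exact firing-rate reduction of a network of quadratic integrate-and-fire neurons (Montbrió-Pazó-Roxin-type equations) under an oscillatory drive; the equations' identification, the parameter values, and the sinusoidal forcing are assumptions, not the thesis settings.

```python
# Macroscopic firing-rate/mean-voltage equations of MPR type, driven by an
# oscillatory current; integrated by forward Euler.
import numpy as np

tau, delta, eta, J = 20e-3, 0.3, -4.0, 15.0    # assumed network parameters
dt, T = 1e-4, 2.0
steps = int(T / dt)

r, v = 0.1, -2.0                               # population rate and mean voltage
trace = np.empty((steps, 2))

for k in range(steps):
    t = k * dt
    I = 3.0 * np.sin(2 * np.pi * 20.0 * t)     # oscillatory drive (beta-range here)
    dr = (delta / (np.pi * tau) + 2.0 * r * v) / tau
    dv = (v**2 + eta + J * tau * r + I - (np.pi * tau * r) ** 2) / tau
    r, v = r + dt * dr, v + dt * dv
    trace[k] = r, v

# Varying the drive frequency between slow and beta ranges is the kind of
# protocol discussed in the text for switching the high-activity (memory)
# state on and off.
```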