113 research outputs found

    The Asymptotic Performance of Linear Echo State Neural Networks

    In this article, a study of the mean-square error (MSE) performance of linear echo-state neural networks is performed, both for training and testing tasks. Considering the realistic setting of noise present at the network nodes, we derive deterministic equivalents for the aforementioned MSE in the limit where the number of input data T and network size n both grow large. Specializing then the network connectivity matrix to specific random settings, we further obtain simple formulas that provide new insights on the performance of such networks.
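    As a rough companion to this abstract, the sketch below simulates a small linear echo-state network with i.i.d. Gaussian connectivity and Gaussian noise at the nodes, fits a least-squares readout on a toy delay task, and reports training and test MSE. The dimensions, noise level, and task are arbitrary assumptions for illustration, not the paper's setup.

```python
# Minimal sketch (assumed toy setup): a linear echo state network with node
# noise, a least-squares readout, and empirical training/test MSE.
import numpy as np

rng = np.random.default_rng(0)
n, T, T_test = 200, 1000, 500          # network size and data lengths (arbitrary)
sigma_noise = 0.01                      # node-noise standard deviation (assumed)

W = rng.normal(0, 1 / np.sqrt(n), (n, n)) * 0.9   # random connectivity, spectral radius ~0.9
w_in = rng.normal(0, 1, n)                         # input weights

def run(u):
    """Drive the linear reservoir x_{t+1} = W x_t + w_in u_t + noise."""
    X = np.zeros((len(u), n))
    x = np.zeros(n)
    for t, u_t in enumerate(u):
        x = W @ x + w_in * u_t + sigma_noise * rng.normal(size=n)
        X[t] = x
    return X

# Toy task: reproduce a delayed copy of a random input signal.
u = rng.normal(size=T + T_test)
y = np.roll(u, 3)                      # target: input delayed by 3 steps
X = run(u)

X_tr, y_tr = X[:T], y[:T]
X_te, y_te = X[T:], y[T:]
beta = np.linalg.lstsq(X_tr, y_tr, rcond=None)[0]   # linear readout

mse_train = np.mean((X_tr @ beta - y_tr) ** 2)
mse_test = np.mean((X_te @ beta - y_te) ** 2)
print(f"training MSE: {mse_train:.4f}, test MSE: {mse_test:.4f}")
```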

    A Tutorial on the Spectral Theory of Markov Chains

    Markov chains are a class of probabilistic models that have achieved widespread application in the quantitative sciences. This is in part due to their versatility, but is compounded by the ease with which they can be probed analytically. This tutorial provides an in-depth introduction to Markov chains, and explores their connection to graphs and random walks. We utilize tools from linear algebra and graph theory to describe the transition matrices of different types of Markov chains, with a particular focus on exploring properties of the eigenvalues and eigenvectors corresponding to these matrices. The results presented are relevant to a number of methods in machine learning and data mining, which we describe at various stages. Rather than being a novel academic study in its own right, this text presents a collection of known results, together with some new concepts. Moreover, the tutorial focuses on offering intuition to readers rather than formal understanding, and only assumes basic exposure to concepts from linear algebra and probability theory. It is therefore accessible to students and researchers from a wide variety of disciplines.
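    As a toy illustration of the objects this tutorial studies (an assumed example, not taken from the text), the snippet below builds the transition matrix of a simple random walk on a small undirected graph, lists its eigenvalues, and recovers the stationary distribution from the leading left eigenvector.

```python
# Random walk on a small undirected graph: transition matrix, spectrum, and
# stationary distribution (assumed toy graph for illustration).
import numpy as np

# Adjacency matrix of a 4-node undirected graph (a triangle plus a pendant node).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

deg = A.sum(axis=1)
P = A / deg[:, None]                    # row-stochastic transition matrix P = D^{-1} A

# Eigenvalues of P are real here because P is similar to a symmetric matrix.
eigvals, left_vecs = np.linalg.eig(P.T) # left eigenvectors of P = right eigenvectors of P^T
order = np.argsort(-eigvals.real)
print("eigenvalues:", np.round(eigvals.real[order], 3))

# Stationary distribution: leading left eigenvector, normalized to sum to 1.
pi = left_vecs[:, order[0]].real
pi = pi / pi.sum()
print("stationary distribution:", np.round(pi, 3))
print("degree-proportional check:", np.round(deg / deg.sum(), 3))
```

    For a random walk on an undirected graph the stationary distribution is proportional to the node degrees, which the last line checks directly.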

    Learning strange attractors with reservoir systems

    This paper shows that the celebrated embedding theorem of Takens is a particular case of a much more general statement according to which, randomly generated linear state-space representations of generic observations of an invertible dynamical system carry in their wake an embedding of the phase space dynamics into the chosen Euclidean state space. This embedding coincides with a natural generalized synchronization that arises in this setup and that yields a topological conjugacy between the state-space dynamics driven by the generic observations of the dynamical system and the dynamical system itself. This result provides additional tools for the representation, learning, and analysis of chaotic attractors and sheds additional light on the reservoir computing phenomenon that appears in the context of recurrent neural networks.
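    A minimal numerical sketch of the kind of statement the abstract describes (illustrative only, with the Lorenz system, dimensions, and nearest-neighbour readout all assumed rather than taken from the paper): drive a randomly generated linear state-space system with a scalar observation of the Lorenz attractor and check that the reservoir state effectively determines the full hidden state, as an embedding would imply.

```python
# Random linear state-space system driven by a scalar Lorenz observation;
# the hidden Lorenz state is read back by nearest-neighbour lookup in
# reservoir space as a rough, assumed consistency check of the embedding claim.
import numpy as np

rng = np.random.default_rng(1)

def lorenz_traj(steps, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Euler-integrated Lorenz trajectory (coarse but adequate for a demo)."""
    z = np.array([1.0, 1.0, 1.0])
    traj = np.zeros((steps, 3))
    for t in range(steps):
        dz = np.array([sigma * (z[1] - z[0]),
                       z[0] * (rho - z[2]) - z[1],
                       z[0] * z[1] - beta * z[2]])
        z = z + dt * dz
        traj[t] = z
    return traj

steps, n = 8000, 40                     # trajectory length and state-space dimension (arbitrary)
Z = lorenz_traj(steps)
s = Z[:, 0]                             # generic scalar observation: the x-coordinate

# Randomly generated linear state-space representation x_{k+1} = A x_k + C s_k.
A = rng.normal(0, 1 / np.sqrt(n), (n, n))
A *= 0.95 / np.max(np.abs(np.linalg.eigvals(A)))   # make the state map a contraction
C = rng.normal(0, 1, n)

X = np.zeros((steps, n))
x = np.zeros(n)
for k in range(steps):
    x = A @ x + C * s[k]
    X[k] = x

# Discard the transient, then split into reference and query segments.
burn, split = 2000, 7000
X_ref, Z_ref = X[burn:split], Z[burn:split]
X_qry, Z_qry = X[split:], Z[split:]

# Nearest neighbour in reservoir space -> predicted Lorenz state.
d2 = (X_qry ** 2).sum(1)[:, None] + (X_ref ** 2).sum(1)[None, :] - 2 * X_qry @ X_ref.T
nn_idx = d2.argmin(axis=1)
err = np.linalg.norm(Z_ref[nn_idx] - Z_qry, axis=1).mean()
scale = np.linalg.norm(Z_qry - Z_qry.mean(0), axis=1).mean()
print(f"mean nearest-neighbour reconstruction error: {err:.3f}")
print(f"attractor scale for comparison: {scale:.3f}")
```

    A small reconstruction error relative to the attractor scale is consistent with the reservoir states carrying an embedding; the paper's actual result is the topological statement, not this numerical check.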

    Neuronal dynamics across macroscopic timescales

    The brain operates in a world with rich dynamics across a wide range of timescales, including those on the order of seconds and above. Behavioral experiments on memory and timing reveal striking similarities in the behavioral patterns across a range of timescales from seconds to minutes. To subserve these behavioral patterns and adapt to natural statistics, the collective activity of the large number of neurons in the brain should exhibit dynamics over these macroscopic timescales as well. Most established results in systems neuroscience concern the short-term responses of single neurons to static features of the world. Recently, new techniques for large-scale and chronic measurements of neural activity have opened up the opportunity to investigate neural dynamics across different macroscopic timescales. This dissertation presents work that reveals the temporal patterns of neural activity across a range of macroscopic timescales and explores their mechanistic basis. Chapter 1 briefly surveys the relevant empirical evidence, biophysical processes and modeling techniques. Chapter 2 presents a biophysically realistic neural circuit model that combines a detailed simulation of a calcium-activated membrane current with the mathematical formalism of the inverse Laplace transform to produce sequential neural activity with a scale-invariant property. Chapter 3 is a theoretical analysis of the ability of linear recurrent neural networks to generate scale-invariant neural activity. It is shown that the network connectivity matrix should have a geometric series of eigenvalues and translated eigenvectors if the eigenvalues are real and distinct. Chapter 4 presents an empirical analysis of neural data motivated by the hypothesis that robust neural dynamics should simultaneously exist on multiple timescales. The analysis reveals the existence of repeatable neural dynamics on timescales of both seconds and minutes in multiple neural recordings of rodents performing various cognitive tasks. Chapter 5 of the dissertation presents an initial effort to characterize the changes in neural population activity during learning on the timescale of tens of minutes by analyzing neural recordings from monkeys while they learn associations between visual stimuli.
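    The Chapter 3 claim about a geometric series of eigenvalues can be illustrated with a toy calculation (an assumed example, not the dissertation's model): when a linear network's eigenvalues form a geometric series, the impulse responses of successive eigenmodes are time-rescaled copies of one another, which is the scale-invariance property referred to above. The sketch below checks only this eigenvalue part numerically, for real, distinct eigenvalues; it does not address the translated-eigenvector condition.

```python
# Toy illustration (assumed): eigenvalues in a geometric series give eigenmode
# impulse responses that are exact time-rescaled copies of one another.
import numpy as np

tau0, r, K = 1.0, 1.5, 5               # base time constant and geometric ratio (assumed)
taus = tau0 * r ** np.arange(K)        # geometric series of time constants
lams = -1.0 / taus                     # corresponding real, distinct eigenvalues

t = np.linspace(0, 20, 2001)
X = np.exp(np.outer(t, lams))          # impulse response of each eigenmode: x_k(t) = exp(lambda_k t)

# Scale invariance: rescaling time by r maps mode k onto mode k+1.
for k in range(K - 1):
    rescaled = np.interp(t, t * r, X[:, k])        # x_k(t / r) sampled on the same grid
    print(f"mode {k} rescaled vs mode {k + 1}: max deviation "
          f"{np.max(np.abs(rescaled - X[:, k + 1])):.2e} (up to interpolation error)")
```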

    Learning strange attractors with reservoir systems

    This paper shows that the celebrated Embedding Theorem of Takens is a particular case of a much more general statement according to which, randomly generated linear state-space representations of generic observations of an invertible dynamical system carry in their wake an embedding of the phase space dynamics into the chosen Euclidean state space. This embedding coincides with a natural generalized synchronization that arises in this setup and that yields a topological conjugacy between the state-space dynamics driven by the generic observations of the dynamical system and the dynamical system itself. This result provides additional tools for the representation, learning, and analysis of chaotic attractors and sheds additional light on the reservoir computing phenomenon that appears in the context of recurrent neural networks. Comment: 36 pages, 11 figures.