Using Conceptors to Transfer Between Long-Term and Short-Term Memory
We introduce a model of working memory combining short-term and long-term components. For the long-term component, we used Conceptors in order to store constant temporal patterns. For the short-term component, we used the Gated-Reservoir model: a reservoir trained to hold triggered information from an input stream and maintain it in a readout unit. We combined both components in order to obtain a model in which information can go from long-term memory to short-term memory and vice versa.
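As a rough orientation to the long-term component described above, the following is a minimal NumPy sketch of storing a temporal pattern in a conceptor and using it to constrain a reservoir. Network sizes, the aperture value, and all variable names are illustrative assumptions, not the authors' configuration; full pattern regeneration in the conceptor framework additionally requires loading the pattern into the recurrent weights, which is omitted here.

import numpy as np

rng = np.random.default_rng(0)
N = 100                                        # reservoir size (assumed)
W = rng.normal(0, 1, (N, N)) / np.sqrt(N)      # recurrent weights
W_in = rng.normal(0, 1, N)                     # input weights

def drive(pattern):
    # Run the reservoir on a 1-D pattern and collect its states (N x T).
    x, states = np.zeros(N), []
    for u in pattern:
        x = np.tanh(W @ x + W_in * u)
        states.append(x.copy())
    return np.array(states).T

# Learn a conceptor C = R (R + aperture^-2 I)^-1 from the state correlations.
pattern = np.sin(np.linspace(0, 8 * np.pi, 200))
X = drive(pattern)
R = X @ X.T / X.shape[1]
aperture = 10.0
C = R @ np.linalg.inv(R + aperture ** -2 * np.eye(N))

# Constrained update: projecting each step through C keeps the reservoir
# inside the state region associated with the stored pattern.
x = X[:, -1].copy()
for _ in range(50):
    x = C @ np.tanh(W @ x)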
Sample-level sound synthesis with recurrent neural networks and conceptors
Conceptors are a recent development in the field of reservoir computing; they can be used to influence the dynamics of recurrent neural networks (RNNs), enabling generation of arbitrary patterns based on training data. Conceptors allow interpolation and extrapolation between patterns, and also provide a system of boolean logic for combining patterns together. Generation and manipulation of arbitrary patterns using conceptors has significant potential as a sound synthesis method for applications in computer music but has yet to be explored. Conceptors are untested with the generation of multi-timbre audio patterns, and little testing has been done on scalability to the longer patterns required for audio. A novel method of sound synthesis based on conceptors is introduced. Conceptular Synthesis is based on granular synthesis; sets of conceptors are trained to recall varying patterns from a single RNN, then a runtime mechanism switches between them, generating short patterns which are recombined into a longer sound. The quality of sound resynthesis using this technique is experimentally evaluated. Conceptor models are shown to resynthesise audio with a quality comparable to a close equivalent technique using echo state networks with stored patterns and output feedback. Conceptor models are also shown to excel in their malleability and potential for creative sound manipulation, in comparison to echo state network models, which tend to fail when the same manipulations are applied. Examples are given demonstrating creative sonic possibilities, by exploiting conceptor pattern morphing, boolean conceptor logic and manipulation of RNN dynamics. Limitations of conceptor models are revealed with regard to reproduction quality, and pragmatic limitations are also shown, where rises in computation and memory requirements preclude the use of these models for training with longer sound samples. The techniques presented here represent an initial exploration of the sound synthesis potential of conceptors, demonstrating possible creative applications in sound design; future possibilities and research questions are outlined.
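The morphing and boolean operations the abstract relies on follow standard formulas from the conceptor framework; the sketch below shows them in NumPy. The aperture value and the random state matrices used for the usage example are assumptions for illustration only, and the boolean formulas assume nonsingular conceptors.

import numpy as np

def conceptor(states, aperture):
    # C = R (R + aperture^-2 I)^-1, with R the state correlation matrix.
    N, T = states.shape
    R = states @ states.T / T
    return R @ np.linalg.inv(R + aperture ** -2 * np.eye(N))

def NOT(C):
    return np.eye(C.shape[0]) - C

def AND(C1, C2):
    # (C1^-1 + C2^-1 - I)^-1; assumes C1 and C2 are invertible.
    I = np.eye(C1.shape[0])
    return np.linalg.inv(np.linalg.inv(C1) + np.linalg.inv(C2) - I)

def OR(C1, C2):
    return NOT(AND(NOT(C1), NOT(C2)))

def morph(C1, C2, mu):
    # Linear interpolation (0 <= mu <= 1) between two conceptors.
    return (1 - mu) * C1 + mu * C2

# Usage with placeholder state matrices (in practice these come from
# driving the RNN with two different training patterns).
X1 = np.random.default_rng(3).normal(size=(50, 400))
X2 = np.random.default_rng(4).normal(size=(50, 400))
C1, C2 = conceptor(X1, 8.0), conceptor(X2, 8.0)
C_half = morph(C1, C2, 0.5)
C_or = OR(C1, C2)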
Stability and Memory-loss go Hand-in-Hand: Three Results in Dynamics & Computation
The search for universal laws that help establish a relationship between dynamics and computation is driven by recent expansionist initiatives in biologically inspired computing. A general setting to understand both such dynamics and computation is a driven dynamical system that responds to a temporal input. Surprisingly, we find that memory-loss, a feature by which driven systems forget their internal states, helps provide unambiguous answers to the following fundamental stability questions that have been unanswered for decades: what is necessary and sufficient so that slightly different inputs still lead to mostly similar responses? How does changing the driven system's parameters affect stability? What is the mathematical definition of the edge-of-criticality? We anticipate our results to be timely in understanding and designing biologically inspired computers that are entering an era of dedicated hardware implementations for neuromorphic computing and state-of-the-art reservoir computing applications.
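The state-forgetting behaviour referred to above can be illustrated with a small numerical experiment: two copies of the same driven reservoir, started from different internal states but fed an identical input stream, converge toward the same trajectory. The network size and spectral-radius scaling below are illustrative choices, not taken from the paper; with this scaling the state distance typically shrinks toward zero.

import numpy as np

rng = np.random.default_rng(1)
N = 200
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # scale spectral radius below 1
W_in = rng.normal(0, 1, N)

def step(x, u):
    return np.tanh(W @ x + W_in * u)

x_a = rng.normal(0, 1, N)        # two different initial internal states
x_b = rng.normal(0, 1, N)
u_seq = rng.uniform(-1, 1, 300)  # one shared input stream

for t, u in enumerate(u_seq):
    x_a, x_b = step(x_a, u), step(x_b, u)
    if t % 50 == 0:
        print(t, np.linalg.norm(x_a - x_b))   # distance between the two copies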
Change Point Detection with Conceptors
Offline change point detection retrospectively locates change points in a time series. Many nonparametric methods that target i.i.d. mean and variance changes fail in the presence of nonlinear temporal dependence, and model-based methods require a known, rigid structure. For the at-most-one-change-point problem, we propose the use of a conceptor matrix to learn the characteristic dynamics of a baseline training window with arbitrary dependence structure. The associated echo state network acts as a featurizer of the data, and change points are identified from the nature of the interactions between the features and their relationship to the baseline state. This model-agnostic method can suggest potential locations of interest that warrant further study. We prove that, under mild assumptions, the method provides a consistent estimate of the true change point, and quantile estimates are produced via a moving block bootstrap of the original data. The method is evaluated with clustering metrics and Type I error control on simulated data, and applied to publicly available neural data from rats experiencing bouts of non-REM sleep prior to exploration of a radial maze. With sufficient spacing, the framework provides a simple extension to the sparse, multiple change point problem.
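As a hedged sketch of the general idea (not the paper's exact estimator or bootstrap procedure), one can learn a conceptor on a baseline window of echo state network features and then score later time points by how far their states fall outside the baseline conceptor. The scoring statistic, window sizes, and simulated series below are assumptions made for illustration.

import numpy as np

rng = np.random.default_rng(2)
N = 100
W = rng.normal(0, 1, (N, N))
W *= 0.8 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.normal(0, 1, N)

def featurize(series):
    # Drive the ESN with the series and return the state matrix (N x T).
    x, states = np.zeros(N), []
    for u in series:
        x = np.tanh(W @ x + W_in * u)
        states.append(x.copy())
    return np.array(states).T

def conceptor(X, aperture=8.0):
    R = X @ X.T / X.shape[1]
    return R @ np.linalg.inv(R + aperture ** -2 * np.eye(N))

# Simulated series whose dynamics change at t = 500.
y = np.concatenate([np.sin(0.2 * np.arange(500)),
                    np.sin(0.05 * np.arange(500))]) + 0.05 * rng.normal(size=1000)
X = featurize(y)

C = conceptor(X[:, :200])                          # baseline training window
scores = [np.linalg.norm(x - C @ x) for x in X.T]  # deviation from baseline dynamics
smoothed = np.convolve(scores, np.ones(50) / 50, mode="same")
print(int(np.argmax(smoothed)))                    # rough location of the change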