Birhythmic Analog Circuit Maze: A Nonlinear Neurostimulation Testbed
Brain dynamics can exhibit narrow-band nonlinear oscillations and
multistability. For a subset of disorders of consciousness and motor control,
we hypothesize that some symptoms originate from the inability to spontaneously
transition from one attractor to another. Using external perturbations, such as
electrical pulses delivered by deep brain stimulation devices, it may be
possible to induce transitions out of the pathological attractors. However,
inducing such transitions may be non-trivial, rendering current open-loop
stimulation strategies insufficient. To develop next-generation neural
stimulators that can intelligently learn to induce attractor transitions, we
require a platform to test the efficacy of such systems. To this end, we
designed an analog circuit as a model of multistable brain dynamics. The
circuit oscillates stably at either of two distinct periods, as an
instantiation of a 3-dimensional continuous-time gated recurrent neural
network. To discourage
simple perturbation strategies such as constant or random stimulation patterns
from easily inducing transition between the stable limit cycles, we designed a
state-dependent nonlinear circuit interface for external perturbation. We
demonstrate the existence of nontrivial solutions to the transition problem in
our circuit implementation.
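The circuit above instantiates a 3-dimensional continuous-time gated recurrent neural network. A minimal numerical sketch of that class of dynamics follows; the weights, gate parameterization, and integration settings are illustrative assumptions, not the paper's circuit values:

```python
import numpy as np

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

# Illustrative parameters (NOT the paper's circuit values): a 3-unit
# continuous-time gated RNN, dx/dt = z(x) * (-x + tanh(W @ x + b)),
# where the gate z(x) sets a state-dependent timescale per unit.
rng = np.random.default_rng(0)
W  = rng.normal(scale=1.5, size=(3, 3))   # recurrent weights
b  = rng.normal(scale=0.5, size=3)        # biases
Wz = rng.normal(scale=1.0, size=(3, 3))   # gate weights
bz = np.zeros(3)

def step(x, dt=0.01):
    z = sigmoid(Wz @ x + bz)              # per-unit gate in (0, 1)
    dx = z * (-x + np.tanh(W @ x + b))    # gated leaky dynamics
    return x + dt * dx                    # forward-Euler update

x = rng.normal(size=3)                    # random initial condition
for _ in range(5000):
    x = step(x)
print(x)  # state after the transient, bounded by the tanh nonlinearity
```

Whether a network of this form is birhythmic depends on the specific weights; the sketch only shows the dynamical system class, not a tuned two-limit-cycle instance.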
Trainability, Expressivity and Interpretability in Gated Neural ODEs
Understanding how the dynamics in biological and artificial neural networks
implement the computations required for a task is a salient open question in
machine learning and neuroscience. In particular, computations requiring
complex memory storage and retrieval pose a significant challenge for these
networks to implement or learn. Recently, a family of models described by
neural ordinary differential equations (nODEs) has emerged as powerful
dynamical neural network models capable of capturing complex dynamics. Here, we
extend nODEs by endowing them with adaptive timescales using gating
interactions. We refer to these as gated neural ODEs (gnODEs). Using a task
that requires memory of continuous quantities, we demonstrate the inductive
bias of the gnODEs to learn (approximate) continuous attractors. We further
show how reduced-dimensional gnODEs retain their modeling power while greatly
improving interpretability, even allowing explicit visualization of the
structure of learned attractors. We introduce a novel measure of expressivity
which probes the capacity of a neural network to generate complex trajectories.
Using this measure, we explore how the phase-space dimension of the nODEs and
the complexity of the function modeling the flow field contribute to
expressivity. We see that a more complex function for modeling the flow field
allows a lower-dimensional nODE to capture a given target dynamics. Finally, we
demonstrate the benefit of gating in nODEs on several real-world tasks.
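The adaptive-timescale intuition behind gating can be shown with a toy one-unit example; the form dx/dt = z * (-x + f(x)) and the specific gate values below are assumptions for illustration, not the paper's parameterization:

```python
import numpy as np

# Sketch of the adaptive-timescale idea behind gated nODEs (gnODEs):
# dx/dt = z * (-x + f(x)), so the gate z rescales the effective time
# constant to tau_eff = 1/z. Illustrative one-unit case with f(x) = 0,
# so the state simply leaks toward 0 at rate z: a small gate gives slow
# decay (long memory), a gate near 1 gives fast forgetting.
def integrate(x0, z, dt=0.01, steps=1000):
    x = x0
    for _ in range(steps):
        x += dt * z * (-x)        # gated leak toward 0 (forward Euler)
    return x

slow = integrate(1.0, z=0.05)     # tau_eff = 20: value mostly retained
fast = integrate(1.0, z=1.0)      # tau_eff = 1: value mostly decayed
print(slow, fast)
```

A gate that learns to approach zero along some directions is one way a network can hold continuous quantities, which matches the abstract's observation that gnODEs are biased toward learning approximate continuous attractors.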
NAS-X: Neural Adaptive Smoothing via Twisting
We present Neural Adaptive Smoothing via Twisting (NAS-X), a method for
learning and inference in sequential latent variable models based on reweighted
wake-sleep (RWS). NAS-X works with both discrete and continuous latent
variables, and leverages smoothing SMC to fit a broader range of models than
traditional RWS methods. We test NAS-X on discrete and continuous tasks and
find that it substantially outperforms previous variational and RWS-based
methods in inference and parameter recovery.
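NAS-X builds on reweighted wake-sleep, whose core ingredient is a set of self-normalized importance weights computed from proposal samples. A toy sketch of those weights; the Gaussian model and proposal below are illustrative assumptions, not the paper's experiments:

```python
import numpy as np

# Self-normalized importance weights as used in reweighted wake-sleep
# (RWS), which NAS-X extends with smoothing SMC. Toy setup (illustrative,
# not the paper's models): target p(z, x) with z ~ N(0, 1) and
# x | z ~ N(z, 1); proposal q(z | x) = N(x / 2, 1).
rng = np.random.default_rng(0)
x = 1.0                                   # one observed data point
K = 10000                                 # number of proposal samples

z = rng.normal(loc=x / 2.0, scale=1.0, size=K)  # z_k ~ q(z | x)

log_p = -0.5 * z**2 - 0.5 * (x - z)**2    # log p(z, x) up to a constant
log_q = -0.5 * (z - x / 2.0)**2           # log q(z | x) up to a constant
log_w = log_p - log_q                     # unnormalized log weights

w = np.exp(log_w - log_w.max())           # stabilized exponentiation
w /= w.sum()                              # normalized weights, sum to 1

# RWS reuses these weights for both the model (wake) and proposal
# (sleep) gradient estimates; here we just estimate the posterior mean.
post_mean = np.sum(w * z)
print(post_mean)                          # true posterior mean is x/2 = 0.5
```

In sequential models the single importance-sampling step above is replaced by SMC over the latent trajectory; NAS-X's use of smoothing SMC means the weights condition on future observations as well.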
- …