443 research outputs found
NASA JSC neural network survey results
A survey of Artificial Neural Systems was conducted in support of NASA's (Johnson Space Center) Automatic Perception for Mission Planning and Flight Control Research Program. Several of the world's leading researchers contributed papers containing their most recent results on artificial neural systems. These papers were grouped into categories, and descriptive accounts of the results make up a large part of this report. Also included is material on sources of information on artificial neural systems, such as books, technical reports, and software tools.
On the design of dynamic associative neural memories
We consider the design problem for a class of
discrete-time and continuous-time neural networks. We obtain
a characterization of all connection weights that store a given set
of vectors into the network; that is, each given vector becomes an
equilibrium point of the network. We also give sufficient conditions
that guarantee the asymptotic stability of these equilibrium
points.
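The storage condition described above (each stored vector becomes an equilibrium point of the network) can be illustrated with a minimal sketch. This uses the classical outer-product (Hebb) rule as a stand-in, not the paper's general characterization of admissible connection weights:

```python
import numpy as np

# Hedged sketch, not the paper's design method: store bipolar patterns
# as fixed points of a discrete-time network x(t+1) = sign(W x(t))
# using the classical outer-product (Hebb) rule.
patterns = np.array([[1, -1,  1, -1, 1],
                     [1,  1, -1, -1, 1]])
n = patterns.shape[1]
W = sum(np.outer(p, p) for p in patterns) / n
np.fill_diagonal(W, 0)  # no self-connections

for p in patterns:
    # each stored vector is an equilibrium: one update step leaves it unchanged
    assert np.array_equal(np.sign(W @ p), p)
```

The asymptotic-stability question the abstract raises is the harder part: a pattern being a fixed point does not by itself guarantee that nearby states converge to it.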
A Chaotic Associative Memory
We propose a novel Chaotic Associative Memory model using a network of
chaotic Rossler systems and investigate the storage capacity and retrieval
capabilities of this model as a function of increasing periodicity and chaos.
In early models of associative memory networks, memories were modeled as fixed
points, which may be mathematically convenient but has poor neurobiological
plausibility. Since brain dynamics is inherently oscillatory, attempts have
been made to construct associative memories using nonlinear oscillatory
networks. However, oscillatory associative memories are plagued by the problem
of poor storage capacity, though efforts have been made to improve capacity by
adding higher order oscillatory modes. The chaotic associative memory proposed
here exploits the continuous spectrum of chaotic elements and has higher
storage capacity than previously described oscillatory associative memories. (10 pages, 8 figures; submitted to Chaos: An Interdisciplinary Journal of Nonlinear Science.)
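The chaotic building block named in the abstract, the Rossler system, can be sketched in isolation. The parameter values below are the classic chaotic set, not necessarily those used in the model, and the full memory couples many such oscillators:

```python
import numpy as np

# Hedged sketch: a single Rossler oscillator, integrated with forward Euler.
# dx/dt = -y - z,  dy/dt = x + a*y,  dz/dt = b + z*(x - c)
a, b, c = 0.2, 0.2, 5.7       # classic chaotic parameter set (assumption)
dt, steps = 0.01, 100_000
x, y, z = 1.0, 1.0, 1.0
traj = np.empty((steps, 3))
for i in range(steps):
    dx = -y - z
    dy = x + a * y
    dz = b + z * (x - c)
    x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
    traj[i] = (x, y, z)

# the trajectory oscillates irregularly but stays on a bounded attractor
assert np.all(np.abs(traj) < 100)
```

The continuous spectrum of such chaotic elements, rather than a single oscillation frequency, is what the abstract credits for the improved storage capacity.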
Neural network computation by in vitro transcriptional circuits
The structural similarity of neural networks and genetic regulatory networks
to digital circuits, and hence to each other, was noted from the
very beginning of their study [1, 2]. In this work, we propose a simple
biochemical system whose architecture mimics that of genetic regulation
and whose components allow for in vitro implementation of arbitrary
circuits. We use only two enzymes in addition to DNA and RNA
molecules: RNA polymerase (RNAP) and ribonuclease (RNase). We
develop a rate equation for in vitro transcriptional networks, and derive
a correspondence with general neural network rate equations [3].
As proof-of-principle demonstrations, an associative memory task and a
feedforward network computation are shown by simulation. A difference
between the neural network and biochemical models is also highlighted:
global coupling of rate equations through enzyme saturation can lead
to global feedback regulation, thus allowing a simple network without
explicit mutual inhibition to perform the winner-take-all computation.
Thus, the full complexity of the cell is not necessary for biochemical
computation: a wide range of functional behaviors can be achieved with
a small set of biochemical components
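The winner-take-all effect of global coupling through enzyme saturation can be imitated with a toy rate equation. This is a hypothetical simplification, not the paper's transcriptional rate equations: units self-amplify while drawing on a shared saturating resource, and without any explicit mutual inhibition only the unit with the largest gain survives.

```python
import numpy as np

# Hedged toy model (hypothetical, not the paper's equations):
#   dx_i/dt = u_i * x_i / (1 + S) - x_i,   S = sum_j x_j
# The shared denominator plays the role the abstract assigns to enzyme
# saturation: a global resource couples all units, so the unit with the
# largest gain u_i starves the others out.
u = np.array([1.5, 2.0, 3.0])   # per-unit gains; the last unit should win
x = np.full(3, 0.1)
dt = 0.01
for _ in range(20_000):
    S = x.sum()                  # shared-resource load
    x = x + dt * (u * x / (1.0 + S) - x)

winner = int(np.argmax(x))       # only the largest-gain unit remains active
```

At the fixed point the winner settles near u_max - 1 (where its effective growth rate crosses zero) while the losers decay exponentially, which is the "global feedback regulation" the abstract describes.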
Meta-stable states in the hierarchical Dyson model drive parallel processing in the hierarchical Hopfield network
In this paper we introduce and investigate the statistical mechanics of
hierarchical neural networks: First, we approach these systems à la Mattis,
by thinking of the Dyson model as a single-pattern hierarchical neural network
and we discuss the stability of different retrievable states as predicted by
the related self-consistencies obtained from a mean-field bound and from a
bound that bypasses the mean-field limitation. The latter is worked out by
properly reabsorbing fluctuations of the magnetization related to higher levels
of the hierarchy into effective fields for the lower levels. Remarkably, mixing
Amit's ansatz technique (to select candidate retrievable states) with the
interpolation procedure (to solve for the free energy of these states) we prove
that (due to gauge symmetry) the Dyson model accomplishes both serial and
parallel processing. One step forward, we extend this scenario toward multiple
stored patterns by implementing the Hebb prescription for learning within the
couplings. This results in a Hopfield-like network constrained to a
hierarchical topology, for which, restricting to the low-storage regime (where
the number of patterns grows at most logarithmically with the number of neurons),
we prove the existence of the thermodynamic limit for the free energy and we
give an explicit expression of its mean-field bound and of the related improved
bound.
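For reference, the Hebb prescription takes, in its standard non-hierarchical form (the paper additionally constrains the couplings to a hierarchical topology):

```latex
% Standard Hebb rule for P patterns \xi^{\mu} \in \{-1,+1\}^N
% (generic form; the paper's couplings are hierarchy-constrained)
J_{ij} = \frac{1}{N} \sum_{\mu=1}^{P} \xi_i^{\mu} \xi_j^{\mu},
\qquad P = O(\log N) \quad \text{(low-storage regime)}
```

The logarithmic bound on P is exactly the low-storage restriction under which the abstract proves the existence of the thermodynamic limit for the free energy.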
Why are probabilistic laws governing quantum mechanics and neurobiology?
We address the question: Why are the dynamical laws governing quantum mechanics and neuroscience probabilistic in nature rather than deterministic? We discuss some ideas showing that the probabilistic option offers advantages over the deterministic one. (40 pages, 8 figures.)
Geometry and Topology in Memory and Navigation
Okinawa Institute of Science and Technology Graduate University, Doctor of Philosophy. Geometry and topology offer rich mathematical worlds and perspectives with which to study and improve our understanding of cognitive function. Here I present the following examples: (1) a functional role for inhibitory diversity in associative memories with graphical relationships; (2) improved memory capacity in an associative memory model with setwise connectivity, with implications for glial and dendritic function; (3) safe and efficient group navigation among conspecifics using purely local geometric information; and (4) enhanced geometric and topological methods to probe the relations between neural activity and behaviour. In each work, tools and insights from geometry and topology are used in essential ways to gain improved insights or performance. This thesis contributes to our knowledge of the potential computational affordances of biological mechanisms (such as inhibition and setwise connectivity), while also demonstrating new geometric and topological methods and perspectives with which to deepen our understanding of cognitive tasks and their neural representations. (Doctoral thesis.)