Stochastic Synapses Enable Efficient Brain-Inspired Learning Machines
Recent studies have shown that synaptic unreliability is a robust and
sufficient mechanism for inducing the stochasticity observed in cortex. Here,
we introduce Synaptic Sampling Machines, a class of neural network models that
uses synaptic stochasticity as a means of Monte Carlo sampling and unsupervised
learning. Similar to the original formulation of Boltzmann machines, these
models can be viewed as a stochastic counterpart of Hopfield networks, but
where stochasticity is induced by a random mask over the connections. Synaptic
stochasticity plays the dual role of an efficient mechanism for sampling, and a
regularizer during learning akin to DropConnect. A local synaptic plasticity
rule implementing an event-driven form of contrastive divergence enables the
learning of generative models in an on-line fashion. Synaptic sampling machines
perform equally well using discrete-time artificial units (as in Hopfield
networks) or continuous-time leaky integrate-and-fire neurons. The learned
representations are remarkably sparse and robust to reductions in bit precision
and synapse pruning: removal of more than 75% of the weakest connections
followed by cursory re-learning causes a negligible performance loss on
benchmark classification tasks. The spiking neuron-based synaptic sampling
machines outperform existing spike-based unsupervised learners, while
potentially offering substantial advantages in terms of power and complexity,
and are thus promising models for on-line learning in brain-inspired hardware.
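The central mechanism of this abstract, a random mask over the connections that serves both as a Monte Carlo sampler and as a DropConnect-style regularizer during contrastive-divergence learning, can be sketched roughly as follows. This is a toy illustration, not the paper's implementation: the network sizes, keep probability, and plain CD-1 update are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical): 6 visible units, 4 hidden units.
n_vis, n_hid = 6, 4
W = rng.normal(scale=0.1, size=(n_vis, n_hid))
p_keep = 0.5  # probability that a synapse transmits (its "reliability")

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_hidden(v):
    # DropConnect-style blank-out mask: each synapse transmits independently.
    mask = rng.random(W.shape) < p_keep
    p_h = sigmoid(v @ (W * mask) / p_keep)  # rescale to preserve the mean input
    return (rng.random(n_hid) < p_h).astype(float), p_h

def sample_visible(h):
    mask = rng.random(W.shape) < p_keep
    p_v = sigmoid((W * mask) @ h / p_keep)
    return (rng.random(n_vis) < p_v).astype(float), p_v

def cd1_step(v0, lr=0.05):
    # One CD-1 update; the synaptic noise doubles as the sampling mechanism.
    global W
    h0, ph0 = sample_hidden(v0)
    v1, _ = sample_visible(h0)
    h1, ph1 = sample_hidden(v1)
    W += lr * (np.outer(v0, ph0) - np.outer(v1, ph1))

v = (rng.random(n_vis) < 0.5).astype(float)
for _ in range(100):
    cd1_step(v)
```

Note that the same random masking appears twice: as the stochastic transmission that drives sampling, and as the regularizer that keeps the learned weights sparse and robust.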
Trajectory prediction of moving objects by means of neural networks
Thesis (Master)--Izmir Institute of Technology, Computer Engineering, Izmir, 1997. Includes bibliographical references (leaves: 103-105). Text in English; abstract in Turkish and English. viii, 105 leaves.
Estimating the three-dimensional motion of an object from a sequence of object positions and orientations is of significant importance in a variety of applications in control and robotics. For instance, autonomous navigation, manipulation, servoing, tracking, planning and surveillance need prediction of motion parameters. Although "motion estimation" is an old problem (the formulations date back to the beginning of the century), only recently have scientists been provided with tools from nonlinear system estimation theory to solve this problem. Neural networks are among the tools that have recently been used in many nonlinear dynamic system parameter estimation contexts. The approximating ability of the neural network is used to identify the relation between system variables and parameters of a dynamic system. The position, velocity and acceleration of the object are estimated by several neural networks using the II most recent measurements of the object coordinates as input to the system. Several neural network topologies with different configurations are introduced and utilized in the solution of the problem. Training schemes for each configuration are given in detail. Simulation results for prediction of motion having different characteristics via different architectures with alternative configurations are presented comparatively.
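The sliding-window prediction setup the abstract describes, estimating the next position from the most recent measurements, can be illustrated with a toy example. A least-squares linear predictor stands in here for the thesis's neural networks, and the trajectory, window length, and noise level are all made up for the sketch.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 1-D trajectory under constant acceleration (hypothetical data).
t = np.arange(200) * 0.05
pos = 2.0 + 1.5 * t + 0.3 * t**2 + rng.normal(scale=0.01, size=t.size)

k = 4  # number of most recent measurements fed to the predictor
X = np.stack([pos[i:i + k] for i in range(len(pos) - k)])
y = pos[k:]  # next position to predict

# Fit a linear predictor (with bias) by least squares; the thesis trains
# neural networks on the same kind of window-to-next-position mapping.
A = np.hstack([X, np.ones((len(X), 1))])
w, *_ = np.linalg.lstsq(A, y, rcond=None)
rmse = np.sqrt(np.mean((A @ w - y) ** 2))
```

A constant-acceleration sequence satisfies an exact linear recurrence over three past samples, so even this linear stand-in predicts the next position down to the noise floor; the nonlinear networks in the thesis target motions where no such linear recurrence exists.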
A Deterministic and Generalized Framework for Unsupervised Learning with Restricted Boltzmann Machines
Restricted Boltzmann machines (RBMs) are energy-based neural networks which
are commonly used as the building blocks for deep neural architectures. In
this work, we derive a deterministic framework for the
training, evaluation, and use of RBMs based upon the Thouless-Anderson-Palmer
(TAP) mean-field approximation of widely-connected systems with weak
interactions coming from spin-glass theory. While the TAP approach has been
extensively studied for fully-visible binary spin systems, our construction is
generalized to latent-variable models, as well as to arbitrarily distributed
real-valued spin systems with bounded support. In our numerical experiments, we
demonstrate the effective deterministic training of our proposed models and are
able to show interesting features of unsupervised learning which could not be
directly observed with sampling. Additionally, we demonstrate how to utilize
our TAP-based framework for leveraging trained RBMs as joint priors in
denoising problems.
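The deterministic evaluation the abstract describes rests on solving TAP self-consistency equations rather than Gibbs sampling. Below is a rough sketch of the second-order (TAP) fixed-point iteration for a binary {0,1} RBM; the sizes, weights, and damping are invented, and this shows only the magnetization step, not the full training procedure of the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

n_vis, n_hid = 8, 5
W = rng.normal(scale=0.3, size=(n_vis, n_hid))
a = np.zeros(n_vis)  # visible biases
b = np.zeros(n_hid)  # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tap_magnetizations(n_iter=200, damping=0.5):
    # Damped fixed-point iteration of the second-order (TAP) mean-field
    # equations for a binary {0,1} RBM: a deterministic stand-in for sampling.
    mv = np.full(n_vis, 0.5)
    mh = np.full(n_hid, 0.5)
    W2 = W**2
    for _ in range(n_iter):
        # Naive mean field plus the Onsager reaction (second-order) term.
        mh_new = sigmoid(b + mv @ W - (mh - 0.5) * ((mv - mv**2) @ W2))
        mh = damping * mh + (1 - damping) * mh_new
        mv_new = sigmoid(a + W @ mh - (mv - 0.5) * (W2 @ (mh - mh**2)))
        mv = damping * mv + (1 - damping) * mv_new
    return mv, mh

mv, mh = tap_magnetizations()
```

The resulting magnetizations are deterministic functions of the weights, which is what makes training trajectories and free-energy estimates reproducible in a way Monte Carlo sampling is not.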
Nonlinear dynamics of pattern recognition and optimization
We associate learning in living systems with the shaping of the velocity vector field of a dynamical system in response to external, generally random, stimuli. We consider various approaches to implementing a system that is able to adapt the whole vector field, rather than just parts of it, which is a drawback of the most common current learning systems: artificial neural networks.
This leads us to propose the mathematical concept of self-shaping dynamical systems. To begin, there is an empty phase space with no attractors, and thus a zero velocity vector field. Upon receiving the random stimulus, the vector field deforms and eventually becomes smooth and deterministic, despite the random nature of the applied force, while the phase space develops various geometrical objects. We consider the simplest of these - gradient self-shaping systems, whose vector field is the gradient of some energy function, which under certain conditions develops into the multi-dimensional probability density distribution of the input.
We explain how self-shaping systems are relevant to artificial neural networks. Firstly, we show that they can potentially perform pattern recognition tasks typically implemented by Hopfield neural networks, but without any supervision, on-line, and without developing spurious minima in the phase space. Secondly, they can reconstruct the probability density distribution of input signals, like probabilistic neural networks, but without new training patterns having to enter the network as new hardware units. We therefore regard self-shaping systems as a generalisation of the neural network concept, achieved by abandoning the "rigid units - flexible couplings" paradigm and making the vector field fully flexible and amenable to external force. It is not clear how such systems could be implemented in hardware, and so this new concept presents an engineering challenge. It could also become an alternative paradigm for the modelling of both living and learning systems.
Mathematically, it is interesting to ask how a self-shaping system could develop non-trivial objects in the phase space, such as periodic orbits or chaotic attractors. We investigate how a delayed vector field could form such objects. We show that this method produces chaos in a class of systems which have very simple dynamics in the non-delayed case. We also demonstrate the coexistence of bounded and unbounded solutions depending on the initial conditions and the value of the delay. Finally, we speculate about how such a method could be used in global optimization.
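The claim that a delayed vector field can create chaos in otherwise simple dynamics can be illustrated with the classic Mackey-Glass delay equation, a standard textbook example rather than the specific class of systems studied here: without the delay the system decays to a fixed point, while with a large enough delay it oscillates irregularly.

```python
import numpy as np

# Euler integration of the Mackey-Glass delay equation (standard example,
# not the thesis's system): dx/dt = beta*x(t-tau)/(1+x(t-tau)^n) - gamma*x(t).
beta, gamma, n, tau, dt = 0.2, 0.1, 10, 17.0, 0.1
steps = 20000
delay = int(tau / dt)

x = np.empty(steps + delay)
x[:delay + 1] = 1.2  # constant history on [-tau, 0]

for i in range(delay, steps + delay - 1):
    x_tau = x[i - delay]  # the delayed term that reshapes the vector field
    x[i + 1] = x[i] + dt * (beta * x_tau / (1 + x_tau**n) - gamma * x[i])
```

With tau = 17 the trajectory settles onto a bounded, persistently oscillating (chaotic) attractor; setting delay = 0 reduces the dynamics to a one-dimensional flow that can only converge to an equilibrium.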