
    Attractor Dynamics in Feedforward Neural Networks

    We study probabilistic generative models parameterized by feedforward neural networks. An attractor dynamics for probabilistic inference in these models is derived from a mean field approximation for large, layered sigmoidal networks. Fixed points of the dynamics correspond to solutions of the mean field equations, which relate the statistics of each unit to those of its Markov blanket. We establish global convergence of the dynamics by providing a Lyapunov function and show that the dynamics generate the signals required for unsupervised learning. Our results for feedforward networks provide a counterpart to those of Cohen-Grossberg and Hopfield for symmetric networks. 1 Introduction Attractor neural networks lend a computational purpose to continuous dynamical systems. Celebrated uses of these networks include the storage of associative memories (Amit, 1989), the reconstruction of noisy images (Koch et al., 1986), and the search for shortest paths in the traveling salesman problem.
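
    The damped relaxation behind such an attractor dynamics can be sketched in a few lines. The snippet below is an illustrative NumPy sketch with random weights and no clamped evidence, not the paper's exact equations; in particular, each unit is driven only by its parents, and the feedback terms from a unit's children that appear in the full mean field equations are omitted:

```python
import numpy as np

rng = np.random.default_rng(0)

# Small layered sigmoid network: layer 0 is the top of the generative model.
sizes = [4, 6, 8]
W = [rng.normal(scale=0.5, size=(sizes[l + 1], sizes[l]))
     for l in range(len(sizes) - 1)]            # top-down weights (illustrative)
b = [np.zeros(n) for n in sizes]                # biases

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mean_field_relax(W, b, n_iters=200, eta=0.2):
    """Damped fixed-point iteration on the layer means mu[l].
    Each unit is driven only by its parents here; the paper's full
    mean field equations also include feedback from children (omitted)."""
    mu = [np.full(n, 0.5) for n in sizes]
    for _ in range(n_iters):
        mu[0] = (1 - eta) * mu[0] + eta * sigmoid(b[0])
        for l in range(1, len(sizes)):
            mu[l] = (1 - eta) * mu[l] + eta * sigmoid(W[l - 1] @ mu[l - 1] + b[l])
    return mu

mu = mean_field_relax(W, b)
# At a fixed point each mean satisfies mu[l] = sigmoid(W[l-1] @ mu[l-1] + b[l]).
residual = max(np.abs(mu[l] - sigmoid(W[l - 1] @ mu[l - 1] + b[l])).max()
               for l in range(1, len(sizes)))
print(residual)  # near zero: the iteration has reached a mean-field fixed point
```

    With damping factor eta < 1 the update is a contraction toward the layer-wise fixed point, which is the convergence behaviour the Lyapunov-function argument guarantees in the full model.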

    Forecasting high waters at Venice Lagoon using chaotic time series analysis and nonlinear neural networks

    Time series analysis using nonlinear dynamical systems theory and multilayer neural network models has been applied to the sequence of water-level data recorded every hour at 'Punta della Salute' in the Venice Lagoon during the years 1980-1994. The first method is based on the reconstruction of the state-space attractor using time-delay embedding vectors and on the characterisation of the invariant properties which define its dynamics. The results suggest the existence of a low-dimensional chaotic attractor with a Lyapunov dimension, DL, of around 6.6 and a predictability of between 8 and 13 hours ahead. Furthermore, once the attractor has been reconstructed, it is possible to make predictions by mapping local neighbourhood to local neighbourhood in the reconstructed phase space. To compare the prediction results with another nonlinear method, two nonlinear autoregressive (NAR) models based on multilayer feedforward neural networks have been developed. From the study, it can be observed that nonlinear forecasting produces adequate results for the 'normal' dynamic behaviour of the water level of the Venice Lagoon, outperforming linear algorithms; however, both methods fail to forecast the 'high water' phenomenon more than 2-3 hours ahead.
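
    The first method's two ingredients, time-delay embedding and local-neighbourhood prediction, can be illustrated on a synthetic chaotic series. The sketch below uses a logistic map as a stand-in for the water-level record, and the embedding parameters and neighbour count are illustrative choices, not those fitted to the Venice data:

```python
import numpy as np

# Synthetic chaotic series (logistic map, r = 3.9) standing in for the
# hourly water-level record, which is not reproduced here.
def logistic_series(n, r=3.9, x0=0.3):
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1.0 - x[i - 1])
    return x

def delay_embed(x, dim, tau):
    """Rows are delay vectors (x_t, x_{t+tau}, ..., x_{t+(dim-1)tau})."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[j * tau : j * tau + n] for j in range(dim)])

x = logistic_series(3000)
dim, tau, k = 3, 1, 5                      # embedding dimension, delay, neighbours
V = delay_embed(x, dim, tau)
last = (dim - 1) * tau                     # offset of each vector's final coordinate
y = x[last + 1 : last + 1 + len(V) - 1]    # one-step-ahead target for each vector
V = V[: len(y)]

split = 2500                               # library / test split
V_lib, y_lib = V[:split], y[:split]
errs = []
for q, truth in zip(V[split:], y[split:]):
    d = np.linalg.norm(V_lib - q, axis=1)        # distances to library vectors
    nn = np.argsort(d)[:k]                       # k nearest neighbours
    errs.append((y_lib[nn].mean() - truth) ** 2) # local-neighbourhood forecast
mse = float(np.mean(errs))
print(mse)  # far below the variance of the series
```

    Predicting further ahead amounts to iterating this one-step map, which is where the chaotic divergence of nearby trajectories limits the forecast horizon, as the abstract reports for the 'high water' events.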

    Neural implementation of psychological spaces

    Psychological spaces provide a natural framework for the construction of mental representations. A neural model of psychological spaces provides a link between neuroscience and psychology. Categorization performed in high-dimensional spaces by dynamical associative-memory models is approximated with low-dimensional feedforward neural models that calculate probability density functions in psychological spaces. Applications to human categorization experiments are discussed.
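
    As an illustrative sketch of categorization via probability densities in a low-dimensional space: the snippet below uses synthetic two-category data and a Gaussian kernel density estimate in place of the paper's feedforward density models, so the data, bandwidth, and category layout are all assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two categories as point clouds in a 2-D psychological space
# (coordinates would in practice come from, e.g., similarity judgements).
cat_a = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(50, 2))
cat_b = rng.normal(loc=[2.0, 2.0], scale=0.5, size=(50, 2))

def kde(points, x, bandwidth=0.5):
    """Gaussian kernel density estimate of one category at point x."""
    d2 = ((points - x) ** 2).sum(axis=1)
    return np.exp(-d2 / (2.0 * bandwidth ** 2)).mean()

def categorize(x):
    """Assign x to whichever category has the higher density there."""
    return 'A' if kde(cat_a, x) > kde(cat_b, x) else 'B'

print(categorize(np.array([0.2, -0.1])))  # 'A'
print(categorize(np.array([1.9, 2.3])))   # 'B'
```

    The decision rule is purely feedforward, evaluating a density at a point in the low-dimensional space rather than running a high-dimensional attractor network to a fixed point, which mirrors the approximation the abstract describes.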

    Six networks on a universal neuromorphic computing substrate

    In this study, we present a highly configurable neuromorphic computing substrate and use it for emulating several types of neural networks. At the heart of this system lies a mixed-signal chip, with analog implementations of neurons and synapses and digital transmission of action potentials. Major advantages of this emulation device, which has been explicitly designed as a universal neural network emulator, are its inherent parallelism and high acceleration factor compared to conventional computers. Its configurability allows the realization of almost arbitrary network topologies and the use of widely varied neuronal and synaptic parameters. Fixed-pattern noise inherent to analog circuitry is reduced by calibration routines. An integrated development environment allows neuroscientists to operate the device without any prior knowledge of neuromorphic circuit design. As a showcase for the capabilities of the system, we describe the successful emulation of six different neural networks which cover a broad spectrum of both structure and functionality.