
    Unipolar terminal-attractor-based neural associative memory with adaptive threshold and perfect convergence

    A perfectly convergent unipolar neural associative-memory system based on nonlinear dynamical terminal attractors is presented. With adaptive setting of the threshold values for the dynamic iteration of the unipolar binary neuron states with terminal attractors, perfect convergence is achieved. This achievement and correct retrieval are demonstrated by computer simulation. The simulations are carried out (1) by exhaustive tests with all possible combinations of stored and test vectors in small-scale networks, and (2) by Monte Carlo simulations with randomly generated stored and test vectors in large-scale networks with an M/N ratio of 4 (M is the number of stored vectors and N < 256 is the number of neurons). An experiment with exclusive-OR logic operations using liquid-crystal-television spatial light modulators shows the feasibility of an optoelectronic implementation of the model. The behavior of terminal attractors in basins of energy space is illustrated by examples.
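    To make the retrieval iteration described in this abstract concrete, the following minimal Python example implements a plain unipolar (0/1) associative memory: Hebbian outer-product storage and a threshold that adapts, for illustration, to the mean net input at each step. It is a sketch under those assumptions, not the paper's terminal-attractor dynamics or its exact threshold rule; the functions store and retrieve and all parameter values are made up for the example.

```python
# Sketch (assumed, not the paper's method): unipolar 0/1 associative memory
# with Hebbian storage and a threshold set adaptively to the mean net input.
import numpy as np

def store(patterns):
    """Hebbian outer-product weights for unipolar (0/1) patterns."""
    X = 2.0 * np.asarray(patterns, dtype=float) - 1.0   # map 0/1 to -1/+1 for storage
    W = X.T @ X
    np.fill_diagonal(W, 0.0)
    return W

def retrieve(W, probe, steps=20):
    """Iterate unipolar states; the threshold adapts to the current mean drive."""
    v = np.asarray(probe, dtype=float).copy()
    for _ in range(steps):
        h = W @ (2.0 * v - 1.0)            # net input from the current state
        theta = h.mean()                   # adaptive threshold (illustrative rule)
        v_new = (h > theta).astype(float)
        if np.array_equal(v_new, v):       # fixed point reached (an attractor)
            break
        v = v_new
    return v

rng = np.random.default_rng(0)
stored = rng.integers(0, 2, size=(4, 64))       # M = 4 stored vectors, N = 64 neurons
W = store(stored)
noisy = stored[0].copy()
flip = rng.choice(64, size=6, replace=False)    # corrupt 6 bits of the first pattern
noisy[flip] = 1 - noisy[flip]
out = retrieve(W, noisy)
print("bits matching stored pattern 0:", int((out == stored[0]).sum()), "/ 64")
```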

    NASA JSC neural network survey results

    A survey of artificial neural systems was conducted in support of NASA Johnson Space Center's Automatic Perception for Mission Planning and Flight Control Research Program. Several of the world's leading researchers contributed papers containing their most recent results on artificial neural systems. These papers were grouped into categories, and descriptive accounts of the results make up a large part of this report. Also included is material on sources of information about artificial neural systems, such as books, technical reports, and software tools.

    Real time unsupervised learning of visual stimuli in neuromorphic VLSI systems

    Neuromorphic chips embody, in microelectronic devices, computational principles operating in the nervous system. In this domain it is important to identify computational primitives that theory and experiments suggest as generic and reusable cognitive elements. One such element is provided by attractor dynamics in recurrent networks. Point attractors are equilibrium states of the dynamics (up to fluctuations), determined by the synaptic structure of the network; a `basin' of attraction comprises all initial states leading to a given attractor upon relaxation, which makes attractor dynamics suitable for implementing robust associative memory. The initial network state is dictated by the stimulus, and relaxation to the attractor state implements the retrieval of the corresponding memorized prototypical pattern. In a previous work we demonstrated that a neuromorphic recurrent network of spiking neurons and suitably chosen, fixed synapses supports attractor dynamics. Here we focus on learning: activating on-chip synaptic plasticity and using a theory-driven strategy for choosing network parameters, we show that autonomous learning, following repeated presentation of simple visual stimuli, shapes a synaptic connectivity supporting stimulus-selective attractors. Associative memory develops on chip as the result of the coupled stimulus-driven neural activity and ensuing synaptic dynamics, with no artificial separation between learning and retrieval phases. Comment: submitted to Scientific Reports
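    As a purely software illustration of the learning scenario above, the sketch below repeatedly presents sparse binary stimuli to a recurrent network of binary synapses, applies a stochastic potentiate/depress rule driven by pre- and postsynaptic coactivity, and then checks that relaxation from a degraded stimulus is stimulus-selective. The spiking and chip-level details are omitted, and the coding level, plasticity probabilities, and activity-scaled threshold are assumptions, not the authors' on-chip parameters.

```python
# Sketch (assumed rule and parameters): stimulus-driven plasticity of binary
# synapses followed by attractor-style retrieval of a degraded stimulus.
import numpy as np

rng = np.random.default_rng(1)
N, P, f = 100, 3, 0.2                               # neurons, stimuli, coding level
stimuli = (rng.random((P, N)) < f).astype(float)    # sparse binary "visual stimuli"
J = np.zeros((N, N))                                # binary synapses (0/1)

# Learning phase: stochastic Hebbian transitions driven by pre/post coactivity.
for _ in range(30):                                 # repeated stimulus presentations
    s = stimuli[rng.integers(P)]
    pre, post = np.meshgrid(s, s)                   # pre[i, j] = s[j], post[i, j] = s[i]
    potentiate = pre * post * (rng.random((N, N)) < 0.10)
    depress = post * (1 - pre) * (rng.random((N, N)) < 0.05)
    J = np.clip(J + potentiate - depress, 0, 1)
np.fill_diagonal(J, 0)

# Retrieval phase: relax from a degraded stimulus and check selectivity.
def relax(J, v, steps=15):
    for _ in range(steps):
        v_new = (J @ v > 0.4 * v.sum()).astype(float)   # activity-scaled threshold (assumed)
        if np.array_equal(v_new, v):
            break
        v = v_new
    return v

cue = stimuli[0].copy()
drop = rng.choice(np.flatnonzero(cue), size=8, replace=False)
cue[drop] = 0.0                                     # degrade the stimulus
out = relax(J, cue)
print("bits matching stimulus 0:", int((out == stimuli[0]).sum()), "/", N)
```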

    Parallel processing in immune networks

    In this work we adopt a statistical mechanics approach to investigate basic, systemic features exhibited by adaptive immune systems. The lymphocyte network made by B-cells and T-cells is modeled by a bipartite spin-glass where, following biological prescriptions, links connecting B-cells and T-cells are sparse. Interestingly, the dilution performed on links is shown to make the system able to orchestrate parallel strategies to fight several pathogens at the same time; this multitasking capability constitutes a remarkable, key property of immune systems, as multiple antigens are always present within the host. We also define the stochastic process ruling the temporal evolution of lymphocyte activity, and show its relaxation toward an equilibrium measure allowing statistical mechanics investigations. Analytical results are compared with Monte Carlo simulations and signal-to-noise outcomes, showing overall excellent agreement. Finally, within our model, a rationale for the experimentally well-evidenced correlation between lymphocytosis and autoimmunity is achieved; this sheds further light on the systemic features exhibited by immune networks. Comment: 21 pages, 9 figures; to appear in Phys. Rev.
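    The parallel-retrieval effect of dilution can be illustrated with a small Monte Carlo sketch: sparse patterns induce an effective pairwise coupling among T-cell spins, and Glauber (heat-bath) dynamics then sustains overlaps with several patterns at once. Network size, dilution level, and inverse temperature below are illustrative choices, not the paper's values, and the B-cell degrees of freedom are folded into the effective couplings rather than simulated explicitly.

```python
# Sketch (illustrative parameters): diluted patterns, effective T-T couplings,
# and Glauber dynamics showing simultaneous overlaps with several patterns.
import numpy as np

rng = np.random.default_rng(2)
N, P, d, beta = 200, 4, 0.85, 20.0               # spins, patterns, dilution, inverse temperature

# Sparse patterns: each entry is 0 with probability d, otherwise +/-1.
xi = rng.choice([-1, 0, 1], size=(P, N), p=[(1 - d) / 2, d, (1 - d) / 2])
J = (xi.T @ xi) / N                              # effective couplings among T-cell spins
np.fill_diagonal(J, 0.0)

sigma = rng.choice([-1.0, 1.0], size=N)          # T-cell spins
for _ in range(200 * N):                         # single-spin Glauber (heat-bath) updates
    i = rng.integers(N)
    h = J[i] @ sigma                             # local field on spin i
    p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * h))
    sigma[i] = 1.0 if rng.random() < p_up else -1.0

# Mattis overlaps with each pattern: several can be nonzero at the same time.
m = xi @ sigma / N
print("pattern overlaps:", np.round(m, 2))
```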

    Intelligent search for distributed information sources using heterogeneous neural networks

    As the number and diversity of distributed information sources on the Internet increase exponentially, various search services have been developed to help users locate relevant information. However, these services still have drawbacks, such as the difficulty of mathematically modeling the retrieval process, the lack of adaptivity, and the indiscriminate nature of search. This paper shows how heterogeneous neural networks can be used in the design of an intelligent distributed information retrieval (DIR) system. In particular, three typical neural network models - Kohonen's SOFM network, the Hopfield network, and a feedforward network with the backpropagation algorithm - are introduced to overcome the above drawbacks in current DIR research by exploiting their unique properties. This preliminary investigation suggests that neural networks are useful tools for intelligent search for distributed information sources.
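    As a hedged illustration of one of the three models mentioned, the sketch below trains a tiny Kohonen SOFM on made-up term-frequency profiles of information sources and routes a query to its best-matching map unit. The data, map size, and learning schedule are assumptions for the example, not the system described in the paper.

```python
# Sketch (made-up data and schedule): a small SOFM clusters sources by term
# profile; a query is routed to the map unit whose weight vector is closest.
import numpy as np

rng = np.random.default_rng(3)
n_sources, n_terms, grid = 40, 12, (4, 4)
sources = rng.random((n_sources, n_terms))          # term profiles of distributed sources

W = rng.random(grid + (n_terms,))                   # 4x4 map of weight vectors in term space
coords = np.indices(grid).reshape(2, -1).T          # (row, col) of each map unit

for t in range(500):                                # SOFM training loop
    x = sources[rng.integers(n_sources)]
    d = np.linalg.norm(W - x, axis=-1)              # distance of every unit to x
    bmu = np.unravel_index(np.argmin(d), grid)      # best-matching unit
    lr = 0.5 * (1 - t / 500)                        # decaying learning rate
    radius = 2.0 * (1 - t / 500) + 0.5              # decaying neighborhood radius
    dist2 = ((coords - np.array(bmu)) ** 2).sum(axis=1).reshape(grid)
    h = np.exp(-dist2 / (2 * radius ** 2))          # neighborhood function
    W += lr * h[..., None] * (x - W)                # pull neighbors toward x

query = rng.random(n_terms)                         # a user query in term space
unit = np.unravel_index(np.argmin(np.linalg.norm(W - query, axis=-1)), grid)
print("route query to map unit:", unit)
```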

    Anticipatory Semantic Processes

    Why do anticipatory processes correspond to cognitive abilities of living systems? To be adapted to an environment, behaviors need at least i) internal representations of events occurring in the external environment; and ii) internal anticipations of possible events to occur in the external environment. Interactions of these two opposite but complementary cognitive properties lead to various patterns of experimental data on semantic processing. How can dynamic semantic processes be investigated? Experimental studies in cognitive psychology offer several advantages, such as: i) control of the semantic environment, such as words embedded in sentences; ii) methodological tools allowing the observation of anticipations and adapted oculomotor behavior during reading; and iii) the analysis of different anticipatory processes within the theoretical framework of semantic processing. What are the different types of semantic anticipations? Experimental data show that semantic anticipatory processes involve i) the coding in memory of sequences of words occurring in textual environments; ii) the anticipation of possible future words from currently perceived words; and iii) the selection of anticipated words as a function of the sequences of perceived words, achieved by anticipatory activations and inhibitory selection processes. How can anticipatory semantic processes be modeled? Localist or distributed neural network models can account for some types of semantic processes, anticipatory or not. Attractor neural networks coding temporal sequences are presented as good candidates for modeling anticipatory semantic processes, according to specific properties of the human brain such as i) auto-associative memory; ii) learning and memorization of sequences of patterns; and iii) anticipation of memorized patterns from previously perceived patterns.
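    The sequence-coding attractor idea in the last paragraph can be illustrated with a minimal sketch: adding an asymmetric Hebbian term that maps each stored pattern onto the next makes the network drift from the currently retrieved pattern toward the anticipated next one. The mixing coefficient lam and the synchronous deterministic update are assumptions chosen for clarity, not a model taken from the paper.

```python
# Sketch (assumed mixing weight and update rule): symmetric Hebbian memory plus
# an asymmetric term that maps each stored pattern onto the next in the sequence.
import numpy as np

rng = np.random.default_rng(4)
N, P, lam = 120, 5, 1.5
xi = rng.choice([-1.0, 1.0], size=(P, N))        # a stored sequence of patterns ("words")

W_sym = xi.T @ xi / N                            # auto-associative (memory) term
W_seq = xi[1:].T @ xi[:-1] / N                   # transition term: pattern mu -> mu + 1
W = W_sym + lam * W_seq
np.fill_diagonal(W, 0.0)

s = xi[0].copy()                                 # start from the first pattern of the sequence
for step in range(6):
    s = np.where(W @ s >= 0, 1.0, -1.0)          # synchronous deterministic update
    overlaps = np.round(xi @ s / N, 2)           # overlap with each stored pattern
    print("step", step + 1, "overlaps:", overlaps)
```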