1,399 research outputs found

    Neural Networks retrieving Boolean patterns in a sea of Gaussian ones

    Restricted Boltzmann Machines are key tools in Machine Learning and are described by the energy function of bipartite spin glasses. From a statistical-mechanics perspective, they share the same Gibbs measure as Hopfield networks for associative memory; in this equivalence, the weights of the former play the role of the patterns of the latter. Since Boltzmann machines usually require real-valued weights in order to be trained with gradient-descent-like methods, while Hopfield networks typically store binary patterns in order to retrieve them, the investigation of a mixed Hebbian network, equipped with both real (e.g., Gaussian) and discrete (e.g., Boolean) patterns, arises naturally. We prove that, in the challenging regime of a high storage of real patterns, where retrieval is forbidden, an extra load of Boolean patterns can still be retrieved, as long as the ratio between the overall load and the network size does not exceed a critical threshold, which turns out to be the same as in the standard Amit-Gutfreund-Sompolinsky theory. Assuming replica symmetry, we study the case of a low load of Boolean patterns by combining stochastic-stability and Hamilton-Jacobi interpolation techniques. The result can be extended to the high-load case by a non-rigorous but standard replica computation. Comment: 16 pages, 1 figure
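
    As a rough illustration of the retrieval setting described above, the sketch below builds a Hebbian coupling matrix from both Boolean and Gaussian patterns and checks whether a corrupted Boolean pattern is recovered under zero-temperature dynamics. This is a minimal numerical sketch, not the paper's analysis; the network size and the two loads are illustrative assumptions.

        # Minimal sketch (assumed parameters): Hebbian network storing both
        # Boolean and Gaussian patterns; test retrieval of a Boolean pattern.
        import numpy as np

        rng = np.random.default_rng(0)
        N = 500                    # neurons (assumed)
        K_bool, K_gauss = 5, 25    # Boolean and Gaussian loads (assumed)

        xi = rng.choice([-1, 1], size=(K_bool, N))   # Boolean (Ising) patterns
        phi = rng.standard_normal((K_gauss, N))      # Gaussian patterns

        # Hebbian couplings built from both pattern families, no self-coupling
        J = (xi.T @ xi + phi.T @ phi) / N
        np.fill_diagonal(J, 0.0)

        # Start from a corrupted copy of the first Boolean pattern
        s = xi[0] * rng.choice([1, -1], p=[0.9, 0.1], size=N)

        # Zero-temperature synchronous dynamics
        for _ in range(20):
            s = np.sign(J @ s)
            s[s == 0] = 1

        # Mattis overlap with the stored Boolean pattern (1 = perfect retrieval)
        print("overlap:", np.dot(s, xi[0]) / N)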

    A modular T-mode design approach for analog neural network hardware implementations

    A modular transconductance-mode (T-mode) design approach is presented for analog hardware implementations of neural networks. This design approach is used to build a modular bidirectional associative memory network. The authors show that the size of the whole system can be increased by interconnecting additional modular chips. It is also shown that, by changing the interconnection strategy, different neural network systems can be implemented, such as a Hopfield network, a winner-take-all network, a simplified ART1 network, or a constrained-optimization network. Experimentally measured results from 2-μm double-metal, double-polysilicon CMOS prototypes (MOSIS) are presented.
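
    A crude software analogue of the point that the interconnection strategy selects the network type: the sketch below runs the same first-order node dynamics with two different coupling matrices, one winner-take-all-like (self-excitation plus mutual inhibition) and one Hopfield-like (symmetric Hebbian couplings). It is not a model of the chip; all parameters are illustrative assumptions.

        # Software analogue only (assumed parameters): the same node dynamics
        # driven through two different interconnection matrices.
        import numpy as np

        def simulate(W, inputs, steps=400, dt=0.05):
            """Euler-integrate dx/dt = -x + W @ g(x) + inputs, g = clipped linear."""
            x = inputs.copy()
            for _ in range(steps):
                g = np.clip(x, 0.0, 1.0)
                x = x + dt * (-x + W @ g + inputs)
            return np.clip(x, 0.0, 1.0)

        n = 4
        inputs = np.array([0.3, 0.9, 0.5, 0.7])

        # Winner-take-all-style interconnect: self-excitation, mutual inhibition
        W_wta = 1.2 * np.eye(n) - 1.5 * (np.ones((n, n)) - np.eye(n))
        print("WTA-like interconnect:", np.round(simulate(W_wta, inputs), 2))

        # Hopfield-style interconnect: symmetric Hebbian couplings, one pattern
        p = np.array([1.0, -1.0, 1.0, -1.0])
        W_hop = np.outer(p, p) / n
        np.fill_diagonal(W_hop, 0.0)
        print("Hopfield-like interconnect:", np.round(simulate(W_hop, inputs), 2))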

    An analog feedback associative memory

    A method for the storage of analog vectors, i.e., vectors whose components are real-valued, is developed for the Hopfield continuous-time network. An important requirement is that each memory vector be an asymptotically stable (i.e., attractive) equilibrium of the network. Some of the limitations imposed by the continuous Hopfield model on the set of vectors that can be stored are pointed out. These limitations can be relieved by choosing a network containing visible as well as hidden units. An architecture consisting of several hidden layers and a visible layer, connected in a circular fashion, is considered. It is proved that the two-layer case is guaranteed to store any set of analog vectors whose number does not exceed one plus the number of neurons in the hidden layer. A learning algorithm that correctly adjusts the locations of the equilibria and guarantees their asymptotic stability is developed. Simulation results confirm the effectiveness of the approach.
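
    The central requirement stated above, that each analog memory be an asymptotically stable equilibrium, can be checked numerically for a continuous-time Hopfield-style model. The sketch below is not the paper's learning algorithm; the dynamics dx/dt = -x + W tanh(x) + b and all numbers are illustrative assumptions.

        # Minimal check (assumed model): is a given analog vector an
        # asymptotically stable equilibrium of dx/dt = -x + W*tanh(x) + b?
        import numpy as np

        def is_stable_equilibrium(W, b, x_star, tol=1e-6):
            """Equilibrium test plus local stability via Jacobian eigenvalues."""
            residual = -x_star + W @ np.tanh(x_star) + b
            if np.linalg.norm(residual) > tol:
                return False, "not an equilibrium"
            # Jacobian at x_star: -I + W * diag(1 - tanh(x_star)^2)
            J = -np.eye(len(x_star)) + W * (1.0 - np.tanh(x_star) ** 2)
            stable = np.all(np.linalg.eigvals(J).real < 0)
            return stable, "asymptotically stable" if stable else "unstable"

        # Tiny example: pick a target vector, then choose b so it is an equilibrium
        rng = np.random.default_rng(1)
        W = 0.3 * rng.standard_normal((3, 3))
        x_star = np.array([0.5, -0.2, 0.8])       # analog memory vector (assumed)
        b = x_star - W @ np.tanh(x_star)          # makes x_star an equilibrium

        print(is_stable_equilibrium(W, b, x_star))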

    Analog integrated neural-like circuits for nonlinear programming

    A systematic approach for the design of analog neural nonlinear programming solvers using switched-capacitor (SC) integrated circuit techniques is presented. The method is based on formulating a dynamic gradient system whose state evolves in time towards the solution point of the corresponding programming problem. A neuron cell suitable for monolithic implementation is introduced for both the linear and the quadratic problem. The design of this neuron and its corresponding synapses using SC techniques is considered in detail. An SC circuit architecture based on a reduced set of highly modular basic building blocks is presented. Simulation results using a mixed-mode simulator (DIANA) and experimental results from breadboard prototypes are included, illustrating the validity of the proposed technique.
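
    The underlying idea, a gradient dynamical system whose state evolves toward the solution of the programming problem, can be mimicked in software. The sketch below integrates a penalty-based gradient flow for a small quadratic program; it is only a behavioural analogue of the SC circuits, and the problem data, penalty weight, and step size are assumptions.

        # Behavioural analogue (assumed problem data): gradient flow toward the
        # minimizer of  0.5*x'Qx + c'x  subject to  A x <= b.
        import numpy as np

        Q = np.array([[2.0, 0.0], [0.0, 2.0]])
        c = np.array([-2.0, -5.0])
        A = np.array([[1.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
        b = np.array([2.0, 0.0, 0.0])

        mu = 50.0        # penalty weight (assumed)
        dt = 1e-3        # Euler step (assumed)
        x = np.zeros(2)  # initial state of the "circuit"

        for _ in range(20000):
            violation = np.maximum(A @ x - b, 0.0)       # active constraint penalties
            grad = Q @ x + c + mu * A.T @ violation      # gradient of penalized cost
            x -= dt * grad                               # state evolves downhill

        print("approximate solution:", np.round(x, 3))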

    Disappearance of Spurious States in Analog Associative Memories

    We show that symmetric n-mixture states, when they exist, are almost never stable in autoassociative networks with threshold-linear units. Only with a binary coding scheme could we find a limited region of the parameter space in which either 2-mixtures or 3-mixtures are stable attractors of the dynamics. Comment: 5 pages, 3 figures, accepted for publication in Phys Rev
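
    For readers who want to experiment with the claim, the sketch below sets up a threshold-linear autoassociative network with a covariance Hebbian rule, initializes it on a symmetric 3-mixture of stored binary patterns, and tracks the overlaps. It is only a numerical setup under assumed parameters (sparseness, threshold, activity regulation, network size), not the paper's analysis, and whether the mixture survives depends on those choices.

        # Numerical setup only (assumed parameters): does a symmetric 3-mixture
        # persist under threshold-linear dynamics with a covariance Hebbian rule?
        import numpy as np

        rng = np.random.default_rng(2)
        N, P, a = 1000, 10, 0.2       # units, stored patterns, coding sparseness
        thr = 0.1                     # firing threshold (assumed)

        eta = (rng.random((P, N)) < a).astype(float)        # binary patterns
        J = (eta - a).T @ (eta - a) / (N * a * (1 - a))     # covariance rule
        np.fill_diagonal(J, 0.0)

        r = eta[:3].mean(axis=0)                            # symmetric 3-mixture
        for _ in range(100):
            r = np.maximum(J @ r - thr, 0.0)                # threshold-linear update
            r *= a / (r.mean() + 1e-12)                     # global activity regulation

        overlaps = (eta - a) @ r / (N * a * (1 - a))
        print("overlaps with the three mixed patterns:", np.round(overlaps[:3], 2))
        print("largest overlap with the others:", np.round(overlaps[3:].max(), 2))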

    NASA JSC neural network survey results

    A survey of artificial neural systems was conducted in support of the Automatic Perception for Mission Planning and Flight Control Research Program at NASA's Johnson Space Center. Several of the world's leading researchers contributed papers containing their most recent results on artificial neural systems. These papers were grouped into categories, and descriptive accounts of the results make up a large part of this report. Also included is material on sources of information on artificial neural systems, such as books, technical reports, software tools, etc.