
    Switched-capacitor neural networks for linear programming

    A circuit for the on-line solution of linear programming problems is presented. The circuit uses switched-capacitor techniques and is thus suitable for monolithic implementation. The connection of the proposed circuit to analogue neural networks is also outlined.
    Comisión Interministerial de Ciencia y Tecnología ME87-000
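    For illustration, a minimal sketch of the kind of dynamics analogue LP networks of this type typically realise: a penalty-based gradient flow that drives the variables toward the constrained optimum, here discretised in time much as a switched-capacitor circuit discretises a continuous flow. The function name solve_lp_penalty, the penalty weight, the step size, and the example problem are assumptions made for this sketch, not details taken from the paper.

        import numpy as np

        def solve_lp_penalty(c, A, b, lr=1e-3, penalty=50.0, steps=20000):
            """Minimise c^T x subject to A x <= b via penalty-gradient dynamics.

            Discrete-time analogue of the continuous flow
                dx/dt = -c - penalty * A^T * max(0, A x - b),
            a standard neural-network style formulation of linear programming.
            """
            x = np.zeros(A.shape[1])
            for _ in range(steps):
                violation = np.maximum(0.0, A @ x - b)   # constraint violations
                grad = c + penalty * A.T @ violation     # gradient of penalised cost
                x -= lr * grad                           # Euler step ~ one clock cycle
            return x

        # Example: minimise -x1 - x2  s.t.  x1 + 2*x2 <= 4,  3*x1 + x2 <= 6,  x >= 0
        c = np.array([-1.0, -1.0])
        A = np.array([[1.0, 2.0], [3.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
        b = np.array([4.0, 6.0, 0.0, 0.0])
        print(solve_lp_penalty(c, A, b))  # settles near the optimum (1.6, 1.2)

    The penalty term leaves a small residual constraint violation of order 1/penalty; larger penalty weights tighten the solution at the cost of a smaller stable step size.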

    Hierarchical Self-Programming in Recurrent Neural Networks

    We study self-programming in recurrent neural networks where both neurons (the `processors') and synaptic interactions (`the programme') evolve in time simultaneously, according to specific coupled stochastic equations. The interactions are divided into a hierarchy of $L$ groups with adiabatically separated and monotonically increasing time-scales, representing sub-routines of the system programme of decreasing volatility. We solve this model in equilibrium, assuming ergodicity at every level, and find as our replica-symmetric solution a formalism with a structure similar but not identical to Parisi's $L$-step replica symmetry breaking scheme. Apart from differences in details of the equations (due to the fact that here interactions, rather than spins, are grouped into clusters with different time-scales), in the present model the block sizes $m_i$ of the emerging ultrametric solution are not restricted to the interval $[0,1]$, but are independent control parameters, defined in terms of the noise strengths of the various levels in the hierarchy, which can take any value in $[0,\infty)$. This is shown to lead to extremely rich phase diagrams, with an abundance of first-order transitions especially when the level of stochasticity in the interaction dynamics is chosen to be low.
    Comment: 53 pages, 19 figures. Submitted to J. Phys.
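    As a rough illustration of the kind of coupled dynamics the abstract describes, the toy simulation below (an assumption-laden sketch, not the authors' model) lets fast binary neurons evolve under Glauber dynamics while the couplings relax slowly toward a Hebbian drive, with each interaction assigned to one of L hierarchy levels carrying its own time-scale and noise strength. All parameter values, the random level assignment, and the Ornstein-Uhlenbeck form of the slow dynamics are illustrative choices only.

        import numpy as np

        rng = np.random.default_rng(0)

        N = 40                       # number of neurons ('processors')
        L = 3                        # hierarchy levels for the interactions ('programme')
        T_spin = 1.0                 # neuron noise level
        T_J = [0.05, 0.2, 0.8]       # interaction noise strength per level
        tau = [10.0, 100.0, 1000.0]  # adiabatically separated time-scales per level

        # Assign each interaction (i, j) to one hierarchy level at random.
        level = rng.integers(0, L, size=(N, N))
        level = np.triu(level, 1); level += level.T

        J = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
        J = (J + J.T) / 2; np.fill_diagonal(J, 0.0)
        s = rng.choice([-1.0, 1.0], size=N)

        def glauber_sweep(s, J, T):
            """One sweep of Glauber dynamics for the fast neuron variables."""
            for i in rng.permutation(len(s)):
                h = J[i] @ s
                p_up = 1.0 / (1.0 + np.exp(-2.0 * h / T))
                s[i] = 1.0 if rng.random() < p_up else -1.0
            return s

        dt = 1.0
        for t in range(1000):
            s = glauber_sweep(s, J, T_spin)        # fast processor dynamics
            drive = np.outer(s, s) / N             # Hebbian drive from the neurons
            for k in range(L):                     # slow programme dynamics, level by level
                mask = (level == k)
                noise = rng.normal(0.0, 1.0, size=(N, N))
                noise = (noise + noise.T) / 2
                dJ = (dt / tau[k]) * (drive - J) \
                     + np.sqrt(2 * T_J[k] * dt / tau[k]) * noise / np.sqrt(N)
                J[mask] += dJ[mask]
            np.fill_diagonal(J, 0.0)

        print("mean |J| per level:",
              [np.abs(J[level == k]).mean() for k in range(L)])

    In this sketch the ratio of noise strength to relaxation rate at each level plays the role that the block-size parameters take in the equilibrium theory: low interaction noise lets a level lock onto the neuron statistics, while high noise keeps it volatile.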