4 research outputs found

    Memristor-based hardware and algorithms for higher-order Hopfield optimization solver outperforming quadratic Ising machines

    Ising solvers offer a promising physics-based approach to the challenging class of combinatorial optimization problems. However, typical solvers operate in a quadratic energy space, with only pair-wise coupling elements, which already dominate area and energy. We show that such quadratization can cause severe problems: increased dimensionality, a rugged search landscape, and misalignment with the original objective function. Here, we design and quantify a higher-order Hopfield optimization solver, built in 28 nm CMOS technology with memristive couplings for lower-area and lower-energy computation. We combine algorithmic and circuit analysis to show quantitative advantages over quadratic Ising machines (IMs), yielding 48x and 72x reductions in time-to-solution (TTS) and energy-to-solution (ETS), respectively, for Boolean satisfiability problems of 150 variables, with favorable scaling.
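The quadratization cost the abstract describes can be illustrated on a single 3-SAT clause: a higher-order solver encodes the clause as one cubic energy term, while a quadratic machine must introduce an auxiliary spin and a penalty gadget. The sketch below is illustrative only (the variable names and the Rosenberg-style gadget are assumptions, not the paper's circuit):

```python
# Hypothetical sketch: energy of one 3-SAT clause (x1 OR x2 OR x3) over
# Boolean variables, comparing a native cubic term with a quadratized form.

def clause_energy_cubic(x1, x2, x3):
    """Energy is 0 iff the clause is satisfied; a single 3rd-order term."""
    return (1 - x1) * (1 - x2) * (1 - x3)

def clause_energy_quadratized(x1, x2, x3, a):
    """Rosenberg-style quadratization with auxiliary bit `a` standing in
    for the product (1-x2)(1-x3); the penalty is 0 only when a matches."""
    y2, y3 = 1 - x2, 1 - x3
    penalty = y2 * y3 - 2 * a * (y2 + y3) + 3 * a  # = 0 iff a == y2*y3
    return (1 - x1) * a + penalty
```

Minimizing over the auxiliary bit recovers the cubic energy exactly, but every clause now adds an extra variable and extra quadratic couplings, which is the dimensionality and landscape penalty the abstract quantifies.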

    Current Optimized Coset Coding for Efficient RRAM Programming


    Transient variability in SOI-based LIF Neuron and impact on unsupervised learning

    Variability is an integral part of biology. A biological neural network performs efficiently despite variability, and sometimes its performance is facilitated by it. Hence, the study of variability in its electronic analog is essential for constructing biomimetic neural networks. We have recently demonstrated a compact leaky integrate-and-fire (LIF) neuron on a PD silicon-on-insulator (SOI) MOSFET. In this paper, we study impact-ionization (II)-induced variability, both device-to-device (D2D) and cycle-to-cycle (C2C), in the SOI neuron. The C2C variability is attributed to fluctuation in the II-generated charge storage and is enhanced by at least 2.5x compared to the no-II case. The D2D variability, on the other hand, is related to the II-induced sharp subthreshold slope (~40 mV/decade), which enhances the variability by ~20x compared to the no-II case. The impact of the enhanced variability in SOI neurons on an unsupervised classification task was evaluated by simulating a spiking neural network (SNN) with both analog and binary synapses. For analog-synapse-based SNNs, the C2C variability improved performance by ~5% relative to ideal LIF neurons. However, the D2D variability, as well as combined D2D and C2C variability, degrades learning by ~10%. For binary synapses, we observe that performance drastically degrades for ideal LIF neurons as the synaptic weight initialization becomes nonrandom. However, neurons with the experimentally demonstrated variability (C2C and D2D) mitigate this challenge. This enables binary synapses to perform on par with analog synapses while allowing deterministic weight initialization, making RNG circuits for random weight initialization redundant.
    by Sangya Dutta, Tinish Bhattacharya, Nihar R. Mohapatra, Manan Suri and Udayan Ganguly
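The D2D/C2C distinction the abstract draws can be made concrete in a toy LIF simulation: a D2D deviation shifts the firing threshold once per device, while C2C noise re-draws it at every integration step. This is a minimal sketch under assumed parameter values (not the paper's device model or numbers):

```python
# Hypothetical LIF neuron sketch with D2D and C2C threshold variability.
# All parameter values are illustrative assumptions, not from the paper.
import random

def simulate_lif(inputs, tau=20.0, v_th=1.0,
                 d2d_sigma=0.0, c2c_sigma=0.0, seed=0):
    """Return the spike times for a constant-step (dt = 1) LIF simulation.
    D2D: a single Gaussian threshold offset fixed for this 'device'.
    C2C: an independent Gaussian threshold jitter drawn every cycle."""
    rng = random.Random(seed)
    v_th_device = v_th + rng.gauss(0.0, d2d_sigma)  # fixed per device (D2D)
    v, spikes = 0.0, []
    for t, i_in in enumerate(inputs):
        v += -v / tau + i_in                         # leaky integration
        threshold = v_th_device + rng.gauss(0.0, c2c_sigma)  # per cycle (C2C)
        if v >= threshold:
            spikes.append(t)
            v = 0.0                                  # reset after firing
    return spikes
```

With both sigmas at zero the neuron fires periodically; setting `c2c_sigma > 0` jitters individual spike times, while `d2d_sigma > 0` shifts the whole firing rate of a given neuron, mirroring the two variability modes studied in the paper.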