
    A Hybrid Neural Optimization Scheme Based on Parallel Updates

    A synchronous Hopfield-type neural network model containing units with analog input and binary output, which is suitable for parallel implementation, is examined in the context of solving discrete optimization problems. A hybrid parallel update scheme based on the stochastic input-output behaviour of each unit is presented. This parallel update scheme maintains the solution quality of the Boltzmann Machine optimizer, which is inherently sequential. Experimental results on the Maximum Independent Set problem demonstrate the benefit of the proposed optimizer in terms of computation time. Excellent speedup has been obtained through parallel implementation on both shared-memory and distributed-memory architectures.

    Keywords: Optimization, parallel computing, Boltzmann Machine, Cauchy Machine.

    1 Introduction

    The usual approach for solving a discrete optimization problem using neural network techniques is to formulate the cost function and the constraints of the problem in terms of..
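
    The abstract does not give the full update rule, so the Python sketch below only illustrates the general idea it describes: units with binary output updated synchronously (parallel-style) under a Boltzmann-type stochastic acceptance rule, applied to Maximum Independent Set. The energy formulation, penalty weight, temperature schedule, and greedy repair step are assumptions for illustration, not the paper's hybrid scheme.

    # A minimal sketch (not the paper's exact scheme): synchronous, Boltzmann-style
    # stochastic updates for Maximum Independent Set. Penalty weight and cooling
    # schedule below are assumed values chosen for illustration.
    import math
    import random

    def max_independent_set(num_nodes, edges, penalty=2.0, t_start=5.0,
                            t_end=0.05, cooling=0.95, sweeps_per_temp=20, seed=0):
        """Return a heuristic independent set as a list of node indices."""
        rng = random.Random(seed)
        adj = [set() for _ in range(num_nodes)]
        for u, v in edges:
            adj[u].add(v)
            adj[v].add(u)

        # Binary output of each unit: 1 means "node is in the set".
        state = [0] * num_nodes
        t = t_start
        while t > t_end:
            for _ in range(sweeps_per_temp):
                # Synchronous (parallel-style) update: every unit computes its
                # local field from the *current* state, and all decisions are
                # applied at once.
                new_state = list(state)
                for i in range(num_nodes):
                    # Gain of setting unit i to 1: +1 for including the node,
                    # minus a penalty per currently selected neighbour.
                    gain = 1.0 - penalty * sum(state[j] for j in adj[i])
                    p_on = 1.0 / (1.0 + math.exp(-gain / t))  # Boltzmann acceptance
                    new_state[i] = 1 if rng.random() < p_on else 0
                state = new_state
            t *= cooling

        # Greedy repair: synchronous updates can create conflicts, so drop
        # conflicting nodes to guarantee a valid independent set.
        selected = [i for i in range(num_nodes) if state[i]]
        chosen, result = set(), []
        for i in sorted(selected, key=lambda n: len(adj[n])):
            if not (adj[i] & chosen):
                chosen.add(i)
                result.append(i)
        return result

    if __name__ == "__main__":
        # 5-cycle: an optimal independent set has size 2.
        edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
        print(max_independent_set(5, edges))

    Because all units here decide from the same snapshot of the state, the inner loop can be distributed across processors, which is the kind of parallelism the abstract attributes to the proposed scheme.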