
    Solution of Linear Programming Problems using a Neural Network with Non-Linear Feedback

    This paper presents a recurrent neural circuit for solving linear programming problems. The objective is to minimize a linear cost function subject to linear constraints. The proposed circuit employs non-linear feedback, in the form of unipolar comparators, to introduce transcendental terms into the energy function, ensuring fast convergence to the solution. A proof of validity of the energy function is also provided. The hardware complexity of the proposed circuit compares favorably with other circuits proposed for the same task. PSPICE simulation results are presented for a chosen optimization problem and are found to agree with the algebraic solution. Hardware test results for a 2-variable problem further strengthen the proposed theory.
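    The idea can be sketched as a gradient flow: minimize c·x subject to Ax ≤ b, with each violated constraint fed back through a unipolar comparator (here smoothed into a steep sigmoid so the dynamics are differentiable). The problem data, gains, and step size below are illustrative, not the paper's circuit values.

    ```python
    import math

    c = [1.0, 1.0]                      # cost: minimize x1 + x2
    A = [[-1.0, 0.0], [0.0, -1.0]]      # constraints: x1 >= 1, x2 >= 1
    b = [-1.0, -1.0]

    def g(u, beta=50.0):
        # Steep sigmoid approximating a unipolar comparator: ~0 when the
        # constraint is satisfied (u < 0), ~1 when it is violated (u > 0).
        return 1.0 / (1.0 + math.exp(-beta * u))

    x = [2.0, 2.0]                      # feasible starting point
    eta, k = 0.01, 10.0                 # step size and feedback gain (assumed)
    for _ in range(5000):
        # Comparator output for each constraint row of A x - b.
        s = [g(sum(a * xi for a, xi in zip(row, x)) - bi)
             for row, bi in zip(A, b)]
        # Descend the cost, with violated constraints pushing back via A^T s.
        x = [xi - eta * (ci + k * sum(A[j][i] * s[j] for j in range(len(A))))
             for i, (xi, ci) in enumerate(zip(x, c))]

    # x settles near the optimum (1, 1) of this toy problem
    ```

    With a hard comparator the equilibrium sits exactly on the constraint boundary; the sigmoid smoothing leaves a small offset that shrinks as the slope `beta` or gain `k` grows.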

    Hopf Bifurcation and Chaos in Tabu Learning Neuron Models

    In this paper, we consider the nonlinear dynamical behaviors of some tabu learning neuron models. We first consider a tabu learning single-neuron model. By choosing the memory decay rate as a bifurcation parameter, we prove that Hopf bifurcation occurs in the neuron. The stability of the bifurcating periodic solutions and the direction of the Hopf bifurcation are determined by applying the normal form theory. We give a numerical example to verify the theoretical analysis. Then, we demonstrate the chaotic behavior in such a neuron with sinusoidal external input, via computer simulations. Finally, we study the chaotic behaviors in tabu learning two-neuron models, with linear and quadratic proximity functions respectively.
    Comment: 14 pages, 13 figures; accepted by International Journal of Bifurcation and Chaos

    Neural networks, error-correcting codes, and polynomials over the binary n-cube

    Several ways of relating the concept of error-correcting codes to the concept of neural networks are presented. Performing maximum-likelihood decoding in a linear block error-correcting code is shown to be equivalent to finding a global maximum of the energy function of a certain neural network. Given a linear block code, a neural network can be constructed in such a way that every codeword corresponds to a local maximum. The connection between maximization of polynomials over the n-cube and error-correcting codes is also investigated; the results suggest that decoding techniques can be a useful tool for solving such maximization problems. The results are generalized to both nonbinary and nonlinear codes.
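    The decoding-as-maximization equivalence can be demonstrated by brute force on a small code: over bipolar (±1) vectors, maximum-likelihood decoding on a binary symmetric channel is the same as maximizing the correlation "energy" ⟨r, c⟩ over all codewords. The sketch below uses the [7,4] Hamming code as a concrete linear block code (one standard generator matrix; the equivalence itself holds for any linear block code).

    ```python
    from itertools import product

    # A generator matrix of the [7,4] Hamming code (minimum distance 3,
    # so any single channel error is correctable).
    G = [[1, 0, 0, 0, 1, 1, 0],
         [0, 1, 0, 0, 1, 0, 1],
         [0, 0, 1, 0, 0, 1, 1],
         [0, 0, 0, 1, 1, 1, 1]]

    def encode(msg):
        # Codeword = msg * G over GF(2).
        return [sum(m * g for m, g in zip(msg, col)) % 2 for col in zip(*G)]

    codewords = [encode(m) for m in product([0, 1], repeat=4)]

    def bipolar(c):
        # Map bits {0, 1} -> levels {+1, -1}.
        return [1 - 2 * b for b in c]

    def ml_decode(received):
        # ML decoding = maximizing the correlation "energy" <r, c>
        # over all bipolar codewords.
        return max(codewords,
                   key=lambda c: sum(r * x for r, x in zip(received, bipolar(c))))

    sent = encode([1, 0, 1, 1])
    r = bipolar(sent)
    r[2] = -r[2]             # one channel error
    ```

    The paper's point is that this same maximization can be posed as finding the global maximum of a neural network's energy function, so that network dynamics (rather than exhaustive search) perform the decoding.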