
    Quantum approximate optimization algorithm applied to the binary perceptron

    We apply digitized Quantum Annealing (QA) and the Quantum Approximate Optimization Algorithm (QAOA) to a paradigmatic task of supervised learning in artificial neural networks: the optimization of synaptic weights for the binary perceptron. At variance with the usual QAOA applications to MaxCut, or to quantum spin-chain ground-state preparation, the classical cost function is characterized by highly non-local multi-spin interactions. Yet, we provide evidence for the existence of optimal solutions for the QAOA parameters, which are transferable among typical instances of the same problem, and we demonstrate numerically an enhanced performance of QAOA over traditional QA. We also investigate the role of the landscape geometry in this problem. By artificially breaking this geometrical structure, we show that the detrimental effect of a gap-closing transition, encountered in QA, also negatively affects the performance of our QAOA implementation.
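
    A minimal numerical sketch of the setup described above, under stated assumptions: the classical cost counts misclassified patterns of a toy binary perceptron, and a brute-force statevector simulation applies alternating QAOA layers (phase separation with the diagonal cost, then a transverse-field mixer). The instance sizes, the NumPy implementation, and the crude grid search over (gamma, beta) are illustrative choices, not the authors' actual protocol.

```python
import numpy as np
from itertools import product

# --- Hypothetical toy instance (sizes and seed are assumptions, not from the paper) ---
rng = np.random.default_rng(0)
N, P = 6, 3                                   # number of synapses / training patterns
xi = rng.choice([-1, 1], size=(P, N))         # random binary patterns
sigma = rng.choice([-1, 1], size=P)           # random binary labels

# Classical cost: number of misclassified patterns for each weight vector w in {-1,+1}^N.
# Each pattern couples all N spins at once, which is the non-local structure noted above.
configs = np.array(list(product([-1, 1], repeat=N)))            # (2^N, N)
stabilities = sigma[:, None] * (xi @ configs.T) / np.sqrt(N)    # (P, 2^N)
cost_diag = (stabilities <= 0).sum(axis=0).astype(float)        # diagonal of H_C

def qaoa_state(gammas, betas):
    """Apply p alternating layers exp(-i*gamma*H_C) and exp(-i*beta*H_B) to |+...+>."""
    psi = np.full(2 ** N, 1 / np.sqrt(2 ** N), dtype=complex)
    for gamma, beta in zip(gammas, betas):
        psi = np.exp(-1j * gamma * cost_diag) * psi              # phase-separation layer (diagonal)
        # Transverse-field mixer: exp(-i*beta*X) on every qubit.
        psi = psi.reshape([2] * N)
        c, s = np.cos(beta), -1j * np.sin(beta)
        for q in range(N):
            psi = np.moveaxis(psi, q, 0)
            psi = np.stack([c * psi[0] + s * psi[1], s * psi[0] + c * psi[1]])
            psi = np.moveaxis(psi, 0, q)
        psi = psi.reshape(-1)
    return psi

def expected_cost(params, p):
    psi = qaoa_state(params[:p], params[p:])
    return float(np.real(np.vdot(psi, cost_diag * psi)))

# Crude grid refinement over (gamma, beta) at depth p = 1, standing in for a proper optimizer.
p = 1
best = min((expected_cost(np.array([g, b]), p), (g, b))
           for g in np.linspace(0, 2, 21) for b in np.linspace(0, 1, 21))
print(f"p=1 QAOA expected cost: {best[0]:.3f} (uniform-superposition baseline ~ {cost_diag.mean():.3f})")
```

    In this sketch the expected cost of the optimized p=1 state can be compared against the uniform-superposition baseline; exhaustive enumeration of `cost_diag` limits it to small N, so it only illustrates the structure of the problem, not a scalable implementation.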