In this study, we propose a global optimization algorithm based on quantizing
the energy level of an objective function in an NP-hard problem. Under the
white noise hypothesis, which holds when the quantization error is dense and
uniformly distributed, we can regard the quantization error as i.i.d. white
noise. Stochastic analysis shows that the proposed algorithm converges weakly
under a Lipschitz continuity condition alone, rather than under local
convergence conditions such as Hessian constraints on the objective function.
This implies that the proposed algorithm ensures global optimization via
Laplace's condition.
Numerical experiments show that the proposed algorithm outperforms conventional
learning methods in solving NP-hard optimization problems such as the traveling
salesman problem.

Comment: 25 pages, 3 figures, NeurIPS 2022 workshop OPT 2022 (14th Annual
Workshop on Optimization for Machine Learning)
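As a minimal illustration of the white noise hypothesis invoked above, the sketch below quantizes the values of a toy objective ("energy") with a step `h` and checks that the resulting quantization error is bounded by `h/2` and has near-zero mean, as i.i.d. uniform white noise would. The objective, the step size, and the sampling scheme here are all illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def quantize(energy, h):
    """Round energy values to the nearest multiple of the step h."""
    return h * np.round(energy / h)

rng = np.random.default_rng(0)
h = 0.1
x = rng.uniform(-10.0, 10.0, size=100_000)
energy = np.sin(3 * x) + 0.05 * x**2      # toy objective ("energy level")
error = energy - quantize(energy, h)      # quantization error

# Under dense, uniform sampling the error behaves like uniform white
# noise on [-h/2, h/2]: bounded magnitude and near-zero sample mean.
print(np.abs(error).max() <= h / 2 + 1e-12)
print(abs(error.mean()) < 1e-3)
```

This only demonstrates the statistical character of the quantization error; the paper's contribution is the global optimization algorithm built on top of this property.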