
    Improved guarantees for optimal Nash equilibrium seeking and bilevel variational inequalities

    We consider a class of hierarchical variational inequality (VI) problems that subsumes VI-constrained optimization and several other important problem classes, including the optimal solution selection problem, the optimal Nash equilibrium (NE) seeking problem, and the generalized NE seeking problem. Our main contributions are threefold. (i) For bilevel VIs with merely monotone and Lipschitz continuous mappings, we devise a single-timescale iteratively regularized extragradient method (IR-EG) and improve the existing iteration complexity results for both bilevel VI and VI-constrained convex optimization problems. (ii) Under strong monotonicity of the outer-level mapping, we develop a variant of IR-EG, called R-EG, and derive significantly faster guarantees than those in (i). These results appear to be new for both bilevel VIs and VI-constrained optimization. (iii) To our knowledge, no complexity guarantees exist for computing the optimal NE in nonconvex settings. Motivated by this lacuna, we consider VI-constrained nonconvex optimization problems and devise an inexactly projected gradient method, called IPR-EG, in which the projection onto the unknown set of equilibria is performed by R-EG with a prescribed adaptive termination criterion and regularization parameters. We obtain new complexity guarantees, in terms of a residual map and an infeasibility metric, for computing a stationary point. We validate the theoretical findings with preliminary numerical experiments for computing the best and the worst Nash equilibria.
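    The abstract does not spell out the update rules, but the core idea of an iteratively regularized extragradient scheme can be sketched as follows: the lower-level map is perturbed by a vanishing multiple of the upper-level map, and a standard extragradient step is taken on the regularized map. The step size, decay schedule, and names below are illustrative assumptions, not the paper's exact IR-EG parameters.

```python
import numpy as np

def ir_eg(F_inner, F_outer, project, x0, steps=1000,
          gamma=0.1, eta0=1.0, eta_decay=0.5):
    """Minimal sketch of an iteratively regularized extragradient scheme.

    F_inner : lower-level monotone map (e.g., the game map whose zeros
              are the Nash equilibria).
    F_outer : upper-level map (e.g., the gradient of a selection
              objective used to pick the "best" equilibrium).
    project : Euclidean projection onto the feasible set.
    All schedules below are illustrative assumptions.
    """
    x = np.asarray(x0, dtype=float)
    for k in range(steps):
        eta_k = eta0 / (k + 1) ** eta_decay            # vanishing regularization
        G = lambda z: F_inner(z) + eta_k * F_outer(z)  # regularized map
        y = project(x - gamma * G(x))                  # extrapolation step
        x = project(x - gamma * G(y))                  # update step
    return x
```

    As the regularization weight eta_k vanishes, the iterates are driven toward the lower-level solution set while the outer map biases them toward the upper-level-optimal solution on that set.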

    Adaptive Preconditioned Gradient Descent with Energy

    We propose an energy-based adaptive time step for a large class of preconditioned gradient descent methods, mainly applied to constrained optimization problems. Our strategy represents the usual descent direction as the product of an energy variable and a transformed gradient, where the preconditioning matrix can, for example, reflect the natural gradient induced by the underlying metric in parameter space, or act as a projection operator when linear equality constraints are present. We present theoretical results on both unconditional stability and convergence rates for three respective classes of objective functions. In addition, our numerical results demonstrate the excellent performance of the proposed method on several benchmark optimization problems. Comment: 32 pages, 3 figures.
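    As a rough illustration of the "descent direction = energy variable times transformed gradient" idea, the sketch below follows an energy-adaptive gradient descent (AEGD-style) template with a user-supplied preconditioner P. The scalar-energy update, the constant c, and the step size eta are assumptions for illustration; the paper's precise scheme and its adaptive time step may differ.

```python
import numpy as np

def preconditioned_energy_gd(f, grad_f, x0, P=None, eta=0.1, c=1.0, steps=500):
    """Sketch of preconditioned gradient descent with an auxiliary
    energy variable, in the spirit of energy-adaptive gradient descent.
    P can encode a natural-gradient metric or a projector onto linear
    equality constraints; P, eta, and c here are illustrative.
    """
    x = np.asarray(x0, dtype=float)
    P = np.eye(x.size) if P is None else P
    r = np.sqrt(f(x) + c)                         # scalar energy variable
    for _ in range(steps):
        g = P @ grad_f(x)                         # transformed gradient
        v = g / (2.0 * np.sqrt(f(x) + c))
        r = r / (1.0 + 2.0 * eta * float(v @ v))  # energy is monotonically dissipated
        x = x - 2.0 * eta * r * v                 # descent = energy * transformed gradient
    return x
```

    The energy variable r shrinks at every step regardless of eta, which is the mechanism behind the unconditional-stability claims of such schemes.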

    Manifold Optimization Over the Set of Doubly Stochastic Matrices: A Second-Order Geometry

    Convex optimization is a well-established research area with applications in almost all fields. Over the decades, multiple approaches have been proposed to solve convex programs. The development of interior-point methods made it possible to solve a more general set of convex programs, known as semidefinite programs and second-order cone programs. However, these methods are known to be excessively slow in high dimensions, i.e., they suffer from the curse of dimensionality. On the other hand, optimization algorithms on manifolds have shown great ability to find solutions to nonconvex problems in reasonable time. This paper solves a subset of convex optimization problems using a different approach. The main idea behind Riemannian optimization is to view the constrained optimization problem as an unconstrained one over a restricted search space. The paper introduces three manifolds for solving convex programs under particular box constraints. The manifolds, called the doubly stochastic, the symmetric, and the definite multinomial manifolds, generalize the simplex, also known as the multinomial manifold. The proposed manifolds and algorithms are well adapted to solving convex programs in which the variable of interest is a multidimensional probability distribution function. Theoretical analysis and simulation results attest to the efficiency of the proposed method over state-of-the-art methods. In particular, they reveal that the proposed framework outperforms conventional generic and specialized solvers, especially in high dimensions.
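    To illustrate the "constrained problem as an unconstrained one over a manifold" viewpoint, the sketch below runs a simple first-order scheme over the doubly stochastic matrices: a multiplicative step along the negative Euclidean gradient, followed by Sinkhorn balancing as a retraction. This is a generic mirror-descent-style stand-in, not the paper's second-order geometry (Riemannian gradient, Hessian, and retraction); all names and step sizes are illustrative.

```python
import numpy as np

def sinkhorn(A, iters=100):
    """Alternately normalize rows and columns to pull a positive
    matrix back onto the set of doubly stochastic matrices."""
    for _ in range(iters):
        A = A / A.sum(axis=1, keepdims=True)
        A = A / A.sum(axis=0, keepdims=True)
    return A

def first_order_doubly_stochastic(egrad, X0, eta=0.1, steps=200):
    """Minimal sketch of first-order optimization over the doubly
    stochastic manifold.  egrad(X) returns the Euclidean gradient of
    the objective at X; X0 should be entrywise positive and doubly
    stochastic (e.g., the uniform matrix)."""
    X = np.asarray(X0, dtype=float)
    for _ in range(steps):
        G = egrad(X)              # Euclidean gradient at the current point
        X = X * np.exp(-eta * G)  # multiplicative step keeps entries positive
        X = sinkhorn(X)           # retraction onto the doubly stochastic set
    return X
```

    The multiplicative update plus Sinkhorn retraction keeps every iterate feasible, so the box and sum constraints never need to be handled explicitly, which is precisely the appeal of the manifold viewpoint the abstract describes.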