
    An accelerated first-order method with complexity analysis for solving cubic regularization subproblems

    We propose a first-order method to solve the cubic regularization subproblem (CRS) based on a novel reformulation. The reformulation is a constrained convex optimization problem whose feasible region admits an easily computable projection. The reformulation requires computing the minimum eigenvalue of the Hessian. To avoid the expensive computation of the exact minimum eigenvalue, we develop a surrogate problem in which the exact minimum eigenvalue is replaced with an approximate one. We then apply first-order methods, such as Nesterov's accelerated projected gradient method (APG) and the projected Barzilai-Borwein method, to solve the surrogate problem. As our main theoretical contribution, we show that when an $\epsilon$-approximate minimum eigenvalue is computed by the Lanczos method and the surrogate problem is approximately solved by APG, our approach returns an $\epsilon$-approximate solution to CRS in $\tilde{O}(\epsilon^{-1/2})$ matrix-vector multiplications (where $\tilde{O}(\cdot)$ hides logarithmic factors). Numerical experiments show that our methods are comparable to the Krylov subspace method in the easy case and outperform it in the hard case. We further implement our methods as subproblem solvers within adaptive cubic regularization methods, and numerical results show that the resulting algorithms are comparable to state-of-the-art algorithms.
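    As a rough illustration of the two generic ingredients named in this abstract, the sketch below estimates the extreme Hessian eigenvalues with a Lanczos-type solver (SciPy's `eigsh`) and runs Nesterov's accelerated projected gradient method on a convex quadratic over a Euclidean ball, whose projection is trivial. It is not the paper's CRS reformulation; the objective, the ball radius `R`, and the iteration counts are illustrative assumptions.

```python
# Minimal sketch, NOT the paper's CRS reformulation: it only combines the two
# generic ingredients mentioned in the abstract, a Lanczos estimate of the
# smallest Hessian eigenvalue and Nesterov's accelerated projected gradient
# (APG) on a convex problem whose feasible set has an easy projection.
import numpy as np
from scipy.sparse.linalg import eigsh

rng = np.random.default_rng(0)
n = 200
A = rng.standard_normal((n, n))
H = (A + A.T) / 2              # symmetric, possibly indefinite "Hessian"
g = rng.standard_normal(n)     # linear (gradient) term
R = 5.0                        # radius of the illustrative feasible ball

# Approximate extreme eigenvalues via a Lanczos-type method (ARPACK's eigsh).
lam_min = eigsh(H, k=1, which="SA", return_eigenvectors=False)[0]
lam_max = eigsh(H, k=1, which="LA", return_eigenvectors=False)[0]

# Shifting by lam_min makes the quadratic convex; L bounds its gradient's
# Lipschitz constant and gives the step size 1/L.
H_shift = H - lam_min * np.eye(n)
L = lam_max - lam_min

def grad(x):
    return H_shift @ x + g

def project_ball(x, radius):
    nrm = np.linalg.norm(x)
    return x if nrm <= radius else (radius / nrm) * x

# Nesterov's accelerated projected gradient iteration.
x = np.zeros(n)
y, t = x.copy(), 1.0
for _ in range(500):
    x_new = project_ball(y - grad(y) / L, R)
    t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
    y = x_new + ((t - 1.0) / t_new) * (x_new - x)
    x, t = x_new, t_new

obj = 0.5 * x @ (H_shift @ x) + g @ x
print(f"final objective {obj:.4f}, ||x|| = {np.linalg.norm(x):.3f}")
```

    Here the ball constraint merely stands in for "a feasible region with an easily computable projection"; in the paper, the projection and objective are those of the authors' reformulation.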

    An Inexact Augmented Lagrangian Method for Second-order Cone Programming with Applications

    In this paper, we adopt the augmented Lagrangian method (ALM) to solve convex quadratic second-order cone programming problems (SOCPs). Fruitful results on the efficiency of the ALM have been established in the literature. Recently, it has been shown in [Cui, Sun, and Toh, {\em Math. Program.}, 178 (2019), pp. 381--415] that if the quadratic growth condition holds at an optimal solution of the dual problem, then the KKT residual converges to zero R-superlinearly when the ALM is applied to the primal problem. Moreover, Cui, Ding, and Zhao [{\em SIAM J. Optim.}, 27 (2017), pp. 2332--2355] provided sufficient conditions for the quadratic growth condition to hold under the metric subregularity and bounded linear regularity conditions for composite matrix optimization problems involving spectral functions. Here, we adopt these recent ideas to analyze the convergence properties of the ALM when applied to SOCPs. To the best of our knowledge, no similar work has been done for SOCPs so far. We first provide sufficient conditions that ensure the quadratic growth condition for SOCPs. With these theoretical guarantees, we then design an SOCP solver and apply it to various classes of SOCPs, such as minimal enclosing ball problems, classical trust-region subproblems, square-root Lasso problems, and DIMACS Challenge problems. Numerical results show that the proposed ALM-based solver is efficient and robust compared to existing highly developed solvers such as Mosek and SDPT3.
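    For readers unfamiliar with the ALM skeleton referenced here, the following is a minimal sketch of an augmented Lagrangian iteration for a toy convex quadratic SOCP, with the inner subproblem handled by projected gradient steps onto the second-order cone, whose projection has a closed form. The problem data, penalty parameter `sigma`, and inner solver are illustrative assumptions; the paper's inexact ALM solver is not reproduced.

```python
# Minimal ALM sketch for a toy convex quadratic SOCP (illustrative data and
# parameters; NOT the paper's inexact ALM solver).  The equality constraint
# Ax = b is handled by the augmented Lagrangian, while the second-order cone
# constraint is kept in the inner subproblem via its closed-form projection.
import numpy as np

rng = np.random.default_rng(1)
n, m = 30, 10
M = rng.standard_normal((n, n))
Q = M.T @ M + np.eye(n)                 # positive definite quadratic term
c = rng.standard_normal(n)
A = rng.standard_normal((m, n))

def proj_soc(z):
    """Project z = (t, u) onto the second-order cone {(t, u): ||u|| <= t}."""
    t, u = z[0], z[1:]
    nu = np.linalg.norm(u)
    if nu <= t:
        return z.copy()
    if nu <= -t:
        return np.zeros_like(z)
    alpha = 0.5 * (t + nu)
    return np.concatenate(([alpha], alpha * u / nu))

x_feas = proj_soc(rng.standard_normal(n))   # guarantees the problem is feasible
b = A @ x_feas

sigma, y = 10.0, np.zeros(m)                # penalty parameter and multiplier
x = np.zeros(n)
L_in = np.linalg.eigvalsh(Q + sigma * A.T @ A)[-1]   # inner step-size bound

for outer in range(50):
    # Inner problem: minimize the augmented Lagrangian over the cone by
    # projected gradient steps (solved only approximately, as in an inexact ALM).
    for _ in range(200):
        grad = Q @ x + c + A.T @ y + sigma * A.T @ (A @ x - b)
        x = proj_soc(x - grad / L_in)
    y = y + sigma * (A @ x - b)             # multiplier update
    if np.linalg.norm(A @ x - b) < 1e-8:
        break

print(f"primal residual {np.linalg.norm(A @ x - b):.2e} after {outer + 1} ALM iterations")
```

    The sketch only shows the generic outer/inner structure of the ALM for a conic problem; the convergence guarantees discussed in the abstract (R-superlinear KKT residual decay under quadratic growth) concern the method's behavior, not this particular inner solver.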