
    Frank-Wolfe-type methods for a class of nonconvex inequality-constrained problems

    The Frank-Wolfe (FW) method, which relies on efficient linear oracles that minimize linear approximations of the objective function over a fixed compact convex set, has recently received much attention in the optimization and machine learning literature. In this paper, we propose a new FW-type method for minimizing a smooth function over a compact set defined as the level set of a single difference-of-convex function, based on new generalized linear-optimization oracles (LOs). We show that these LOs can be computed efficiently with closed-form solutions in some important optimization models arising in compressed sensing and machine learning. In addition, under a mild strict feasibility condition, we establish subsequential convergence of our nonconvex FW-type method. Since the feasible region of our generalized LO typically changes from iteration to iteration, our convergence analysis differs substantially from existing analyses of FW-type methods, which assume a fixed feasible region across subproblems. Finally, motivated by the away steps used to accelerate FW-type methods for convex problems, we further design an away-step oracle to supplement our nonconvex FW-type method and establish subsequential convergence of this variant. Numerical results on the matrix completion problem with standard datasets demonstrate the efficiency of the proposed FW-type method and its away-step variant.

    Comment: We updated grant information and fixed some minor typos in Section
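
    For orientation, below is a minimal sketch of the classical FW iteration that the abstract builds on: a linear-minimization oracle over a fixed compact convex set (here an l1-ball, where the oracle has a closed-form solution), followed by a convex-combination update. This is not the paper's generalized-oracle method for difference-of-convex level sets; the objective, radius, and step rule are illustrative assumptions.

# Sketch of the classical Frank-Wolfe method over a fixed l1-ball.
# Illustrative only; not the generalized-LO variant proposed in the paper.
import numpy as np

def lmo_l1_ball(grad, radius):
    """Linear-minimization oracle over {x : ||x||_1 <= radius}.
    argmin_s <grad, s> is attained at a signed vertex along the
    coordinate with the largest |grad| entry (closed form)."""
    s = np.zeros_like(grad)
    i = np.argmax(np.abs(grad))
    s[i] = -radius * np.sign(grad[i])
    return s

def frank_wolfe(grad_f, x0, radius, max_iter=200):
    """Classical FW with the standard 2/(k+2) diminishing step size."""
    x = x0.copy()
    for k in range(max_iter):
        g = grad_f(x)
        s = lmo_l1_ball(g, radius)         # solve the linear subproblem
        gamma = 2.0 / (k + 2.0)            # step size
        x = (1.0 - gamma) * x + gamma * s  # convex combination stays feasible
    return x

# Toy usage: least squares min 0.5*||Ax - b||^2 over the l1-ball.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
b = rng.standard_normal(50)
grad_f = lambda x: A.T @ (A @ x - b)
x_hat = frank_wolfe(grad_f, np.zeros(20), radius=5.0)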