
    Projection-Free Methods for Stochastic Simple Bilevel Optimization with Convex Lower-level Problem

    In this paper, we study a class of stochastic bilevel optimization problems, also known as stochastic simple bilevel optimization, where we minimize a smooth stochastic objective function over the optimal solution set of another stochastic convex optimization problem. We introduce novel stochastic bilevel optimization methods that locally approximate the solution set of the lower-level problem via a stochastic cutting plane, and then run a conditional gradient update with variance reduction techniques to control the error induced by using stochastic gradients. When the upper-level function is convex, our method requires $\tilde{\mathcal{O}}(\max\{1/\epsilon_f^{2},1/\epsilon_g^{2}\})$ stochastic oracle queries to obtain a solution that is $\epsilon_f$-optimal for the upper-level and $\epsilon_g$-optimal for the lower-level. This guarantee improves on the previous best-known complexity of $\mathcal{O}(\max\{1/\epsilon_f^{4},1/\epsilon_g^{4}\})$. When the upper-level function is non-convex, our method requires at most $\tilde{\mathcal{O}}(\max\{1/\epsilon_f^{3},1/\epsilon_g^{3}\})$ stochastic oracle queries to find an $(\epsilon_f,\epsilon_g)$-stationary point. In the finite-sum setting, the number of stochastic oracle calls required by our method is $\tilde{\mathcal{O}}(\sqrt{n}/\epsilon)$ in the convex setting and $\tilde{\mathcal{O}}(\sqrt{n}/\epsilon^{2})$ in the non-convex setting, where $\epsilon=\min\{\epsilon_f,\epsilon_g\}$.
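    To make the cutting-plane-plus-conditional-gradient idea concrete, the sketch below runs a deterministic, simplified version of one such step on a toy simple bilevel problem: the feasible region is linearized via a single cut on the lower-level objective, and a Frank-Wolfe (conditional gradient) update is taken against the upper-level gradient. The problem data (`f`, `g`, the box `Z`, and the bound `g_bound`) are hypothetical, exact gradients replace stochastic ones, and no variance reduction is used, so this is only an illustration of the mechanism, not the paper's algorithm.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy simple bilevel problem (all data hypothetical):
    #   min_x f(x)  subject to  x in argmin_{z in Z} g(z),
    # with Z a box. Here f(x) = 0.5*||x - c||^2 and g(x) = 0.5*||x||^2.
    c = np.array([2.0, -1.0])

    def f_grad(x):
        return x - c            # gradient of the upper-level objective

    def g(x):
        return 0.5 * float(x @ x)

    def g_grad(x):
        return x                # gradient of the lower-level objective

    def cutting_plane_cg_step(x, g_bound, lb, ub, step):
        """One conditional-gradient step over the cutting-plane set
           {z in Z : g(x) + <grad g(x), z - x> <= g_bound}."""
        a = g_grad(x)                        # normal of the cutting plane
        b = g_bound - g(x) + float(a @ x)    # rewrite cut as <a, z> <= b
        # Linear minimization oracle: min <grad f(x), z> over box + cut.
        res = linprog(c=f_grad(x), A_ub=a.reshape(1, -1), b_ub=[b],
                      bounds=list(zip(lb, ub)), method="highs")
        s = res.x
        return x + step * (s - x)            # convex combination stays in Z

    # Run a few steps; the lower-level optimum of g over the box is x = 0,
    # so iterates should drift toward points with small g(x).
    x = np.array([1.0, 1.0])
    lb, ub = [-2.0, -2.0], [2.0, 2.0]
    for k in range(50):
        x = cutting_plane_cg_step(x, g_bound=0.05, lb=lb, ub=ub,
                                  step=2.0 / (k + 3))
    print(x)
    ```

    The linear subproblem is always feasible here (the origin satisfies the cut for this choice of `g`), and because every step size is below 1, each iterate remains a convex combination of box points. In the stochastic methods the abstract describes, the gradients in this step would be replaced by variance-reduced stochastic estimates.
    
    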