A Note on Zeroth-Order Optimization on the Simplex

Abstract

We construct a zeroth-order gradient estimator for a smooth function defined on the probability simplex. The proposed estimator queries only points on the simplex. We prove that projected gradient descent and the exponential weights algorithm, when run with this estimator instead of exact gradients, converge at a rate of $\mathcal{O}(T^{-1/4})$.
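The abstract does not reproduce the construction itself, but a common way to build a simplex-feasible zeroth-order estimator is two-point finite differencing along random directions in the simplex's tangent space (coordinates summing to zero), so both query points stay on the simplex. The sketch below pairs such an estimator with the exponential weights update mentioned above; it is a minimal illustration under these assumptions, not the paper's construction, and all function names, the step-shrinking rule, and the $(d-1)$ scaling are illustrative choices.

```python
import numpy as np

def zo_gradient_simplex(f, x, delta=1e-3, rng=None):
    """Two-point zeroth-order gradient estimate at x, querying only the simplex.

    Assumed construction (not necessarily the paper's): sample a random unit
    direction in the zero-sum tangent space, so x +/- delta*u keeps sum 1,
    and shrink delta so both query points stay nonnegative.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = x.size
    u = rng.standard_normal(d)
    u -= u.mean()              # project onto the zero-sum tangent space
    u /= np.linalg.norm(u)     # normalize to a unit direction
    # Largest step keeping x +/- delta*u >= 0; take half of it for safety.
    slack = np.min(x / (np.abs(u) + 1e-12))
    delta = min(delta, 0.5 * slack)
    # Two-point finite difference, scaled by the tangent-space dimension d-1.
    return (d - 1) * (f(x + delta * u) - f(x - delta * u)) / (2 * delta) * u

def exponential_weights(f, d, T, eta=0.1):
    """Exponential weights (entropic mirror descent) with estimated gradients."""
    x = np.full(d, 1.0 / d)           # start at the barycenter of the simplex
    for _ in range(T):
        g = zo_gradient_simplex(f, x)
        x = x * np.exp(-eta * g)      # multiplicative-weights update
        x /= x.sum()                  # renormalize back onto the simplex
    return x
```

Because the update is multiplicative and the iterate starts strictly positive, every iterate remains in the relative interior of the simplex, which is what keeps the finite-difference queries feasible.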
