    The Stochastic Fejér-Monotone Hybrid Steepest Descent Method and the Hierarchical RLS

    This paper introduces the stochastic Fejér-monotone hybrid steepest descent method (S-FM-HSDM) to solve affinely constrained and composite convex minimization tasks. The minimization task is not known exactly; noise contaminates the information about the composite loss function and the affine constraints. S-FM-HSDM generates sequences of random variables that, under certain conditions and with respect to a probability space, converge pointwise to solutions of the noiseless minimization task. S-FM-HSDM enjoys desirable attributes of optimization techniques, such as splitting of variables and a constant step size (learning rate). Furthermore, it provides a novel way of exploiting the information about the affine constraints via fixed-point sets of appropriate nonexpansive mappings. Among the offspring of S-FM-HSDM, the hierarchical recursive least squares (HRLS) method takes advantage of S-FM-HSDM's versatility toward affine constraints and offers a novel twist on LS by generating sequences of estimates that converge to solutions of a hierarchical optimization task: minimize a convex loss over the set of minimizers of the ensemble LS loss. Numerical tests on a sparsity-aware LS task show that HRLS compares favorably to several state-of-the-art convex, as well as non-convex, stochastic-approximation and online-learning counterparts.

    Comment: To appear in IEEE Transactions on Signal Processing
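
    A minimal sketch of the hierarchical idea behind HRLS, under toy assumptions (the matrix A, the problem sizes, and the step-size schedule below are all hypothetical): minimize the l1 norm over the set of minimizers of the ensemble LS loss ||Ax - b||^2. The sketch uses the classical, deterministic hybrid steepest descent iteration with a diminishing step size, not the authors' stochastic, constant-step S-FM-HSDM; the nonexpansive mapping T, whose fixed-point set is the LS-solution set, is the metric projection onto that set, built from the pseudoinverse.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical toy problem: an underdetermined system, so the LS-solution
    # set {x : Ax = b} is an affine subspace with infinitely many points.
    m, n = 20, 50
    A = rng.standard_normal((m, n))
    x_sparse = np.zeros(n)
    x_sparse[rng.choice(n, 5, replace=False)] = rng.standard_normal(5)
    b = A @ x_sparse

    A_pinv = np.linalg.pinv(A)

    def T(x):
        # Nonexpansive mapping whose fixed-point set is argmin_x ||Ax - b||^2:
        # the metric projection onto that affine set.
        return x - A_pinv @ (A @ x - b)

    x = np.zeros(n)
    lam0 = 0.5
    for k in range(5000):
        y = T(x)                      # enforce the inner (LS) level
        g = np.sign(y)                # a subgradient of the outer l1 loss
        x = y - (lam0 / (k + 1)) * g  # diminishing-step HSDM update

    print("LS residual ||Ax - b||:", np.linalg.norm(A @ x - b))
    print("l1 norm of iterate:", np.linalg.norm(x, 1))
    print("l1 norm of sparse reference:", np.linalg.norm(x_sparse, 1))

    The projection keeps every iterate consistent with the inner LS level, while the small outer step drives the iterate toward a minimum-l1 point of that solution set; per the abstract, the paper's stochastic method instead exploits the constraints through noisy, fixed-point-based updates while retaining a constant step size.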