1,929 research outputs found
Semi-proximal Mirror-Prox for Nonsmooth Composite Minimization
We propose a new first-order optimization algorithm to solve high-dimensional
non-smooth composite minimization problems. Typical examples of such problems
have an objective that decomposes into a non-smooth empirical risk part and a
non-smooth regularization penalty. The proposed algorithm, called Semi-Proximal
Mirror-Prox, leverages the Fenchel-type representation of one part of the
objective while handling the other part of the objective via linear
minimization over the domain. The algorithm stands in contrast with more
classical proximal gradient algorithms with smoothing, which require the
computation of proximal operators at each iteration and can therefore be
impractical for high-dimensional problems. We establish the theoretical
convergence rate of Semi-Proximal Mirror-Prox, which attains the optimal
complexity bound on the number of calls to the linear minimization oracle. We
present promising experimental results showing the merits of the approach in
comparison to competing methods.
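To illustrate why a linear-minimization-oracle (LMO) step can be much cheaper than a proximal step, consider the simple case of the l1 ball: the LMO touches a single coordinate, while the proximal operator (soft-thresholding) touches every coordinate. This is a minimal sketch, not the paper's algorithm; the function names `lmo_l1` and `prox_l1` are illustrative.

```python
import numpy as np

def lmo_l1(grad, radius=1.0):
    """Linear minimization oracle over the l1 ball:
    argmin_{||s||_1 <= radius} <grad, s>.
    Returns a signed vertex -- only one coordinate is nonzero."""
    i = np.argmax(np.abs(grad))
    s = np.zeros_like(grad)
    s[i] = -radius * np.sign(grad[i])
    return s

def prox_l1(x, step):
    """Proximal operator of step * ||.||_1 (soft-thresholding).
    Touches every coordinate, which can be costly in high dimension."""
    return np.sign(x) * np.maximum(np.abs(x) - step, 0.0)

g = np.array([0.5, -2.0, 1.0])
print(lmo_l1(g, radius=1.0))  # -> [0., 1., 0.]
print(prox_l1(g, step=0.8))   # -> [0., -1.2, 0.2]
```

The LMO returns a single vertex of the feasible set, which is the structural reason conditional-gradient-type methods like Semi-Proximal Mirror-Prox scale to problems where full proximal computations are prohibitive.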
Exact block-wise optimization in group lasso and sparse group lasso for linear regression
The group lasso is a penalized regression method, used in regression problems
where the covariates are partitioned into groups to promote sparsity at the
group level. Existing methods for finding the group lasso estimator either use
gradient projection methods to update the entire coefficient vector
simultaneously at each step, or update one group of coefficients at a time
using an inexact line search to approximate the optimal value for the group of
coefficients when all other groups' coefficients are fixed. We present a new
method of computation for the group lasso in the linear regression case, the
Single Line Search (SLS) algorithm, which operates by computing the exact
optimal value for each group (when all other coefficients are fixed) with one
univariate line search. We perform simulations demonstrating that the SLS
algorithm is often more efficient than existing computational methods. We also
extend the SLS algorithm to the sparse group lasso problem via the Signed
Single Line Search (SSLS) algorithm, and give theoretical results to support
both algorithms.

Comment: We have been made aware of the earlier work by Puig et al. (2009),
which derives the same result for the (non-sparse) group lasso setting. We
leave this manuscript available as a technical report, to serve as a
reference for the previously untreated sparse group lasso case and for
timing comparisons of various methods in the group lasso setting. The
manuscript has been updated to include this reference.
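The exact block-wise update that the SLS line search computes has a closed form in the special case where a group's design matrix is orthonormal: block soft-thresholding of the partial residual correlation. The sketch below shows only this special case, not the paper's general SLS algorithm, and the function name is illustrative.

```python
import numpy as np

def group_soft_threshold(z, threshold):
    """Exact group-lasso block update when X_g^T X_g = I:
        beta_g = max(0, 1 - threshold / ||z||_2) * z,   z = X_g^T r,
    where r is the partial residual with all other groups' coefficients
    held fixed. The whole group is zeroed when ||z||_2 <= threshold,
    which is how the group lasso induces group-level sparsity."""
    norm_z = np.linalg.norm(z)
    if norm_z <= threshold:
        return np.zeros_like(z)  # entire group eliminated
    return (1.0 - threshold / norm_z) * z

z = np.array([3.0, 4.0])               # ||z||_2 = 5
print(group_soft_threshold(z, 10.0))   # -> [0., 0.]   (group eliminated)
print(group_soft_threshold(z, 2.5))    # -> [1.5, 2.0] (shrunk by factor 0.5)
```

For a general (non-orthonormal) group design, no such closed form exists, which is where a univariate line search over the group's coefficient scale, as in SLS, comes in.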