Convex Optimization Methods for Dimension Reduction and Coefficient Estimation in Multivariate Linear Regression
In this paper, we study convex optimization methods for computing the trace
norm regularized least squares estimate in multivariate linear regression. The
so-called factor estimation and selection (FES) method, recently proposed by
Yuan et al. [22], conducts parameter estimation and factor selection
simultaneously and has been shown to enjoy nice properties in both large and
finite samples. Computing the estimates, however, can be very challenging in
practice because of the high dimensionality and the trace norm constraint. In
this paper, we explore a variant of Nesterov's smooth method [20] and interior
point methods for computing the penalized least squares estimate. The
performance of these methods is then compared using a set of randomly generated
instances. We show that the variant of Nesterov's smooth method [20] generally
and substantially outperforms the interior point method implemented in SDPT3
version 4.0 (beta) [19]. Moreover, the former method is much more memory
efficient.
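The core computation behind trace-norm regularized least squares can be sketched with an accelerated proximal gradient iteration, whose proximal step is singular value soft-thresholding. This is a minimal illustration of that general idea, not the paper's specific smoothing scheme or its interior point formulation; all function names and parameters below are illustrative.

```python
import numpy as np

def svt(M, tau):
    """Singular value soft-thresholding: the prox of tau * trace norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def trace_norm_ls(X, Y, lam, steps=500):
    """Accelerated proximal gradient (Nesterov/FISTA-style) sketch for
    min_B 0.5 * ||X B - Y||_F^2 + lam * ||B||_*."""
    p, q = X.shape[1], Y.shape[1]
    L = np.linalg.norm(X, 2) ** 2            # Lipschitz constant of the smooth part
    B = Z = np.zeros((p, q))
    t = 1.0
    for _ in range(steps):
        G = X.T @ (X @ Z - Y)                # gradient of the least squares term at Z
        B_new = svt(Z - G / L, lam / L)      # proximal (soft-thresholding) step
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        Z = B_new + ((t - 1) / t_new) * (B_new - B)   # Nesterov extrapolation
        B, t = B_new, t_new
    return B
```

The soft-thresholding of singular values is what drives dimension reduction: small singular values of the iterate are set exactly to zero, yielding a low-rank coefficient estimate.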
Accelerated Inexact Composite Gradient Methods for Nonconvex Spectral Optimization Problems
This paper presents two inexact composite gradient methods, one inner
accelerated and another doubly accelerated, for solving a class of nonconvex
spectral composite optimization problems. More specifically, the objective
function for these problems is of the form $f_1 + f_2 + h$, where $f_1$ and
$f_2$ are differentiable nonconvex matrix functions with Lipschitz continuous
gradients, $h$ is a proper closed convex matrix function, and both $f_2$ and
$h$ can be expressed as functions that operate on the singular values of their
inputs. The methods essentially use an accelerated composite gradient method to
solve a sequence of proximal subproblems involving the linear approximation of
$f_1$ and the singular value functions underlying $f_2$ and $h$. Unlike other
composite gradient-based methods, the proposed methods take advantage of both
the composite and spectral structure underlying the objective function in order
to efficiently generate their solutions. Numerical experiments are presented to
demonstrate the practicality of these methods on a set of real-world and
randomly generated spectral optimization problems.
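The spectral structure the abstract refers to is the classical reduction: if $F(Z) = f(\sigma(Z))$ for an absolutely symmetric $f$, then the prox of $F$ is obtained by applying the prox of $f$ to the singular values and reassembling with the original singular vectors. A minimal sketch of that reduction (all names illustrative, not the paper's implementation):

```python
import numpy as np

def spectral_prox(M, vec_prox):
    """Prox of a spectral function F(Z) = f(sigma(Z)), computed by applying
    the prox of the (absolutely symmetric) vector function f to the singular
    values and reusing the singular vectors of M."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(vec_prox(s)) @ Vt
```

Exploiting this reduction means each proximal subproblem costs one SVD plus a cheap vector operation, instead of a full matrix optimization.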
A unified analysis of a class of proximal bundle methods for solving hybrid convex composite optimization problems
This paper presents a proximal bundle (PB) framework based on a generic
bundle update scheme for solving the hybrid convex composite optimization
(HCCO) problem and establishes a common iteration-complexity bound for any
variant belonging to it. As a consequence, iteration-complexity bounds for
three PB variants based on different bundle update schemes are obtained in the
HCCO context for the first time and in a unified manner. While two of the PB
variants are universal (i.e., their implementations do not require parameters
associated with the HCCO instance), the other, newly proposed one (as far as
the authors are aware) is not, but it has the advantage of generating simple,
namely one-cut, bundle models. The paper also presents a universal adaptive PB
variant (which is not necessarily an instance of the framework) based on
one-cut models and shows that its iteration-complexity is the same as the two
aforementioned universal PB variants.
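The generic proximal bundle iteration can be illustrated in one dimension: build a cutting-plane (bundle) model from linearizations, minimize it plus a proximal term around a stability center, and take a serious step only when the actual decrease matches a fraction of the decrease predicted by the model. This is a toy multi-cut sketch of that template, not the paper's one-cut scheme or its complexity analysis; f, g, t, and m below are illustrative.

```python
def model_prox(cuts, xc, t):
    """Minimize max of affine cuts + (1/(2t))(x - xc)^2 by ternary search
    (the objective is convex in 1D). Each cut is (f(xj), g(xj), xj)."""
    phi = lambda x: max(fx + gx * (x - xj) for fx, gx, xj in cuts) + (x - xc) ** 2 / (2 * t)
    B = t * max(abs(gx) for _, gx, _ in cuts) + 1.0   # bracket containing the minimizer
    lo, hi = xc - B, xc + B
    for _ in range(200):
        m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
        if phi(m1) <= phi(m2):
            hi = m2
        else:
            lo = m1
    return (lo + hi) / 2

def proximal_bundle(f, g, x0, t=1.0, iters=100, m=0.5):
    """Toy 1D proximal bundle method: f convex, g a subgradient oracle."""
    xc = x0                                 # stability (prox) center
    cuts = [(f(x0), g(x0), x0)]             # bundle of linearizations of f
    for _ in range(iters):
        x_trial = model_prox(cuts, xc, t)
        model_val = max(fx + gx * (x_trial - xj) for fx, gx, xj in cuts)
        pred = f(xc) - (model_val + (x_trial - xc) ** 2 / (2 * t))
        cuts.append((f(x_trial), g(x_trial), x_trial))  # refine the model
        if f(xc) - f(x_trial) >= m * pred:
            xc = x_trial                    # serious step: move the center
        # otherwise: null step, keep the center; the new cut improves the model
    return xc
```

A one-cut variant, as in the abstract, would replace the growing list of cuts with a single aggregated linearization, trading model accuracy for a bundle of constant size.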
- β¦