
    Model Consistency of Partly Smooth Regularizers

    This paper studies least-squares regression penalized with partly smooth convex regularizers. This class of functions is very large and versatile, allowing one to promote solutions conforming to some notion of low complexity. Indeed, such regularizers force solutions of variational problems to belong to a low-dimensional manifold (the so-called model), which is stable under small perturbations of the function. This property is crucial to make the underlying low-complexity model robust to small noise. We show that a generalized "irrepresentable condition" implies stable model selection under small noise perturbations in the observations and the design matrix, when the regularization parameter is tuned proportionally to the noise level. This condition is shown to be almost necessary. We then show that it implies model consistency of the regularized estimator: with probability tending to one as the number of measurements increases, the regularized estimator belongs to the correct low-dimensional model manifold. This work unifies and generalizes several previous ones, where model consistency is known to hold for sparse, group-sparse, total-variation and low-rank regularizations.
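    For orientation, in the sparse (Lasso) special case mentioned at the end of the abstract, the classical irrepresentable condition takes the form below. The notation (design $X$, true support $S$, sign vector $s_S$) is illustrative and not taken from the paper's general partly-smooth setting; it only indicates the kind of condition the paper generalizes.

```latex
% Classical irrepresentable condition for the Lasso (sparse case).
% Illustrative notation: X is the design matrix, S the true support,
% S^c its complement, and s_S the sign vector of the true coefficients on S.
\[
  \bigl\| X_{S^c}^{\top} X_S \, \bigl( X_S^{\top} X_S \bigr)^{-1} s_S \bigr\|_{\infty} < 1 .
\]
```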

    Generalized Hadamard Product and the Derivatives of Spectral Functions

    In this work we propose a generalization of the Hadamard product between two matrices to a tensor-valued, multi-linear product between $k$ matrices for any $k \ge 1$. A multi-linear dual operator to the generalized Hadamard product is presented. It is a natural generalization of the $\operatorname{Diag} x$ operator, which maps a vector $x \in \mathbb{R}^n$ into the diagonal matrix with $x$ on its main diagonal. Defining an action of the $n \times n$ orthogonal matrices on the space of $k$-dimensional tensors, we investigate its interactions with the generalized Hadamard product and its dual. The research is motivated, as illustrated throughout the paper, by the apparent suitability of this language to describe the higher-order derivatives of spectral functions and the tools needed to compute them. For more on the latter we refer the reader to [14] and [15], where we use the language and properties developed here to study the higher-order derivatives of spectral functions.
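    As a concrete reference point, here is a short sketch of the two classical objects being generalized (the entrywise Hadamard product of two matrices and the Diag x operator), together with one natural action of an orthogonal matrix on a $k$-dimensional tensor, namely applying $U$ along every mode. This is an illustrative guess at the setting; the paper's tensor-valued $k$-fold product and its chosen group action may be defined differently.

```python
import numpy as np

def hadamard(A, B):
    """Classical entrywise Hadamard product of two matrices."""
    return A * B

def diag_op(x):
    """Diag x: map a vector x in R^n to the n-by-n diagonal matrix with x on its diagonal."""
    return np.diag(x)

def orth_action(U, T):
    """Apply an orthogonal matrix U along every mode of a k-dimensional tensor T.
    This is one natural action of the orthogonal group on k-tensors; the action
    defined in the paper may differ."""
    for mode in range(T.ndim):
        T = np.moveaxis(np.tensordot(U, T, axes=([1], [mode])), 0, mode)
    return T

# Sanity check: for a matrix (k = 2) this action reduces to U A U^T.
rng = np.random.default_rng(0)
U, _ = np.linalg.qr(rng.standard_normal((4, 4)))
A = rng.standard_normal((4, 4))
assert np.allclose(orth_action(U, A), U @ A @ U.T)
```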

    A feasible smoothing accelerated projected gradient method for nonsmooth convex optimization

    Smoothing accelerated gradient methods achieve faster convergence rates than the subgradient method for some nonsmooth convex optimization problems. However, Nesterov's extrapolation may require gradients at infeasible points, so these methods cannot be applied to some structural optimization problems. We introduce a variant of smoothing accelerated projected gradient methods in which every iterate is feasible. An $O(k^{-1}\log k)$ convergence rate is obtained using a Lyapunov function. We conduct a numerical experiment on the robust compliance optimization of a truss structure.
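    Below is a minimal sketch of the kind of method described here: accelerated projected gradient applied to a Huber-smoothed $\ell_1$ objective over a box, where the extrapolated point is also projected so that every gradient is evaluated at a feasible point. The test problem, the Huber smoothing, and the way feasibility of the extrapolation is enforced are illustrative choices, not the paper's actual algorithm.

```python
import numpy as np

def project_box(z, lo=0.0, hi=1.0):
    """Euclidean projection onto the box [lo, hi]^n."""
    return np.clip(z, lo, hi)

def huber_grad(r, mu):
    """Gradient of the Huber smoothing of |r| with smoothing parameter mu."""
    return np.where(np.abs(r) <= mu, r / mu, np.sign(r))

def smoothing_apg(A, b, mu=1e-2, steps=500):
    """Accelerated projected gradient on f_mu(x) = sum_i huber_mu(a_i^T x - b_i)
    over the box [0, 1]^n.  Both the iterate and the extrapolated point are
    projected, so every point at which a gradient is evaluated is feasible
    (a simple heuristic; the paper's update rule may differ)."""
    n = A.shape[1]
    L = np.linalg.norm(A, 2) ** 2 / mu      # Lipschitz constant of grad f_mu
    x = x_prev = np.zeros(n)
    for k in range(1, steps + 1):
        beta = (k - 1) / (k + 2)
        y = project_box(x + beta * (x - x_prev))   # feasible extrapolation point
        g = A.T @ huber_grad(A @ y - b, mu)
        x_prev, x = x, project_box(y - g / L)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((30, 10))
    b = A @ rng.uniform(size=10) + 0.01 * rng.standard_normal(30)
    x = smoothing_apg(A, b)
    print("residual l1 norm:", np.abs(A @ x - b).sum())
```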