
    Sharp RIP Bound for Sparse Signal and Low-Rank Matrix Recovery

    This paper establishes a sharp condition on the restricted isometry property (RIP) for both sparse signal recovery and low-rank matrix recovery. It is shown that if the measurement matrix $A$ satisfies the RIP condition $\delta_k^A < 1/3$, then all $k$-sparse signals $\beta$ can be recovered exactly via constrained $\ell_1$ minimization based on $y = A\beta$. Similarly, if the linear map $\mathcal{M}$ satisfies the RIP condition $\delta_r^{\mathcal{M}} < 1/3$, then all matrices $X$ of rank at most $r$ can be recovered exactly via constrained nuclear norm minimization based on $b = \mathcal{M}(X)$. Furthermore, in both cases it is not possible to do so in general when the condition does not hold. In addition, noisy cases are considered and oracle inequalities are given under the sharp RIP condition.
    Comment: to appear in Applied and Computational Harmonic Analysis (2012)
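
    The abstract's recovery procedure in the sparse case is constrained $\ell_1$ minimization, which can be recast as a linear program. The following is a minimal sketch, not from the paper: the Gaussian measurement matrix, the dimensions, and the SciPy solver are illustrative assumptions. The low-rank analogue replaces the $\ell_1$ norm with the nuclear norm.

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, y):
    """Solve min ||beta||_1 subject to A @ beta = y as a linear program.

    Split beta = u - v with u, v >= 0, so that ||beta||_1 = sum(u + v).
    """
    n, p = A.shape
    c = np.ones(2 * p)                        # objective: sum(u) + sum(v)
    A_eq = np.hstack([A, -A])                 # A @ (u - v) = y
    res = linprog(c, A_eq=A_eq, b_eq=y,
                  bounds=[(0, None)] * (2 * p), method="highs")
    u, v = res.x[:p], res.x[p:]
    return u - v

# Illustrative example: recover a k-sparse signal from Gaussian measurements.
rng = np.random.default_rng(0)
n, p, k = 50, 200, 5
A = rng.standard_normal((n, p)) / np.sqrt(n)  # measurement matrix
beta = np.zeros(p)
beta[rng.choice(p, k, replace=False)] = rng.standard_normal(k)
beta_hat = basis_pursuit(A, A @ beta)
print(np.max(np.abs(beta_hat - beta)))        # near zero on exact recovery
```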

    Sharp Oracle Inequalities for Square Root Regularization

    We study a class of regularization methods for high-dimensional linear regression models. These penalized estimators take the square root of the residual sum of squared errors as loss function and any weakly decomposable norm as penalty function. This loss is chosen because the resulting estimator does not depend on the unknown standard deviation of the noise. The generalized weakly decomposable norm penalty, in turn, can accommodate different underlying sparsity structures: one can choose a sparsity-inducing norm according to how the unknown parameter vector $\beta$ is to be interpreted. Structured sparsity norms, as defined in Micchelli et al. [18], are special cases of weakly decomposable norms, so the class also includes the square root LASSO (Belloni et al. [3]), the group square root LASSO (Bunea et al. [10]) and a new method called the square root SLOPE (constructed in analogy with the SLOPE of Bogdan et al. [6]). For this collection of estimators our results provide sharp oracle inequalities based on the Karush-Kuhn-Tucker conditions. We discuss some examples of estimators, and a simulation illustrates some advantages of the square root SLOPE.
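
    To make the estimator concrete, here is a minimal sketch of the square root LASSO, the simplest special case in the class; CVXPY, the simulated data, and the value of the tuning parameter `lam` are assumptions made for this illustration, not prescriptions from the paper. Replacing the $\ell_1$ penalty with the sorted $\ell_1$ norm would give the square root SLOPE.

```python
import numpy as np
import cvxpy as cp

def sqrt_lasso(X, y, lam):
    """Square root LASSO: min ||y - X beta||_2 / sqrt(n) + lam * ||beta||_1.

    Because the loss is the root of the mean residual sum of squares,
    the choice of lam does not depend on the unknown noise level.
    """
    n, p = X.shape
    beta = cp.Variable(p)
    obj = cp.norm(y - X @ beta, 2) / np.sqrt(n) + lam * cp.norm(beta, 1)
    cp.Problem(cp.Minimize(obj)).solve()
    return beta.value

# Illustrative example with a 3-sparse truth.
rng = np.random.default_rng(1)
n, p = 100, 50
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + 0.5 * rng.standard_normal(n)
print(np.round(sqrt_lasso(X, y, lam=0.2)[:5], 2))
```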

    Link Prediction in Graphs with Autoregressive Features

    In this paper, we consider the problem of link prediction in time-evolving graphs. We assume that certain graph features, such as the node degree, follow a vector autoregressive (VAR) model, and we propose to use this information to improve the accuracy of prediction. Our strategy involves a joint optimization procedure over the space of adjacency matrices and VAR matrices which takes into account both the sparsity and the low-rank properties of the matrices. Oracle inequalities are derived and illustrate the trade-offs in the choice of smoothing parameters when modeling the joint effect of sparsity and low rank. The estimate is computed efficiently using proximal methods through a generalized forward-backward algorithm.
    Comment: NIPS 2012
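
    The proximal maps are the building blocks of the optimization the abstract describes. Below is a minimal sketch of the two operators the penalties call for: entrywise soft-thresholding for the $\ell_1$ (sparsity) term and singular value thresholding for the nuclear norm (low rank) term. It is not the paper's full generalized forward-backward algorithm, and the test matrix and thresholds are illustrative.

```python
import numpy as np

def prox_l1(Z, t):
    """Prox of t * ||.||_1: entrywise soft-thresholding."""
    return np.sign(Z) * np.maximum(np.abs(Z) - t, 0.0)

def prox_nuclear(Z, t):
    """Prox of t * nuclear norm: soft-threshold the singular values."""
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return U @ np.diag(np.maximum(s - t, 0.0)) @ Vt

# A generalized forward-backward scheme alternates a gradient step on the
# smooth fit term with proximal maps like these on auxiliary copies.
rng = np.random.default_rng(2)
W = rng.standard_normal((10, 10))
print(np.linalg.matrix_rank(prox_nuclear(W, t=2.0)))  # rank is reduced
print(np.count_nonzero(prox_l1(W, t=1.0)))            # entries are zeroed
```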