
    Local Behavior of Sparse Analysis Regularization: Applications to Risk Estimation

    In this paper, we aim at recovering an unknown signal x0 from noisy measurements y = Phi x0 + w, where Phi is an ill-conditioned or singular linear operator and w accounts for some noise. To regularize such an ill-posed inverse problem, we impose an analysis sparsity prior. More precisely, the recovery is cast as a convex optimization program whose objective is the sum of a quadratic data-fidelity term and a regularization term formed of the L1-norm of the correlations between the sought-after signal and the atoms of a given (generally overcomplete) dictionary. This L1 analysis sparsity prior is weighted by a regularization parameter lambda > 0. We prove that any minimizer of this problem is a piecewise-affine function of the observations y and of the regularization parameter lambda. As a byproduct, we exploit these properties to obtain an objectively guided choice of lambda. In particular, we develop an extension of the Generalized Stein Unbiased Risk Estimator (GSURE) and show that it is an unbiased and reliable estimator of an appropriately defined risk, which encompasses special cases such as the prediction risk, the projection risk, and the estimation risk. We apply these risk estimators to the special case of L1 analysis regularization. We also discuss implementation issues and propose fast algorithms to solve the L1 analysis minimization problem and to compute the associated GSURE. We finally illustrate the applicability of our framework to parameter selection on several imaging problems.
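    In the notation above, the program is min_x (1/2)||y - Phi x||^2 + lambda ||D^T x||_1, with D the dictionary whose columns are the atoms. The NumPy sketch below is a generic ADMM solver for this program, offered only as an illustration: the splitting z = D^T x, the solver choice, and all names are assumptions here, not the paper's fast algorithms.

```python
import numpy as np

def soft_threshold(v, t):
    # Entrywise soft-thresholding: the proximal operator of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def analysis_l1_admm(Phi, D, y, lam, rho=1.0, n_iter=500):
    # Illustrative sketch: solve  min_x 0.5*||y - Phi x||^2 + lam*||D^T x||_1
    # by ADMM with the splitting z = D^T x (not the paper's algorithm).
    p = D.shape[1]                       # number of dictionary atoms
    z = np.zeros(p)
    u = np.zeros(p)                      # scaled dual variable
    A = Phi.T @ Phi + rho * (D @ D.T)    # normal-equation matrix for the x-update
    Phity = Phi.T @ y
    for _ in range(n_iter):
        x = np.linalg.solve(A, Phity + rho * (D @ (z - u)))
        Dtx = D.T @ x
        z = soft_threshold(Dtx + u, lam / rho)
        u += Dtx - z
    return x
```

    Rerunning such a solver over a grid of lambda values traces the solution path whose piecewise-affine structure in (y, lambda) the paper establishes, which is what makes a risk-guided (GSURE) choice of lambda tractable.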

    Geometric Duality for Convex Vector Optimization Problems

    Geometric duality theory for multiple-objective linear programming problems has turned out to be very useful for the development of efficient algorithms that generate or approximate the whole set of nondominated points in the outcome space. This article extends that geometric duality theory to convex vector optimization problems.
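    For context, a convex vector optimization problem and the nondominated points mentioned above can be written in standard notation (assumed here for illustration, not taken from the article):

```latex
% Generic convex vector optimization problem, minimized with respect to
% an ordering cone C (e.g. C = R^q_+); notation assumed for illustration.
\begin{align*}
  &\min\nolimits_C \; f(x) \quad \text{s.t. } x \in S,
  \qquad S \subseteq \mathbb{R}^n \text{ convex},\;
  f : \mathbb{R}^n \to \mathbb{R}^q \text{ $C$-convex};\\
  &\text{a point } \bar y \in f(S) \text{ is nondominated iff }
  \bigl(\bar y - C \setminus \{0\}\bigr) \cap f(S) = \emptyset .
\end{align*}
```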

    Higher Weak Derivatives and Reflexive Algebras of Operators

    Let D be a self-adjoint operator on a Hilbert space H and x a bounded operator on H. We say that x is n-times weakly D-differentiable if, for any pair of vectors a, b from H, the function t -> <e^{itD} x e^{-itD} a, b> is n-times differentiable. We give several characterizations of this property, among which one is original. The results are used to show that, for a von Neumann algebra M on H, the subalgebra of n-times weakly D-differentiable operators has a representation as a reflexive algebra of operators on a bigger Hilbert space.
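    For concreteness, the first weak D-derivative has the familiar commutator form; the following formal computation is standard and is given here as an illustration, not quoted from the paper:

```latex
% Differentiating the matrix coefficient t -> <e^{itD} x e^{-itD} a, b>
% (valid weakly, e.g. for a, b in the domain of D):
\frac{d}{dt}\,\bigl\langle e^{itD}\, x\, e^{-itD} a,\; b \bigr\rangle
  = \bigl\langle e^{itD}\, i[D, x]\, e^{-itD} a,\; b \bigr\rangle,
\qquad [D, x] := Dx - xD,
% so the first weak derivative is the commutator i[D, x] taken in the
% form sense, and n-fold weak differentiability iterates this n times.
```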