Robust sparse analysis regularization

Abstract

This work studies some properties of ℓ^1-analysis regularization for the resolution of linear inverse problems. Analysis regularization minimizes the ℓ^1 norm of the correlations between the signal and the atoms of the dictionary. The corresponding variational problem includes several well-known regularizations such as the discrete total variation and the fused lasso. We give sufficient conditions under which analysis regularization is robust to noise.

ANALYSIS VERSUS SYNTHESIS

Variational regularization is a popular way to compute an approximation of x_0 ∈ R^N from the measurements y ∈ R^Q defined by an inverse problem y = Φ x_0 + w, where w is some additive noise and Φ is a linear operator, for instance a super-resolution or an inpainting operator. A dictionary D ∈ R^{N×P} is used to synthesize a signal x = D α from coefficients α ∈ R^P. Common examples of dictionaries in signal processing include the wavelet transform and the finite-difference operator.

Synthesis regularization corresponds to the following minimization problem:

    min_{α ∈ R^P} 1/2 ||y − Ψ α||_2^2 + λ ||α||_1,

where Ψ = Φ D and x = D α. Properties of the synthesis prior have been studied intensively.

Analysis regularization corresponds to the following minimization problem:

    min_{x ∈ R^N} 1/2 ||y − Φ x||_2^2 + λ ||D^* x||_1.

In the noiseless case, w = 0, one uses the constrained optimization problem

    min_{x ∈ R^N} ||D^* x||_1   subject to   Φ x = y.

This prior has been less studied than the synthesis prior.

UNION OF SUBSPACES MODEL

It is natural to keep track of the support of the correlation vector D^* x, as done in the following definition. A signal x such that D^* x is sparse lives in a cospace G_J of small dimension, where G_J is defined as follows.

Definition 2. Given a dictionary D and a subset J of {1, …, P}, the cospace G_J is defined as

    G_J = Ker D_J^* = { x ∈ R^N : D_J^* x = 0 },

where D_J is the subdictionary whose columns are indexed by J.

The signal space can thus be decomposed as a union of subspaces of increasing dimensions:

    R^N = ∪_{0 ≤ k ≤ N} Θ_k,   where   Θ_k = ∪ { G_J : dim G_J = k },

For the 1-D total variation prior, Θ_k is the set of piecewise constant signals with k − 1 steps.
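As a concrete illustration of analysis regularization, the sketch below solves the 1-D total variation problem, i.e. the analysis problem with Φ = Id and D^* the finite-difference operator, by projected gradient on the dual (a Chambolle-style scheme). This is an illustrative sketch, not an algorithm from this work; the function names (`tv_denoise`, `grad`, `grad_adj`) and the step size (tau = 0.24 < 2/||D||^2, using ||D D^*|| ≤ 4 for 1-D differences) are my own choices.

```python
import numpy as np

def grad(x):
    # Analysis operator D^*: finite differences, maps R^N -> R^(N-1)
    return np.diff(x)

def grad_adj(p):
    # Adjoint of np.diff: maps R^(N-1) back to R^N
    return np.concatenate(([-p[0]], p[:-1] - p[1:], [p[-1]]))

def tv_denoise(y, lam, n_iter=2000, tau=0.24):
    """Solve min_x 1/2 ||y - x||^2 + lam ||diff(x)||_1 via projected
    gradient on the dual: min_p 1/2 ||y - D p||^2 s.t. ||p||_inf <= lam,
    with the primal solution recovered as x = y - D p."""
    p = np.zeros(len(y) - 1)
    for _ in range(n_iter):
        x = y - grad_adj(p)                        # primal iterate from dual variable
        p = np.clip(p + tau * grad(x), -lam, lam)  # gradient step + projection onto the l-inf ball
    return y - grad_adj(p)
```

On a noisy step signal, the recovered x has a sparse difference vector D^* x: the small within-segment fluctuations are flattened and only the true jump survives, which is exactly the behavior the total variation prior promotes.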
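The cospace and union-of-subspaces statements can be checked numerically for the 1-D total variation dictionary. The sketch below (my own construction, not code from this work) builds the dictionary D whose columns are finite-difference atoms, takes a piecewise constant signal x with k − 1 = 2 steps, and verifies that x lies in a cospace G_J of dimension k = 3, computed as dim Ker D_J^* = N − rank(D_J).

```python
import numpy as np

N = 8
# Dictionary D in R^{N x P} with P = N - 1 finite-difference atoms:
# column j has -1 at row j and +1 at row j + 1, so D^T x = np.diff(x).
D = np.zeros((N, N - 1))
for j in range(N - 1):
    D[j, j], D[j + 1, j] = -1.0, 1.0

x = np.array([2.0, 2.0, 2.0, -1.0, -1.0, 7.0, 7.0, 7.0])  # 3 pieces, 2 steps

I = np.flatnonzero(np.diff(x))            # support of the correlation vector D^* x
J = np.setdiff1d(np.arange(N - 1), I)     # complement: indices where D^* x vanishes

DJ = D[:, J]                              # subdictionary indexed by J
dim_GJ = N - np.linalg.matrix_rank(DJ)    # dim G_J = dim Ker D_J^*
```

Here |I| = 2 jumps give dim G_J = 3, matching the claim that piecewise constant signals with k − 1 steps belong to Θ_k.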
