On the stable recovery of the sparsest overcomplete representations in presence of noise
Let x be a signal to be sparsely decomposed over a redundant dictionary A,
i.e., a sparse coefficient vector s has to be found such that x=As. It is known
that this problem is inherently unstable against noise, and to overcome this
instability, the authors of [Stable Recovery; Donoho et al., 2006] have
proposed to use an "approximate" decomposition, that is, a decomposition
satisfying ||x - A s|| < \delta, rather than satisfying the exact equality x =
As. Then, they have shown that if there is a decomposition with ||s||_0 <
(1+M^{-1})/2, where M denotes the coherence of the dictionary, this
decomposition would be stable against noise. On the other hand, it is known
that a sparse decomposition with ||s||_0 < spark(A)/2 is unique. In other
words, although a decomposition with ||s||_0 < spark(A)/2 is unique, its
stability against noise has been proved only for the far more restrictive
decompositions satisfying ||s||_0 < (1+M^{-1})/2, because usually (1+M^{-1})/2
<< spark(A)/2.
This limitation may not have been very important before, because ||s||_0 <
(1+M^{-1})/2 is also the bound that guarantees that the sparse decomposition
can be found via minimizing the L1 norm, a classic approach for sparse
decomposition. However, with the availability of new algorithms for sparse
decomposition, namely SL0 and Robust-SL0, it would be important to know whether
or not unique sparse decompositions with (1+M^{-1})/2 < ||s||_0 < spark(A)/2
are stable. In this paper, we show that such decompositions are indeed stable.
In other words, we extend the stability bound from ||s||_0 < (1+M^{-1})/2 to
the whole uniqueness range ||s||_0 < spark(A)/2. In summary, we show that "all
unique sparse decompositions are stably recoverable". Moreover, we see that
sparser decompositions are "more stable".Comment: Accepted in IEEE Trans on SP on 4 May 2010. (c) 2010 IEEE. Personal
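The gap between the coherence bound (1+M^{-1})/2 and the uniqueness bound spark(A)/2 is easy to see numerically. The following hypothetical Python sketch (a random dictionary of our own choosing, not from the paper) computes the mutual coherence M and the two bounds; since computing the spark exactly is NP-hard, only its classical coherence-based lower bound spark(A) >= 1 + 1/M is reported.

```python
# Hypothetical illustration: mutual coherence of a dictionary and the two
# sparsity bounds discussed above. The random dictionary is illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n, m = 20, 50                      # underdetermined: 20 x 50 dictionary
A = rng.standard_normal((n, m))
A /= np.linalg.norm(A, axis=0)     # unit-norm atoms

G = np.abs(A.T @ A)                # Gram matrix of the normalized atoms
np.fill_diagonal(G, 0.0)
M = G.max()                        # mutual coherence

coherence_bound = 0.5 * (1.0 + 1.0 / M)   # stability bound of Donoho et al.
spark_lower = 1.0 + 1.0 / M               # spark(A) >= 1 + 1/M

print(f"coherence M     = {M:.3f}")
print(f"(1 + 1/M)/2     = {coherence_bound:.2f}")
print(f"spark(A)/2     >= {spark_lower / 2:.2f}  (coherence lower bound; the exact spark is NP-hard)")
```

For typical random dictionaries the first bound is close to 1 while spark(A)/2 can be as large as (n+1)/2, which is exactly the gap the paper closes.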
Dynamical spectral unmixing of multitemporal hyperspectral images
In this paper, we consider the problem of unmixing a time series of
hyperspectral images. We propose a dynamical model based on linear mixing
processes at each time instant. The spectral signatures and fractional
abundances of the pure materials in the scene are seen as latent variables, and
assumed to follow a general dynamical structure. Based on a simplified version
of this model, we derive an efficient spectral unmixing algorithm to estimate
the latent variables by performing alternating minimizations. The performance
of the proposed approach is demonstrated on synthetic and real multitemporal
hyperspectral images.
Comment: 13 pages, 10 figures.
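As a rough illustration of the linear-mixing and alternating-minimization idea (not the authors' algorithm, which couples frames through an explicit latent dynamical structure), the hypothetical sketch below alternates nonnegative least-squares updates within each frame and simply reuses the previous frame's signatures as a warm start; the projection, iteration counts, and warm-start coupling are assumptions made for illustration.

```python
# Minimal, hypothetical sketch: per-frame linear unmixing by alternating
# minimization over abundances and spectral signatures.
import numpy as np

def unmix_time_series(X_seq, R, n_iter=50, seed=0):
    """X_seq: list of (bands x pixels) frames; R: number of endmembers.
    Returns, per frame, signatures S_t (bands x R) and abundances A_t (R x pixels)."""
    rng = np.random.default_rng(seed)
    results = []
    S = np.abs(rng.standard_normal((X_seq[0].shape[0], R)))   # warm start, carried over time
    for X in X_seq:
        A = np.abs(rng.standard_normal((R, X.shape[1])))
        for _ in range(n_iter):
            # abundance step: least squares given S, projected to be nonnegative
            A = np.clip(np.linalg.lstsq(S, X, rcond=None)[0], 0.0, None)
            # signature step: least squares given A, projected to be nonnegative
            S = np.clip(np.linalg.lstsq(A.T, X.T, rcond=None)[0].T, 0.0, None)
        results.append((S.copy(), A))
    return results
```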
A fast approach for overcomplete sparse decomposition based on smoothed L0 norm
In this paper, a fast algorithm for overcomplete sparse decomposition, called
SL0, is proposed. The algorithm is essentially a method for obtaining sparse
solutions of underdetermined systems of linear equations, and its applications
include underdetermined Sparse Component Analysis (SCA), atomic decomposition
on overcomplete dictionaries, compressed sensing, and decoding real field
codes. Contrary to previous methods, which usually solve this problem by
minimizing the L1 norm using Linear Programming (LP) techniques, our algorithm
tries to directly minimize the L0 norm. It is experimentally shown that the
proposed algorithm is about two to three orders of magnitude faster than the
state-of-the-art interior-point LP solvers, while providing the same (or
better) accuracy.
Comment: Accepted in IEEE Transactions on Signal Processing. For MATLAB code, see http://ee.sharif.ir/~SLzero. File replaced because Fig. 5 was erroneously missing.
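A minimal sketch of the smoothed-L0 idea follows: approximate the L0 norm by Gaussian functions of decreasing width sigma, take a few gradient steps on the smooth surrogate, and project back onto the constraint set {s : As = x}. The step size, sigma schedule, and iteration counts below are illustrative guesses, not the parameters used in the paper.

```python
# Hypothetical sketch of smoothed-L0 minimization for underdetermined x = As.
import numpy as np

def sl0_sketch(A, x, sigma_decrease=0.5, n_sigma=10, inner_iters=3, mu=2.0):
    A_pinv = A.T @ np.linalg.inv(A @ A.T)    # pseudoinverse of a full-row-rank A
    s = A_pinv @ x                           # minimum-L2-norm feasible start
    sigma = 2.0 * np.max(np.abs(s))
    for _ in range(n_sigma):
        for _ in range(inner_iters):
            delta = s * np.exp(-s**2 / (2.0 * sigma**2))   # gradient of the Gaussian surrogate
            s = s - mu * delta                             # move toward a sparser s
            s = s - A_pinv @ (A @ s - x)                   # project back onto {s : As = x}
        sigma *= sigma_decrease                            # graduated non-convexity
    return s
```

The key design point is graduated non-convexity: for large sigma the surrogate is smooth and easy to optimize, and as sigma shrinks it approaches the L0 count, so the iterate is steered toward a sparse feasible solution without combinatorial search.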
ECG Denoising using Angular Velocity as a State and an Observation in an Extended Kalman Filter Framework
In this paper, an efficient filtering procedure based on the Extended Kalman Filter (EKF) is proposed. The method is based on a modified nonlinear dynamic model, previously introduced for the generation of synthetic ECG signals. The proposed method treats the angular velocity of the ECG signal as one of the states of the EKF. We consider two cases for the observation equations: in the first, an observation corresponding to the angular velocity state is assumed; in the second, no observation is assumed for it. Quantitative evaluation of the proposed algorithm on the MIT-BIH Normal Sinus Rhythm Database (NSRDB) shows that an average SNR improvement of 8 dB is achieved for an input SNR of -4 dB
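To make the state/observation structure concrete, here is a generic, hypothetical EKF predict/update step; the ECG dynamic model and the angular-velocity state of the paper are not reproduced, and all function and parameter names are placeholders.

```python
# Generic extended Kalman filter cycle (illustrative sketch, not the paper's model).
import numpy as np

def ekf_step(x, P, z, f, F_jac, h, H_jac, Q, R):
    """x, P: prior state estimate and covariance; z: new measurement;
    f, h: state-transition and observation functions; F_jac, H_jac: their Jacobians;
    Q, R: process and measurement noise covariances."""
    # predict
    x_pred = f(x)
    F = F_jac(x)
    P_pred = F @ P @ F.T + Q
    # update
    y = z - h(x_pred)                     # innovation
    H = H_jac(x_pred)
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

Including the angular velocity in the state vector x, with or without a matching row in h, is what distinguishes the two observation cases described above.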
Joint Independent Subspace Analysis: A Quasi-Newton Algorithm
In this paper, we present a quasi-Newton (QN) algorithm for joint independent subspace analysis (JISA). JISA is a recently proposed generalization of independent vector analysis (IVA). JISA extends classical blind source separation (BSS) to jointly resolve several BSS problems by exploiting statistical dependence between latent sources across mixtures, as well as relaxing the assumption of statistical independence within each mixture. Algebraically, JISA based on second-order statistics amounts to coupled block diagonalization of a set of covariance and cross-covariance matrices, as well as block diagonalization of a single permuted covariance matrix. The proposed QN algorithm asymptotically achieves the minimal mean square error (MMSE) in the separation of multidimensional Gaussian components. Numerical experiments demonstrate convergence and source separation properties of the proposed algorithm
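The following hypothetical sketch illustrates only the second-order-statistics viewpoint mentioned above: after demixing and stacking the datasets, the joint covariance should be block diagonal once corresponding multidimensional sources are gathered together. The off-block-diagonal energy computed here is a diagnostic of our own devising, not the quasi-Newton algorithm of the paper.

```python
# Hypothetical diagnostic for the coupled block-diagonalization viewpoint of JISA.
import numpy as np

def off_block_energy(W_list, X_list, block_sizes):
    """W_list: one demixing matrix per dataset, each (sum(block_sizes) x channels);
    X_list: (channels x samples) data per dataset;
    block_sizes: sizes of the source subspaces, shared across datasets."""
    Y = np.vstack([W @ X for W, X in zip(W_list, X_list)])   # stacked demixed data
    C = np.cov(Y)                                            # joint covariance
    D = sum(block_sizes)                                     # components per dataset
    mask = np.ones_like(C, dtype=bool)
    start = 0
    for b in block_sizes:
        # the block of one multidimensional source spans its b components in every dataset
        idx = np.concatenate([start + k * D + np.arange(b) for k in range(len(X_list))])
        mask[np.ix_(idx, idx)] = False
        start += b
    return np.linalg.norm(C[mask])   # energy outside the permitted blocks
```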
Joint Independent Subspace Analysis Using Second-Order Statistics
This paper deals with a novel generalization of classical blind source separation (BSS) in two directions. First, it relaxes the constraint that the latent sources must be statistically independent; this generalization is well known and sometimes termed independent subspace analysis (ISA). Second, it jointly analyzes several ISA problems, where the link is due to statistical dependence among corresponding sources in different mixtures. When the data are one-dimensional, i.e., multiple classical BSS problems, this model, known as independent vector analysis (IVA), has already been studied. In this paper, we combine IVA with ISA and term this new model joint independent subspace analysis (JISA). We provide a full performance analysis of JISA, including closed-form expressions for the minimal mean square error (MSE), Fisher information, and Cramér-Rao lower bound, in the separation of Gaussian data. The derived MSE also applies to non-Gaussian data when only second-order statistics are used. We generalize previously known results on IVA, including its ability to uniquely resolve instantaneous mixtures of real Gaussian stationary data and its property of having the same arbitrary permutation at all mixtures. Numerical experiments validate our theoretical results and show the gain with respect to two competing approaches that either use a finer block partition or a different norm