Reliable recovery of hierarchically sparse signals for Gaussian and Kronecker product measurements
We propose and analyze a solution to the problem of recovering a block sparse
signal with sparse blocks from linear measurements. Such problems naturally
emerge inter alia in the context of mobile communication, in order to meet the
scalability and low complexity requirements of massive antenna systems and
massive machine-type communication. We introduce a new variant of the Hard
Thresholding Pursuit (HTP) algorithm referred to as HiHTP. We provide both a
proof of convergence and a recovery guarantee for noisy Gaussian measurements
that exhibit an improved asymptotic scaling in terms of the sampling complexity
in comparison with the usual HTP algorithm. Furthermore, hierarchically sparse
signals and Kronecker product structured measurements naturally arise together
in a variety of applications. We establish the efficient reconstruction of
hierarchically sparse signals from Kronecker product measurements using the
HiHTP algorithm. Additionally, we provide analytical results that connect our
recovery conditions to generalized coherence measures. Again, our recovery
results exhibit substantial improvement in the asymptotic sampling complexity
scaling over the standard setting. Finally, we validate in numerical
experiments that for hierarchically sparse signals, HiHTP performs
significantly better than HTP.
Comment: 11+4 pages, 5 figures. V3: Incomplete funding information and minor
typos corrected. V4: Change of title and additional author Axel Flinth.
Included new results on Kronecker product measurements and relations of HiRIP
to hierarchical coherence measures. Improved presentation of general
hierarchically sparse signals and correction of a minor typo.
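The core algorithmic idea can be sketched in a few lines of NumPy: a hierarchical thresholding operator (keep the sigma best entries per block, then the s most energetic blocks) plugged into an HTP-style iteration of gradient step, support detection, and least squares on the detected support. This is a minimal illustration under assumed conventions (row-major block layout, unit step size, invented function names), not the authors' reference implementation:

```python
import numpy as np

def hierarchical_threshold(x, num_blocks, s, sigma):
    """Project x onto (s, sigma)-hierarchically sparse vectors: within each
    block keep the sigma largest-magnitude entries, then keep the s blocks
    whose truncated energy is largest."""
    blocks = x.reshape(num_blocks, -1)
    truncated = np.zeros_like(blocks)
    for i, b in enumerate(blocks):
        keep = np.argsort(np.abs(b))[-sigma:]     # sigma largest entries
        truncated[i, keep] = b[keep]
    best = np.argsort(np.sum(truncated**2, axis=1))[-s:]  # s best blocks
    out = np.zeros_like(blocks)
    out[best] = truncated[best]
    return out.reshape(-1)

def hihtp(y, A, num_blocks, s, sigma, iters=50):
    """HTP-style iteration with the hierarchical thresholding operator:
    gradient step, hierarchical support detection, then a least-squares
    debiasing step restricted to the detected support."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        proxy = hierarchical_threshold(x + A.T @ (y - A @ x),
                                       num_blocks, s, sigma)
        support = np.flatnonzero(proxy)
        x = np.zeros(A.shape[1])
        x[support] = np.linalg.lstsq(A[:, support], y, rcond=None)[0]
    return x
```

For a Kronecker-structured matrix A = B ⊗ C, the matrix-vector products above can additionally be computed blockwise without ever forming A, which is part of what makes the hierarchical setting attractive in practice.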
Lorentzian Iterative Hard Thresholding: Robust Compressed Sensing with Prior Information
Commonly employed reconstruction algorithms in compressed sensing (CS) use
the ℓ2 norm as the metric for the residual error. However, it is well-known
that least squares (LS) based estimators are highly sensitive to outliers
present in the measurement vector leading to a poor performance when the noise
no longer follows the Gaussian assumption but, instead, is better characterized
by heavier-than-Gaussian tailed distributions. In this paper, we propose a
robust Iterative Hard Thresholding (IHT) algorithm for reconstructing sparse
signals in the presence of impulsive noise. To address this problem, we use a
Lorentzian cost function instead of the ℓ2 cost function employed by the
traditional IHT algorithm. We also modify the algorithm to incorporate prior
signal information in the recovery process. Specifically, we study the case of
CS with partially known support. The proposed algorithm is a fast method with
computational load comparable to the LS based IHT, whilst having the advantage
of robustness against heavy-tailed impulsive noise. Sufficient conditions for
stability are studied and a reconstruction error bound is derived. We also
derive sufficient conditions for stable sparse signal recovery with partially
known support. Theoretical analysis shows that including prior support
information relaxes the conditions for successful reconstruction. Simulation
results demonstrate that the Lorentzian-based IHT algorithm significantly
outperforms commonly employed sparse reconstruction techniques in impulsive
environments, while providing comparable performance in less demanding,
light-tailed environments. Numerical results also demonstrate that the
partially known support inclusion improves the performance of the proposed
algorithm, thereby requiring fewer samples to yield an approximate
reconstruction.
Comment: 28 pages, 9 figures, accepted in IEEE Transactions on Signal
Processing
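The substitution described above can be sketched in NumPy: the residual entering the IHT gradient step is reweighted by a Lorentzian factor, so an impulsive residual entry contributes almost nothing, while small residuals pass through essentially unchanged; indices in a partially known support are never pruned. This is a minimal sketch, not the paper's implementation; the step-size normalization, default γ, and function names are assumptions:

```python
import numpy as np

def lorentzian_iht(y, A, s, gamma=3.0, known_support=(), iters=200):
    """IHT sketch with a Lorentzian data fit.  The gradient of the
    Lorentzian cost is proportional to A.T @ (r / (gamma**2 + r**2));
    here it is scaled by gamma**2 (a step-size choice) so that small
    residuals behave as in plain L2-based IHT."""
    x = np.zeros(A.shape[1])
    known = np.asarray(known_support, dtype=int)
    for _ in range(iters):
        r = y - A @ x
        # ~r when |r| << gamma, ~gamma**2 / r when |r| >> gamma:
        # gross outliers in the residual are effectively clipped.
        w = r * gamma**2 / (gamma**2 + r**2)
        z = x + A.T @ w
        mag = np.abs(z)
        if known.size:
            mag[known] = np.inf        # never prune the known support
        keep = np.argsort(mag)[-s:]    # s largest (plus known) entries
        x = np.zeros_like(z)
        x[keep] = z[keep]
    return x
```

Passing a nonempty known_support reserves those indices in every thresholding step, which mirrors how prior support information relaxes the recovery conditions in the analysis.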
Structured random measurements in signal processing
Compressed sensing and its extensions have recently triggered interest in
randomized signal acquisition. A key finding is that random measurements
provide sparse signal reconstruction guarantees for efficient and stable
algorithms with a minimal number of samples. While this was first shown for
(unstructured) Gaussian random measurement matrices, applications require
certain structure of the measurements leading to structured random measurement
matrices. Near optimal recovery guarantees for such structured measurements
have been developed over the past years in a variety of contexts. This article
surveys the theory in three scenarios: compressed sensing (sparse recovery),
low rank matrix recovery, and phaseless estimation. The random measurement
matrices to be considered include random partial Fourier matrices, partial
random circulant matrices (subsampled convolutions), matrix completion, and
phase estimation from magnitudes of Fourier type measurements. The article
concludes with a brief discussion of the mathematical techniques for the
analysis of such structured random measurements.
Comment: 22 pages, 2 figures
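As a concrete instance of one ensemble from this survey, a partial random circulant measurement (a subsampled convolution with a random sign generator) can be applied in O(n log n) time via the FFT, without ever forming the m × n matrix. The generator distribution, sizes, and names below are illustrative assumptions, not prescriptions from the survey:

```python
import numpy as np

def subsampled_convolution(x, c, sample_idx):
    """One partial random circulant measurement: circular convolution of x
    with the generator c, computed via the FFT convolution theorem and
    observed only at the rows listed in sample_idx."""
    full = np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)).real
    return full[sample_idx]

rng = np.random.default_rng(1)
n, m = 128, 32
c = rng.choice([-1.0, 1.0], size=n)          # random sign generator
idx = rng.choice(n, size=m, replace=False)   # random row subsampling
x = np.zeros(n)
x[[7, 50]] = [1.0, -2.0]                     # a 2-sparse test signal
y = subsampled_convolution(x, c, idx)        # m structured measurements
```

The same FFT trick gives fast adjoint applications, which is what makes recovery algorithms practical at scales where an unstructured Gaussian matrix would not fit in memory.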