
    An overview of robust compressive sensing of sparse signals in impulsive noise

    While compressive sensing (CS) has traditionally relied on the L2 norm as the error metric, a broad spectrum of applications has emerged where robust estimators are required, among them applications where the sampling process is performed in the presence of impulsive noise, or where sampling of high-dimensional sparse signals requires the preservation of a distance other than L2. This article overviews robust sampling and nonlinear reconstruction strategies for sparse signals based on the Cauchy distribution and the Lorentzian norm for data fidelity. The derived methods outperform existing compressed sensing techniques in impulsive environments, thus offering a robust framework for CS.
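
    The Lorentzian norm referenced here is commonly defined in the Cauchy/Lorentzian CS literature as the sum of log(1 + r_i^2/gamma^2) over the residual entries, with gamma a scale parameter. As a minimal illustration of why this data-fidelity term is robust, the sketch below contrasts it with the L2 loss on a residual containing one impulsive outlier; the toy values and the choice of gamma are illustrative assumptions, not taken from the article.

```python
# Hedged sketch: comparing an L2 data-fidelity term with a Lorentzian (LL2)
# one on a residual that contains a single impulsive outlier.
# The gamma scale parameter and toy values are illustrative choices,
# not taken from the article.
import numpy as np

def l2_loss(r):
    """Squared-error data fidelity."""
    return np.sum(r ** 2)

def lorentzian_loss(r, gamma=1.0):
    """Lorentzian (LL2) data fidelity: sum_i log(1 + r_i^2 / gamma^2)."""
    return np.sum(np.log1p((r / gamma) ** 2))

rng = np.random.default_rng(0)
residual = 0.1 * rng.standard_normal(100)   # light-tailed residual
residual[7] += 50.0                         # one impulsive outlier

print("L2 loss:        ", l2_loss(residual))          # dominated by the outlier
print("Lorentzian loss:", lorentzian_loss(residual))  # grows only logarithmically
```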

    Lorentzian Iterative Hard Thresholding: Robust Compressed Sensing with Prior Information

    Commonly employed reconstruction algorithms in compressed sensing (CS) use the L2 norm as the metric for the residual error. However, it is well known that least squares (LS) based estimators are highly sensitive to outliers in the measurement vector, leading to poor performance when the noise no longer follows the Gaussian assumption but is instead better characterized by heavier-than-Gaussian tailed distributions. In this paper, we propose a robust iterative hard thresholding (IHT) algorithm for reconstructing sparse signals in the presence of impulsive noise. To address this problem, we use a Lorentzian cost function instead of the L2 cost function employed by the traditional IHT algorithm. We also modify the algorithm to incorporate prior signal information in the recovery process. Specifically, we study the case of CS with partially known support. The proposed algorithm is a fast method with computational load comparable to the LS-based IHT, while having the advantage of robustness against heavy-tailed impulsive noise. Sufficient conditions for stability are studied and a reconstruction error bound is derived. We also derive sufficient conditions for stable sparse signal recovery with partially known support. Theoretical analysis shows that including prior support information relaxes the conditions for successful reconstruction. Simulation results demonstrate that the Lorentzian-based IHT algorithm significantly outperforms commonly employed sparse reconstruction techniques in impulsive environments, while providing comparable performance in less demanding, light-tailed environments. Numerical results also demonstrate that the inclusion of partially known support improves the performance of the proposed algorithm, thereby requiring fewer samples to yield an approximate reconstruction.
    Comment: 28 pages, 9 figures, accepted in IEEE Transactions on Signal Processing
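
    As a rough orientation, the following is a minimal sketch of how a Lorentzian-weighted IHT iteration might look: the usual hard-thresholding step is kept, but the L2 gradient is replaced by the gradient of the Lorentzian cost, which downweights large residuals. The step size mu, the scale gamma, the iteration count, and the way the known support is enforced are illustrative assumptions, not the exact algorithm or constants from the paper.

```python
# Hedged sketch of a Lorentzian-weighted iterative hard thresholding (IHT)
# step, optionally keeping a partially known support. The step size mu,
# the scale gamma, and the support handling are illustrative assumptions,
# not the exact algorithm from the paper.
import numpy as np

def hard_threshold(x, s, known_support=None):
    """Keep the s largest-magnitude entries, always including known_support."""
    out = np.zeros_like(x)
    keep = set() if known_support is None else set(known_support)
    for i in np.argsort(-np.abs(x)):
        if len(keep) >= s:
            break
        keep.add(int(i))
    idx = list(keep)
    out[idx] = x[idx]
    return out

def lorentzian_iht(Phi, y, s, gamma=1.0, mu=1.0, n_iter=200, known_support=None):
    """Recover an s-sparse x from y ~= Phi @ x under impulsive noise."""
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        r = y - Phi @ x
        # Gradient of sum(log(1 + r_i^2/gamma^2)) downweights large residuals.
        weighted_r = r / (gamma ** 2 + r ** 2)
        x = hard_threshold(x + mu * Phi.T @ weighted_r, s, known_support)
    return x
```

    In practice, gamma is typically tied to the spread of the measurement noise, and mu must be chosen small enough relative to the norm of Phi for the iteration to remain stable.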

    Multiband Spectrum Access: Great Promises for Future Cognitive Radio Networks

    Cognitive radio has been widely considered as one of the prominent solutions to tackle spectrum scarcity. While the majority of existing research has focused on single-band cognitive radio, multiband cognitive radio holds great promise for implementing efficient cognitive networks compared to single-band networks. Multiband cognitive radio networks (MB-CRNs) are expected to significantly enhance the network's throughput and provide better channel maintenance by reducing handoff frequency. Nevertheless, the wideband front-end and the multiband spectrum access impose a number of challenges yet to be overcome. This paper provides an in-depth analysis of recent advancements in multiband spectrum sensing techniques, their limitations, and possible future directions to improve them. We study cooperative communications for MB-CRNs to tackle a fundamental limit on diversity and sampling. We also investigate several limits and tradeoffs of various design parameters for MB-CRNs. In addition, we explore the key MB-CRN performance metrics that differ from the conventional metrics used for single-band networks.
    Comment: 22 pages, 13 figures; published in the Proceedings of the IEEE, Special Issue on Future Radio Spectrum Access, March 201
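
    Since this abstract describes a survey, no single algorithm is implied; for orientation only, the sketch below shows the simplest multiband sensing baseline, a per-subband energy detector that splits wideband samples into subbands and thresholds the energy in each. The subband layout and the threshold are illustrative assumptions, not a technique proposed in the paper.

```python
# Hedged illustration of the simplest multiband sensing baseline:
# a per-subband energy detector. The subband layout and threshold are
# illustrative choices, not a technique proposed in the paper.
import numpy as np

def multiband_energy_detector(samples, n_subbands, threshold):
    """Return a boolean occupancy decision and the mean energy per subband."""
    spectrum = np.fft.rfft(samples)                       # non-negative frequencies
    power = np.abs(spectrum) ** 2
    bands = np.array_split(power, n_subbands)             # contiguous subbands
    energies = np.array([band.mean() for band in bands])
    return energies > threshold, energies
```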

    Convexity in source separation: Models, geometry, and algorithms

    Source separation or demixing is the process of extracting multiple components entangled within a signal. Contemporary signal processing presents a host of difficult source separation problems, from interference cancellation to background subtraction, blind deconvolution, and even dictionary learning. Despite the recent progress in each of these applications, advances in high-throughput sensor technology place demixing algorithms under pressure to accommodate extremely high-dimensional signals, separate an ever larger number of sources, and cope with more sophisticated signal and mixing models. These difficulties are exacerbated by the need for real-time action in automated decision-making systems. Recent advances in convex optimization provide a simple framework for efficiently solving numerous difficult demixing problems. This article provides an overview of the emerging field, explains the theory that governs the underlying procedures, and surveys algorithms that solve these problems efficiently. We aim to equip practitioners with a toolkit for constructing their own demixing algorithms that work, as well as concrete intuition for why they work.
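
    One canonical convex demixing instance in this line of work is separating a sparse component from a low-rank component. The sketch below is a minimal proximal-gradient treatment of that instance, assuming the usual l1 and nuclear-norm regularizers; the weights, step size, and iteration count are illustrative, and the article itself covers a much broader family of models and solvers.

```python
# Hedged sketch of convex demixing: split an observed matrix Y into a
# sparse component S and a low-rank component L by proximal gradient on
#   0.5*||Y - S - L||_F^2 + lam_s*||S||_1 + lam_l*||L||_*
# The weights, step size, and iteration count are illustrative assumptions.
import numpy as np

def soft_threshold(X, t):
    """Proximal operator of t*||.||_1 (entrywise shrinkage)."""
    return np.sign(X) * np.maximum(np.abs(X) - t, 0.0)

def singular_value_threshold(X, t):
    """Proximal operator of t*||.||_* (shrink the singular values)."""
    U, sv, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(sv - t, 0.0)) @ Vt

def demix_sparse_low_rank(Y, lam_s=0.1, lam_l=1.0, n_iter=300):
    S = np.zeros_like(Y)
    L = np.zeros_like(Y)
    step = 0.5  # 1 / Lipschitz constant of the joint smooth gradient
    for _ in range(n_iter):
        grad = S + L - Y            # shared gradient of the quadratic term
        S = soft_threshold(S - step * grad, step * lam_s)
        L = singular_value_threshold(L - step * grad, step * lam_l)
    return S, L
```

    Each component is handled by its own proximal operator (soft-thresholding for the sparse part, singular-value thresholding for the low-rank part), which is the structural idea that lets these convex demixing programs scale.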

    Robust Subspace Tracking Algorithms in Signal Processing: A Brief Survey

    Principal component analysis (PCA) and subspace estimation (SE) are popular data analysis tools used in a wide range of applications. The main interest in PCA/SE is for dimensionality reduction and low-rank approximation purposes. The emergence of big data streams has led to several essential issues for performing PCA/SE. Among them are the facts that (i) the size of such data streams increases over time, (ii) the underlying models may be time-dependent, and (iii) the data may be uncertain and incomplete. A robust variant of PCA/SE for such data streams, namely robust online PCA or robust subspace tracking (RST), has been introduced as a good alternative. The main goal of this paper is to provide a brief survey of recent RST algorithms in signal processing. In particular, we begin the survey by introducing the basic ideas of the RST problem. Then, different aspects of RST are reviewed with respect to different kinds of non-Gaussian noise and sparse constraints. Our own contributions on this topic are also highlighted.
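
    To make the streaming setting concrete, below is a minimal sketch of one robust subspace tracking update: project the incoming sample onto the current basis, downweight it if its residual is large, take a gradient-style step, and re-orthonormalize. The Huber-style weighting, step size, and QR re-orthonormalization are illustrative assumptions rather than a specific algorithm from the survey.

```python
# Hedged sketch of a robust subspace tracking update for streaming data.
# The Huber-style weighting, step size, and re-orthonormalization are
# illustrative choices rather than a specific algorithm from the survey.
import numpy as np

def robust_subspace_update(U, x, step=0.1, c=1.0):
    """One streaming update of an orthonormal basis U given a new sample x."""
    coeffs = U.T @ x                  # projection coefficients
    residual = x - U @ coeffs         # part of x not explained by the subspace
    res_norm = np.linalg.norm(residual)
    weight = 1.0 if res_norm <= c else c / res_norm     # downweight outliers
    U = U + step * weight * np.outer(residual, coeffs)  # gradient-style step
    U, _ = np.linalg.qr(U)            # restore orthonormal columns
    return U
```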