Lorentzian Iterative Hard Thresholding: Robust Compressed Sensing with Prior Information
Commonly employed reconstruction algorithms in compressed sensing (CS) use
the L2 norm as the metric for the residual error. However, it is well known
that least squares (LS) based estimators are highly sensitive to outliers
present in the measurement vector, leading to poor performance when the noise
no longer follows the Gaussian assumption but, instead, is better characterized
by heavier-than-Gaussian tailed distributions. In this paper, we propose a
robust iterative hard thresholding (IHT) algorithm for reconstructing sparse
signals in the presence of impulsive noise. To address this problem, we use a
Lorentzian cost function instead of the L2 cost function employed by the
traditional IHT algorithm. We also modify the algorithm to incorporate prior
signal information in the recovery process. Specifically, we study the case of
CS with partially known support. The proposed algorithm is a fast method with
computational load comparable to the LS based IHT, whilst having the advantage
of robustness against heavy-tailed impulsive noise. Sufficient conditions for
stability are studied and a reconstruction error bound is derived. We also
derive sufficient conditions for stable sparse signal recovery with partially
known support. Theoretical analysis shows that including prior support
information relaxes the conditions for successful reconstruction. Simulation
results demonstrate that the Lorentzian-based IHT algorithm significantly
outperforms commonly employed sparse reconstruction techniques in impulsive
environments, while providing comparable performance in less demanding,
light-tailed environments. Numerical results also demonstrate that the
partially known support inclusion improves the performance of the proposed
algorithm, thereby requiring fewer samples to yield an approximate
reconstruction.
Comment: 28 pages, 9 figures, accepted in IEEE Transactions on Signal Processing.
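As a rough, NumPy-only sketch of the idea described in this abstract, the loop below replaces the least-squares residual in a standard IHT update with a Lorentzian-weighted residual (the gradient of a log cost, which down-weights impulsive outliers) and can keep a partially known support fixed during thresholding. The function name, step-size handling and support handling are my own assumptions, not the authors' exact algorithm.

```python
import numpy as np

def lorentzian_iht(y, A, s, gamma=1.0, mu=1.0, iters=100, known_support=None):
    """Illustrative IHT iteration with a Lorentzian (log) residual cost.

    y: measurement vector, A: sensing matrix (m x n), s: target sparsity,
    gamma: Lorentzian scale, mu: step size,
    known_support: optional indices assumed to lie in the signal support.
    """
    n = A.shape[1]
    x = np.zeros(n)
    for _ in range(iters):
        r = y - A @ x
        # Gradient direction of the Lorentzian cost sum(log(1 + r_i^2 / gamma^2)):
        # large (impulsive) residuals are down-weighted instead of being squared.
        g = A.T @ (r / (gamma**2 + r**2))
        x = x + mu * g
        # Hard thresholding: keep the s largest entries, always retaining
        # coefficients on the partially known support (if provided).
        idx = np.argsort(np.abs(x))[::-1][:s]
        if known_support is not None:
            idx = np.union1d(idx, np.asarray(known_support))
        mask = np.zeros(n, dtype=bool)
        mask[idx] = True
        x[~mask] = 0.0
    return x
```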
PURIFY: a new algorithmic framework for next-generation radio-interferometric imaging
In recent works, compressed sensing (CS) and convex optimization techniques have been applied to radio-interferometric imaging showing the potential to outperform state-of-the-art imaging algorithms in the field. We review our latest contributions [1, 2, 3], which leverage the versatility of convex optimization to both handle realistic continuous visibilities and offer a highly parallelizable structure paving the way to significant acceleration of the reconstruction and high-dimensional data scalability. The new algorithmic structure promoted in a new software PURIFY (beta version) relies on the simultaneous-direction method of multipliers (SDMM). The performance of various sparsity priors is evaluated through simulations in the continuous visibility setting, confirming the superiority of our recent average sparsity approach SARA.
On sparsity averaging
Recent developments in Carrillo et al. (2012) and Carrillo et al. (2013)
introduced a novel regularization method for compressive imaging in the context
of compressed sensing with coherent redundant dictionaries. The approach relies
on the observation that natural images exhibit strong average sparsity over
multiple coherent frames. The associated reconstruction algorithm, based on an
analysis prior and a reweighted scheme, is dubbed Sparsity Averaging
Reweighted Analysis (SARA). We review these advances and extend associated
simulations establishing the superiority of SARA to regularization methods
based on sparsity in a single frame, for a generic spread spectrum acquisition
and for a Fourier acquisition of particular interest in radio astronomy.
Comment: 4 pages, 3 figures, Proceedings of the 10th International Conference on
Sampling Theory and Applications (SampTA), Code available at
https://github.com/basp-group/sopt, Full journal letter available at
http://arxiv.org/abs/arXiv:1208.233
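For concreteness, the average-sparsity, reweighted analysis problem implied by this abstract can be written schematically as below. The notation (Psi for the concatenation of q coherent frames, Phi for the measurement operator, W for the diagonal reweighting matrix) is illustrative and not necessarily that of the cited papers.

```latex
% Schematic SARA-style reweighted analysis problem; notation is illustrative.
\min_{x \geq 0} \; \| W \Psi^{\dagger} x \|_{1}
\quad \text{s.t.} \quad \| y - \Phi x \|_{2} \leq \epsilon ,
\qquad
\Psi = \frac{1}{\sqrt{q}} \, [\Psi_{1}, \Psi_{2}, \dots, \Psi_{q}],
% with the weights W re-estimated from the previous solution at each reweighting
% step, so that the reweighted l1 penalty mimics an l0-like average sparsity prior.
```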
PURIFY: a new approach to radio-interferometric imaging
In a recent article series, the authors have promoted convex optimization algorithms for radio-interferometric imaging in the framework of compressed sensing, which leverages sparsity regularization priors for the associated inverse problem and defines a minimization problem for image reconstruction. This approach was shown, in theory and through simulations in a simple discrete visibility setting, to have the potential to significantly outperform CLEAN and its evolutions. In this work, we leverage the versatility of convex optimization in solving minimization problems to both handle realistic continuous visibilities and offer a highly parallelizable structure paving the way to significant acceleration of the reconstruction and high-dimensional data scalability. The new algorithmic structure promoted relies on the simultaneous-direction method of multipliers (SDMM), and contrasts with the current major-minor cycle structure of CLEAN and its evolutions, which in particular cannot handle the state-of-the-art minimization problems under consideration, where neither the regularization term nor the data term is a differentiable function. We release a beta version of an SDMM-based imaging software written in C and dubbed PURIFY (http://basp-group.github.io/purify/) that handles various sparsity priors, including our recent average sparsity approach SARA. We evaluate the performance of different priors through simulations in the continuous visibility setting, confirming the superiority of SARA.
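Because neither the sparsity prior nor the data-fidelity constraint is differentiable, a proximal splitting method such as SDMM is required. The sketch below is a generic, textbook-style SDMM loop for minimising a sum of convex terms f_i(L_i x); it is not the PURIFY implementation, and the dense normal-equations solve is a simplifying assumption (in practice one would use a matrix-free conjugate-gradient solve).

```python
import numpy as np

def sdmm(prox_fns, ops, x0, iters=50):
    """Generic SDMM sketch for min_x sum_i f_i(L_i x) (illustrative only).

    prox_fns: list of proximal operators, prox_fns[i](v) ~ prox_{f_i}(v),
              where each f_i may be non-smooth (e.g. an l1 norm or the
              indicator of an l2 ball around the data).
    ops:      list of linear operators L_i, given here as dense arrays.
    """
    x = x0.copy()
    z = [L @ x for L in ops]              # auxiliary splits z_i ~ L_i x
    s = [np.zeros_like(zi) for zi in z]   # scaled dual variables
    Q = sum(L.T @ L for L in ops)         # normal-equations matrix (assumed invertible)
    for _ in range(iters):
        rhs = sum(L.T @ (zi - si) for L, zi, si in zip(ops, z, s))
        x = np.linalg.solve(Q, rhs)       # simultaneous-direction step
        for i, (L, prox) in enumerate(zip(ops, prox_fns)):
            v = L @ x + s[i]
            z[i] = prox(v)                # proximal step handles non-smooth f_i
            s[i] = v - z[i]               # dual (multiplier) update
    return x
```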
A randomised primal-dual algorithm for distributed radio-interferometric imaging
Next generation radio telescopes, like the Square Kilometre Array, will
acquire an unprecedented amount of data for radio astronomy. The development of
fast, parallelisable or distributed algorithms for handling such large-scale
data sets is of prime importance. Motivated by this, we investigate herein a
convex optimisation algorithmic structure, based on primal-dual
forward-backward iterations, for solving the radio interferometric imaging
problem. It can encompass any convex prior of interest. It allows for the
distributed processing of the measured data and introduces further flexibility
by employing a probabilistic approach for the selection of the data blocks used
at a given iteration. We study the reconstruction performance with respect to
the data distribution and we propose the use of nonuniform probabilities for
the randomised updates. Our simulations show the feasibility of the
randomisation given a limited computing infrastructure as well as important
computational advantages when compared to state-of-the-art algorithmic
structures.
Comment: 5 pages, 3 figures, Proceedings of the European Signal Processing
Conference (EUSIPCO) 2016, Related journal publication available at
https://arxiv.org/abs/1601.0402
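A minimal sketch of the structure described in this abstract is given below: a primal-dual forward-backward loop in which, at every iteration, each data block is updated only with some (possibly nonuniform) probability, while unselected blocks keep their previous dual variables. The interface (blocks as (Phi_i, prox_conj_i) pairs, the prox of the prior, the step sizes) is my own simplification, not the exact algorithm of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def randomised_primal_dual(x0, blocks, prox_primal, tau, sigma, probs, iters=200):
    """Illustrative primal-dual forward-backward loop with randomised block updates.

    blocks:      list of (Phi_i, prox_conj_i) pairs, one per data block, where
                 Phi_i is the block measurement matrix and prox_conj_i is the
                 proximal operator of the conjugate of the block data term.
    prox_primal: proximal operator of the convex prior (e.g. an l1 prox).
    probs:       per-block selection probabilities, which may be nonuniform.
    """
    x = x0.copy()
    x_bar = x0.copy()
    v = [np.zeros(Phi.shape[0]) for Phi, _ in blocks]   # dual variables per block
    for _ in range(iters):
        # Randomly decide which data blocks are processed at this iteration.
        active = [i for i, p in enumerate(probs) if rng.random() < p]
        for i in active:
            Phi_i, prox_conj_i = blocks[i]
            v[i] = prox_conj_i(v[i] + sigma * (Phi_i @ x_bar))
        grad = sum(Phi.T @ vi for (Phi, _), vi in zip(blocks, v))
        x_new = prox_primal(x - tau * grad)
        x_bar = 2 * x_new - x        # extrapolation step
        x = x_new
    return x
```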
Exploiting Prior Knowledge in Compressed Sensing Wireless ECG Systems
Recent results in telecardiology show that compressed sensing (CS) is a
promising tool to lower energy consumption in wireless body area networks for
electrocardiogram (ECG) monitoring. However, the performance of current
CS-based algorithms, in terms of compression rate and reconstruction quality of
the ECG, still falls short of the performance attained by state-of-the-art
wavelet based algorithms. In this paper, we propose to exploit the structure of
the wavelet representation of the ECG signal to boost the performance of
CS-based methods for compression and reconstruction of ECG signals. More
precisely, we incorporate prior information about the wavelet dependencies
across scales into the reconstruction algorithms and exploit the high fraction
of common support of the wavelet coefficients of consecutive ECG segments.
Experimental results utilizing the MIT-BIH Arrhythmia Database show that
significant performance gains, in terms of compression rate and reconstruction
quality, can be obtained by the proposed algorithms compared to current
CS-based methods.
Comment: Accepted for publication in IEEE Journal of Biomedical and Health Informatics.
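The abstract does not spell out the solver, but the benefit of a largely common support between consecutive ECG segments can be illustrated with a simple weighted soft-thresholding (ISTA-style) loop, in which coefficients on the support estimated from the previous segment are penalised less. This is an illustrative stand-in under my own assumptions, not the authors' reconstruction algorithm.

```python
import numpy as np

def weighted_ista(y, A, weights, lam=0.1, iters=200):
    """ISTA with per-coefficient weights (illustrative sketch).

    Coefficients whose indices lie in the support estimated from the previous
    ECG segment should receive small weights, so they are penalised less.
    """
    mu = 1.0 / np.linalg.norm(A, 2) ** 2        # step size from the spectral norm
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = x + mu * (A.T @ (y - A @ x))        # gradient step on the data term
        thr = lam * mu * weights                # weaker threshold on the prior support
        x = np.sign(x) * np.maximum(np.abs(x) - thr, 0.0)
    return x

# Example of building weights from the previous segment's estimated support:
# prev_support = np.abs(x_prev) > tol
# weights = np.where(prev_support, 0.1, 1.0)    # values are illustrative
```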
The varying w spread spectrum effect for radio interferometric imaging
We study the impact of the spread spectrum effect in radio interferometry on
the quality of image reconstruction. This spread spectrum effect will be
induced by the wide field-of-view of forthcoming radio interferometric
telescopes. The resulting chirp modulation improves the quality of
reconstructed interferometric images by increasing the incoherence of the
measurement and sparsity dictionaries. We extend previous studies of this
effect to consider the more realistic setting where the chirp modulation varies
for each visibility measurement made by the telescope. In these first
preliminary results, we show that for this setting the quality of
reconstruction improves significantly over the case without chirp modulation
and achieves almost the reconstruction quality of the case of maximal, constant
chirp modulation.
Comment: 1 page, 1 figure, Proceedings of the Biomedical and Astronomical
Signal Processing Frontiers (BASP) workshop 201
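Schematically, the spread spectrum effect arises because the wide field of view introduces a w-dependent chirp that multiplies the sky image before the Fourier sampling; in the small field-of-view approximation this is the quadratic-phase factor below, with w differing from one visibility to the next in the varying-w setting studied here. The notation is illustrative rather than taken from the paper.

```latex
% w-dependent chirp modulation (small field-of-view approximation); notation illustrative.
C_{w}(l, m) \,=\, e^{\, i 2\pi w \left( \sqrt{1 - l^{2} - m^{2}} \, - \, 1 \right)}
\;\approx\; e^{- i \pi w \left( l^{2} + m^{2} \right)} ,
% so each visibility samples the Fourier transform of x(l,m)\,C_{w}(l,m),
% and the chirp spreads the signal spectrum, increasing measurement incoherence.
```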
Robust sparse image reconstruction of radio interferometric observations with purify
Next-generation radio interferometers, such as the Square Kilometre Array
(SKA), will revolutionise our understanding of the universe through their
unprecedented sensitivity and resolution. However, to realise these goals
significant challenges in image and data processing need to be overcome. The
standard methods in radio interferometry for reconstructing images, such as
CLEAN, have served the community well over the last few decades and have
survived largely because they are pragmatic. However, they produce
reconstructed interferometric images that are limited in quality and
scalability for big data. In this work we apply and evaluate alternative
interferometric reconstruction methods that make use of state-of-the-art sparse
image reconstruction algorithms motivated by compressive sensing, which have
been implemented in the PURIFY software package. In particular, we implement
and apply the proximal alternating direction method of multipliers (P-ADMM)
algorithm presented in a recent article. First, we assess the impact of the
interpolation kernel used to perform gridding and degridding on sparse image
reconstruction. We find that the Kaiser-Bessel interpolation kernel performs as
well as prolate spheroidal wave functions, while providing a computational
saving and an analytic form. Second, we apply PURIFY to real interferometric
observations from the Very Large Array (VLA) and the Australia Telescope
Compact Array (ATCA) and find images recovered by PURIFY are higher quality
than those recovered by CLEAN. Third, we discuss how PURIFY reconstructions
exhibit additional advantages over those recovered by CLEAN. The latest version
of PURIFY, with developments presented in this work, is made publicly
available.
Comment: 22 pages, 10 figures, PURIFY code available at http://basp-group.github.io/purify
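As a small illustration of the interpolation kernel discussed in this abstract, the snippet below evaluates a Kaiser-Bessel gridding kernel on its support. The support width and the rule of thumb beta ≈ 2.34 × support (often quoted for two-times oversampling) are assumptions for illustration, not parameters taken from the paper.

```python
import numpy as np
from scipy.special import i0   # modified Bessel function of the first kind, order 0

def kaiser_bessel(u, support=6, beta=2.34 * 6):
    """Kaiser-Bessel interpolation kernel sketch, evaluated at offsets u (in pixels).

    The kernel is nonzero only for |u| <= support / 2 and is normalised so that
    it peaks at 1 at u = 0; beta controls the trade-off between main-lobe width
    and aliasing suppression.
    """
    u = np.asarray(u, dtype=float)
    arg = 1.0 - (2.0 * u / support) ** 2
    vals = i0(beta * np.sqrt(np.clip(arg, 0.0, None)))
    return np.where(arg > 0.0, vals, 0.0) / i0(beta)
```

One practical appeal, echoed in the abstract, is that the Kaiser-Bessel kernel has a known analytic Fourier transform, so the gridding correction can be computed in closed form.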
Evaluation of the end-of-life of electric vehicle batteries according to the state-of-health
As a result of monitoring thousands of electric vehicle charges around Europe, this study builds statistical distributions that model the amount of energy necessary for trips between charges, showing that most trips are within the range of an electric vehicle even when battery degradation reaches the end-of-life, commonly accepted to be 80% state of health. Based on these results, the study analyses how far this end-of-life can be pushed forward using statistical methods, indicating the probability of failing to fulfil the electric vehicle (EV) owners’ daily trip needs.
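The comparison described above can be sketched as follows: given an empirical or assumed distribution of the energy needed between charges, estimate the probability that a trip exceeds the usable capacity at a given state of health. All numbers and the log-normal trip-energy model below are placeholders for illustration, not data or results from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

nominal_kwh = 40.0   # hypothetical nominal pack capacity (illustrative)
# Assumed trip-energy model between charges (kWh); NOT the study's fitted distribution.
trip_kwh = rng.lognormal(mean=1.5, sigma=0.8, size=100_000)

def p_trip_not_covered(soh):
    """Empirical probability that a trip between charges needs more energy
    than the pack can deliver at the given state of health (SoH)."""
    usable_kwh = nominal_kwh * soh
    return float(np.mean(trip_kwh > usable_kwh))

for soh in (1.0, 0.8, 0.7, 0.6):
    print(f"SoH {soh:.0%}: P(trip energy exceeds usable capacity) = {p_trip_not_covered(soh):.4f}")
```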