PURIFY: a new algorithmic framework for next-generation radio-interferometric imaging
In recent works, compressed sensing (CS) and convex optimization techniques have been applied to radio-interferometric imaging, showing the potential to outperform state-of-the-art imaging algorithms in the field. We review our latest contributions [1, 2, 3], which leverage the versatility of convex optimization to both handle realistic continuous visibilities and offer a highly parallelizable structure, paving the way to significant acceleration of reconstruction and to scalability to high-dimensional data. The new algorithmic structure, promoted in a new software PURIFY (beta version), relies on the simultaneous-direction method of multipliers (SDMM). The performance of various sparsity priors is evaluated through simulations in the continuous visibility setting, confirming the superiority of our recent average sparsity approach SARA.
The varying w spread spectrum effect for radio interferometric imaging
We study the impact of the spread spectrum effect in radio interferometry on
the quality of image reconstruction. This spread spectrum effect will be
induced by the wide field-of-view of forthcoming radio interferometric
telescopes. The resulting chirp modulation improves the quality of
reconstructed interferometric images by increasing the incoherence of the
measurement and sparsity dictionaries. We extend previous studies of this
effect to consider the more realistic setting where the chirp modulation varies
for each visibility measurement made by the telescope. In these first
preliminary results, we show that for this setting the quality of
reconstruction improves significantly over the case without chirp modulation
and almost matches that of the maximal, constant chirp modulation case.
Comment: 1 page, 1 figure, Proceedings of the Biomedical and Astronomical Signal Processing Frontiers (BASP) workshop 201
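The incoherence mechanism described above can be illustrated numerically. The sketch below is my own toy construction (hypothetical chirp scaling, not the authors' operator): it measures the mutual coherence between Fourier measurement vectors and Fourier-sparse atoms, with and without a quadratic-phase chirp modulation, showing that the chirp spreads each atom's spectrum and drives the coherence down.

```python
import numpy as np

# Toy illustration of the spread spectrum effect: modulating by a chirp
# c(x) = exp(i * pi * w * N * x^2) before Fourier sampling spreads each
# sparsity atom's energy across the spectrum, lowering the mutual
# coherence between measurement and sparsity dictionaries.

N = 256
x = np.arange(N) / N

def mutual_coherence(w):
    """Peak correlation between Fourier measurement vectors and
    chirp-modulated Fourier sparsity atoms; it reduces to the peak of
    the chirp's normalised spectrum."""
    chirp = np.exp(1j * np.pi * w * N * x**2)
    return np.abs(np.fft.fft(chirp)).max() / N

mu_none = mutual_coherence(0.0)  # no modulation: maximally coherent, mu = 1
mu_full = mutual_coherence(1.0)  # full chirp: flat spectrum, mu = 1/sqrt(N)
print(mu_none, mu_full)
```

With no chirp the two dictionaries coincide (coherence 1); the full chirp has a perfectly flat spectrum here, reaching the optimal coherence 1/sqrt(N).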
On sparsity averaging
Recent developments in Carrillo et al. (2012) and Carrillo et al. (2013)
introduced a novel regularization method for compressive imaging in the context
of compressed sensing with coherent redundant dictionaries. The approach relies
on the observation that natural images exhibit strong average sparsity over
multiple coherent frames. The associated reconstruction algorithm, based on an
analysis prior and a reweighted scheme, is dubbed Sparsity Averaging
Reweighted Analysis (SARA). We review these advances and extend associated
simulations establishing the superiority of SARA to regularization methods
based on sparsity in a single frame, for a generic spread spectrum acquisition
and for a Fourier acquisition of particular interest in radio astronomy.
Comment: 4 pages, 3 figures, Proceedings of the 10th International Conference on Sampling Theory and Applications (SampTA). Code available at https://github.com/basp-group/sopt; full journal letter available at http://arxiv.org/abs/arXiv:1208.233
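The average sparsity intuition can be shown on a toy signal. The sketch below is my own illustration with a two-frame identity-plus-Haar concatenation, not the Dirac-plus-Db1-Db8 dictionary SARA actually uses: a piecewise-constant signal that is dense in one frame is much sparser in another, and the concatenated analysis prior exploits this average behaviour via reweighting.

```python
import numpy as np

# Toy sketch of sparsity averaging: the analysis operator stacks the
# coefficients of q frames, normalised by sqrt(q) so the concatenation
# remains a tight frame (identity + Haar stand in for SARA's frames).

def haar(s):
    """One-level orthonormal Haar analysis (assumes even length)."""
    a = (s[0::2] + s[1::2]) / np.sqrt(2.0)  # approximation coefficients
    d = (s[0::2] - s[1::2]) / np.sqrt(2.0)  # detail coefficients
    return np.concatenate([a, d])

def analysis(s, frames):
    """Psi^T s for Psi = [Psi_1 ... Psi_q] / sqrt(q)."""
    return np.concatenate([f(s) for f in frames]) / np.sqrt(len(frames))

x = np.zeros(64)
x[:32] = 1.0                          # piecewise-constant toy signal

n_id = np.count_nonzero(x)            # dense in the identity frame
n_haar = np.count_nonzero(haar(x))    # sparser in the Haar frame
alpha = analysis(x, [lambda s: s, haar])

# Reweighting rule of reweighted-l1 schemes such as SARA: coefficients
# that survive one solve are penalised less in the next.
eps = 1e-3
w = 1.0 / (np.abs(alpha) + eps)
```

The weighted l1 norm `np.sum(w * np.abs(alpha))` then serves as the regulariser in the next solve, progressively approximating an l0-like average sparsity penalty.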
PURIFY: a new approach to radio-interferometric imaging
In a recent article series, the authors have promoted convex optimization algorithms for radio-interferometric imaging in the framework of compressed sensing, which leverages sparsity regularization priors for the associated inverse problem and defines a minimization problem for image reconstruction. This approach was shown, in theory and through simulations in a simple discrete visibility setting, to have the potential to significantly outperform CLEAN and its evolutions. In this work, we leverage the versatility of convex optimization in solving minimization problems to both handle realistic continuous visibilities and offer a highly parallelizable structure, paving the way to significant acceleration of reconstruction and to scalability to high-dimensional data. The new algorithmic structure promoted relies on the simultaneous-direction method of multipliers (SDMM), and contrasts with the current major-minor cycle structure of CLEAN and its evolutions, which in particular cannot handle the state-of-the-art minimization problems under consideration, where neither the regularization term nor the data term is a differentiable function. We release a beta version of an SDMM-based imaging software written in C and dubbed PURIFY (http://basp-group.github.io/purify/) that handles various sparsity priors, including our recent average sparsity approach SARA. We evaluate the performance of different priors through simulations in the continuous visibility setting, confirming the superiority of SARA.
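The non-differentiability that rules out gradient-based major-minor cycles is handled in SDMM through proximal operators, which exist even for non-smooth terms. Below is a minimal sketch of the two prox maps involved in an l1-regularised, l2-ball-constrained formulation; these are my own minimal implementations, not PURIFY's code.

```python
import numpy as np

# SDMM-type splitting methods replace gradients with proximal operators.
# Two standard prox maps for an l1 prior and an l2 data-fidelity ball:

def prox_l1(z, tau):
    """Soft-thresholding: prox of tau * ||.||_1 (real or complex data).
    Shrinks each entry's magnitude by tau, preserving its phase/sign."""
    mag = np.abs(z)
    scale = np.maximum(1.0 - tau / np.maximum(mag, 1e-300), 0.0)
    return scale * z

def proj_l2_ball(z, y, epsilon):
    """Projection onto {z : ||z - y||_2 <= epsilon}, the prox of the
    indicator function enforcing data fidelity to measurements y."""
    r = z - y
    nrm = np.linalg.norm(r)
    return z if nrm <= epsilon else y + epsilon * r / nrm
```

Neither map requires its underlying function to be differentiable, which is exactly why proximal splitting structures can tackle these problems where major-minor cycle algorithms cannot.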
Compressed sensing for wide-field radio interferometric imaging
For the next generation of radio interferometric telescopes it is of
paramount importance to incorporate wide field-of-view (WFOV) considerations in
interferometric imaging, otherwise the fidelity of reconstructed images will
suffer greatly. We extend compressed sensing techniques for interferometric
imaging to a WFOV and recover images in the spherical coordinate space in which
they naturally live, eliminating any distorting projection. The effectiveness
of the spread spectrum phenomenon, highlighted recently by one of the authors,
is enhanced when going to a WFOV, while sparsity is promoted by recovering
images directly on the sphere. Both of these properties act to improve the
quality of reconstructed interferometric images. We quantify the performance of
compressed sensing reconstruction techniques through simulations, highlighting
the superior reconstruction quality achieved by recovering interferometric
images directly on the sphere rather than the plane.
Comment: 15 pages, 8 figures, replaced to match version accepted by MNRA
A randomised primal-dual algorithm for distributed radio-interferometric imaging
Next generation radio telescopes, like the Square Kilometre Array, will
acquire an unprecedented amount of data for radio astronomy. The development of
fast, parallelisable or distributed algorithms for handling such large-scale
data sets is of prime importance. Motivated by this, we investigate herein a
convex optimisation algorithmic structure, based on primal-dual
forward-backward iterations, for solving the radio interferometric imaging
problem. It can encompass any convex prior of interest. It allows for the
distributed processing of the measured data and introduces further flexibility
by employing a probabilistic approach for the selection of the data blocks used
at a given iteration. We study the reconstruction performance with respect to
the data distribution and we propose the use of nonuniform probabilities for
the randomised updates. Our simulations show the feasibility of the
randomisation given a limited computing infrastructure as well as important
computational advantages when compared to state-of-the-art algorithmic
structures.
Comment: 5 pages, 3 figures, Proceedings of the European Signal Processing Conference (EUSIPCO) 2016. Related journal publication available at https://arxiv.org/abs/1601.0402
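The probabilistic block-selection pattern can be sketched on a toy problem. The code below is my own toy gradient scheme on a block-split least-squares problem, not the paper's primal-dual forward-backward iterations: each data block is activated with its own probability at every iteration, and the step for an activated block is scaled by the inverse of its probability so the update remains unbiased in expectation.

```python
import numpy as np

# Toy illustration of randomised, nonuniform block selection (NOT the
# paper's primal-dual forward-backward algorithm): only a random subset
# of data blocks is touched per iteration, mimicking a setting where a
# limited computing infrastructure cannot process all blocks at once.

rng = np.random.default_rng(0)

n, n_blocks, rows = 20, 4, 10
A_blocks = [rng.standard_normal((rows, n)) for _ in range(n_blocks)]
x_true = rng.standard_normal(n)
y_blocks = [A @ x_true for A in A_blocks]   # noiseless stand-in for visibilities

p = np.array([0.4, 0.3, 0.2, 0.1])          # nonuniform selection probabilities
x = np.zeros(n)
step = 2e-3

for _ in range(3000):
    active = rng.random(n_blocks) < p       # independent block activation
    for b in np.flatnonzero(active):
        grad = A_blocks[b].T @ (A_blocks[b] @ x - y_blocks[b])
        x -= (step / p[b]) * grad           # importance-weighted block step

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(rel_err)
```

Even the block selected only 10% of the time contributes enough, thanks to the 1/p_b scaling, for the iterates to recover the true signal, which is the feasibility point the simulations in the paper make for the full primal-dual structure.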