The application of compressive sampling to radio astronomy I: Deconvolution
Compressive sampling is a new paradigm for sampling, based on the sparseness of
signals or signal representations. It is much less restrictive than
Nyquist-Shannon sampling theory and thus explains and systematises the
widespread experience that methods such as the Högbom CLEAN can succeed even
when the Nyquist-Shannon sampling requirements are violated. In this paper, a
CS-based deconvolution
method for extended sources is introduced. This method can reconstruct both
point sources and extended sources (using the isotropic undecimated wavelet
transform as the sparsifying basis in the reconstruction step). We compare this
CS-based deconvolution method with two CLEAN-based deconvolution methods: the
H\"ogbom CLEAN and the multiscale CLEAN. This new method shows the best
performance in deconvolving extended sources for both uniform and natural
weighting of the sampled visibilities. Both visual and numerical results of the
comparison are provided.
Comment: Published by A&A; Matlab code is available at
http://code.google.com/p/csra/download
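The CS reconstruction idea described above can be illustrated with a minimal sketch: l1-regularised recovery (ISTA) of a pixel-sparse sky from masked Fourier-plane samples. All function and parameter names here are illustrative, and the plain pixel-sparsity prior stands in for the paper's isotropic undecimated wavelet basis:

```python
import numpy as np

def ista_cs_imaging(vis, mask, lam=0.01, niter=500):
    """Recover a sky image from incomplete visibilities by l1-regularised
    least squares (ISTA). A pixel-sparsity prior stands in for the
    isotropic undecimated wavelet basis used in the paper."""
    n = np.sqrt(mask.size)              # unitary-FFT normalisation factor
    x = np.zeros(mask.shape)
    for _ in range(niter):
        # residual in the visibility (Fourier) domain
        resid = mask * np.fft.fft2(x) / n - vis
        # gradient of 0.5*||M F x - vis||^2; ||M F|| <= 1, so step size 1
        grad = np.real(np.fft.ifft2(mask * resid) * n)
        # soft-thresholding enforces the sparsity prior
        x = np.sign(x - grad) * np.maximum(np.abs(x - grad) - lam, 0.0)
    return x
```

With, say, half the Fourier plane sampled, a few isolated point sources are recovered almost exactly, which is the "violation" of classical sampling limits that CS formalises.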
Multi-Scale CLEAN deconvolution of radio synthesis images
Radio synthesis imaging is dependent upon deconvolution algorithms to
counteract the sparse sampling of the Fourier plane. These deconvolution
algorithms find an estimate of the true sky brightness from the necessarily
incomplete sampled visibility data. The most widely used radio synthesis
deconvolution method is the CLEAN algorithm of Högbom. This algorithm works
extremely well for collections of point sources and surprisingly well for
extended objects. However, the performance for extended objects can be improved
by adopting a multi-scale approach. We describe and demonstrate a conceptually
simple and algorithmically straightforward extension to CLEAN that models the
sky brightness by the summation of components of emission having different size
scales. While previous multiscale algorithms work sequentially on decreasing
scale sizes, our algorithm works simultaneously on a range of specified scales.
Applications to both real and simulated data sets are given.
Comment: Submitted to IEEE Special Issue on Signal Processing
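The classic Högbom CLEAN loop that this multi-scale method extends can be sketched as follows; the parameter names and default values are illustrative, not taken from the paper:

```python
import numpy as np

def hogbom_clean(dirty, psf, gain=0.1, niter=100, threshold=1e-3):
    """Minimal Högbom CLEAN sketch: iteratively subtract scaled, shifted
    copies of the PSF at the brightest residual pixel, accumulating the
    subtracted components in a model image."""
    residual = dirty.copy()
    model = np.zeros_like(dirty)
    cy, cx = np.array(psf.shape) // 2        # PSF peak assumed at centre
    for _ in range(niter):
        y, x = np.unravel_index(np.argmax(np.abs(residual)), residual.shape)
        peak = residual[y, x]
        if np.abs(peak) < threshold:
            break
        model[y, x] += gain * peak
        # subtract the PSF, shifted so its centre lands on (y, x)
        for dy in range(psf.shape[0]):
            for dx in range(psf.shape[1]):
                ry, rx = y + dy - cy, x + dx - cx
                if 0 <= ry < residual.shape[0] and 0 <= rx < residual.shape[1]:
                    residual[ry, rx] -= gain * peak * psf[dy, dx]
    return model, residual
```

The multi-scale variant described in the abstract replaces the single delta-function component with a set of components of different size scales, searched simultaneously rather than sequentially.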
Joint compressive sampling and deconvolution in ultrasound medical imaging
Compressive sampling and image deconvolution have both been extensively explored in the ultrasound imaging literature. The first seeks to reduce the volume of acquired data and/or to accelerate the frame rate; the second aims at improving ultrasound image quality in terms of spatial resolution, contrast and signal-to-noise ratio. In this paper, we propose a novel approach combining these two frameworks, resulting in a compressive deconvolution technique that aims to obtain high-quality reconstructions from a reduced number of measurements. The resulting inverse problem is solved by minimizing an objective function comprising a data-fidelity term and two prior terms adapted to ultrasound imaging.
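A minimal sketch of the compressive-deconvolution idea: solve a penalised least-squares problem whose forward operator chains a blur matrix with a subsampling matrix. A single generic l1 prior stands in for the paper's two ultrasound-specific priors, and the matrices H and S below are hypothetical illustrations:

```python
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def compressive_deconv(y, S, H, lam=0.005, niter=1000):
    """ISTA for min_x 0.5*||y - S H x||^2 + lam*||x||_1, where H is a
    blur (convolution) matrix and S a subsampling matrix, so that a
    single solve both fills in the missing samples and deblurs."""
    A = S @ H
    L = np.linalg.norm(A, 2) ** 2     # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(niter):
        x = soft_threshold(x - A.T @ (A @ x - y) / L, lam / L)
    return x
```

Keeping S and H as separate factors mirrors the paper's framing: S models the compressive acquisition, H the ultrasound system's point spread function.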
Compressively characterizing high-dimensional entangled states with complementary, random filtering
The resources needed to conventionally characterize a quantum system are
overwhelmingly large for high-dimensional systems. This obstacle may be
overcome by abandoning traditional cornerstones of quantum measurement, such as
general quantum states, strong projective measurement, and assumption-free
characterization. Following this reasoning, we demonstrate an efficient
technique for characterizing high-dimensional, spatial entanglement with one
set of measurements. We recover sharp distributions with local, random
filtering of the same ensemble in momentum followed by position---something the
uncertainty principle forbids for projective measurements. Exploiting the
expectation that entangled signals are highly correlated, we use fewer than
5,000 measurements to characterize a 65,536-dimensional state. Finally, we use
entropic inequalities to witness entanglement without a density matrix. Our
method represents the sea change unfolding in quantum measurement where methods
influenced by the information-theory and signal-processing communities replace
unscalable, brute-force techniques---a progression previously followed by
classical sensing.
Comment: 13 pages, 7 figures