
    Advances in Calibration and Imaging Techniques in Radio Interferometry

    This paper summarizes some of the major calibration and image reconstruction techniques used in radio interferometry and describes them in a common mathematical framework. The use of this framework has a number of benefits, ranging from clarification of the fundamentals and access to standard numerical optimization techniques to the generalization or specialization of existing methods into new algorithms.
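    The "common mathematical framework" alluded to here usually starts from the interferometric measurement equation. In a simplified scalar form (our notation, ignoring direction-dependent effects and the w-term), the visibility measured by antennas i and j is

\[
  V_{ij} \;=\; g_i\, g_j^{*} \int I(l,m)\; e^{-2\pi i\,(u_{ij} l + v_{ij} m)}\; \mathrm{d}l\,\mathrm{d}m \;+\; n_{ij},
\]

    where $I(l,m)$ is the sky brightness, $g_i$ are the complex antenna gains, and $n_{ij}$ is noise. Calibration then amounts to estimating the $g_i$, and imaging to inverting the Fourier relation for $I$; posing both as optimization problems over this one equation is what allows standard numerical techniques to be brought to bear.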

    Revisiting the theory of interferometric wide-field synthesis

    After several generations of interferometers in radio astronomy, wide-field imaging at high angular resolution is today a major goal in trying to match optical wide-field performance. All current radio-interferometric, wide-field imaging methods belong to the mosaicking family. Based on a 30-year-old original idea from Ekers & Rots, we propose an alternative formalism. Starting from their ideal case, we successively evaluate the impact of the standard ingredients of interferometric imaging. A comparison with standard nonlinear mosaicking shows that the two processing schemes are not mathematically equivalent, though both recover the sky brightness. In particular, the weighting schemes of the two methods differ markedly. Moreover, the proposed scheme naturally processes the short spacings from both single-dish antennas and heterogeneous arrays. Finally, the sky gridding of the measured visibilities required by the proposed scheme may save large amounts of disk space and CPU processing power over mosaicking when handling data sets acquired in the on-the-fly observing mode. We propose to call this promising family of imaging methods wide-field synthesis, because it explicitly synthesizes visibilities at a much finer spatial-frequency resolution than the one set by the diameter of the interferometer antennas.
    Comment: 22 pages, 6 PostScript figures. Accepted for publication in Astronomy & Astrophysics. Uses aa LaTeX macros.
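    The Ekers & Rots idea behind this formalism can be sketched in one dimension (our notation, up to sign and normalization conventions). A visibility measured on baseline $u_b$ while the antennas point toward $l_p$ is

\[
  V(u_b, l_p) \;=\; \int B(l - l_p)\, I(l)\, e^{-2\pi i u_b l}\, \mathrm{d}l,
\]

    where $B$ is the primary beam. Fourier transforming with respect to the pointing $l_p$ gives

\[
  \widetilde{V}(u_b, u_p) \;=\; \widehat{B}(u_p)\; \widehat{I}(u_b - u_p),
\]

    so scanning the pointing centre synthesizes the sky transform $\widehat{I}$ at every spatial frequency within the support of $\widehat{B}$, i.e. within roughly $D/\lambda$ of each measured baseline.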

    Imaging and Nulling with the Space Interferometry Mission

    We present numerical simulations for a possible synthesis imaging mode of the Space Interferometry Mission (SIM). We summarize the general techniques that SIM offers for imaging high surface brightness sources, and discuss their strengths and weaknesses. We describe an interactive software package that is used to provide realistic, photometrically correct estimates of SIM performance for various classes of astronomical objects. In particular, we simulate the cases of gaseous disks around black holes in the nuclei of galaxies, and zodiacal dust disks around young stellar objects. For the former, we show that the Keplerian velocity gradient of the line-emitting gaseous disk -- and thus the mass of the putative black hole -- can be determined with SIM to unprecedented accuracy in about 5 hours of integration time for objects with H_alpha surface brightness comparable to the prototype M 87. Detection and observation of exo-zodiacal dust disks depend critically on the disk properties and the nulling capabilities of SIM. Systems with similar disk size and at least one tenth of the dust content of beta Pic can be detected by SIM at distances between 100 pc and a few kpc, if a nulling efficiency of 1/10000 is achieved. Possible inner cleared regions, indicative of the presence of massive planets, can also be detected and imaged. On the other hand, exo-zodiacal disks with properties more similar to those of the solar system will not be found in reasonable integration times with SIM.
    Comment: 28 pages, incl. 8 postscript figures, excl. 10 gif-figures. Submitted to Ap

    Image formation in synthetic aperture radio telescopes

    Next-generation radio telescopes will be much larger, more sensitive, have much larger observation bandwidths, and be capable of pointing multiple beams simultaneously. Obtaining the sensitivity, resolution and dynamic range supported by the receivers requires the development of new signal processing techniques for array and atmospheric calibration, as well as new imaging techniques that are both more accurate and more computationally efficient, since data volumes will be much larger. This paper provides a tutorial overview of existing image formation techniques and outlines some of the future directions needed for information extraction from future radio telescopes. We describe the imaging process from the measurement equation through to deconvolution, both as a Fourier inversion problem and as an array processing estimation problem. The latter formulation enables the development of more advanced techniques based on state-of-the-art array processing. We demonstrate the techniques on simulated and measured radio telescope data.
    Comment: 12 pages
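    The Fourier-inversion view of imaging mentioned above can be sketched in a few lines: grid the sampled visibilities, inverse-FFT, and read off the dirty image (the true sky convolved with the array's point-spread function). This is an illustrative sketch with a made-up, 30%-filled uv-grid, not code from the paper:

```python
import numpy as np

def dirty_image(vis_grid):
    """Invert a centred, gridded visibility plane to the (dirty) image plane."""
    # ifftshift/fftshift keep the phase centre in the middle of the image.
    return np.real(np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(vis_grid))))

n = 64
sky = np.zeros((n, n))
sky[40, 24] = 1.0                                  # a single point source
vis = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(sky)))
rng = np.random.default_rng(0)
sampling = rng.random((n, n)) < 0.3                # sampling function: ~30% uv coverage
image = dirty_image(vis * sampling)                # sky convolved with the PSF
peak = np.unravel_index(np.argmax(image), image.shape)
```

    Because only one source is present, the dirty image is a shifted copy of the PSF and its peak lands at the source position; real deconvolution (CLEAN and its relatives) must then disentangle overlapping PSF copies from many sources.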

    Mammographic image restoration using maximum entropy deconvolution

    An image restoration approach based on a Bayesian maximum entropy method (MEM) has been applied to a radiological image deconvolution problem: the reduction of geometric blurring in magnification mammography. The aim of the work is to demonstrate an improvement in image spatial resolution in realistic, noisy radiological images with no associated penalty in terms of a reduction in the signal-to-noise ratio perceived by the observer. Images of the TORMAM mammographic image quality phantom were recorded using the standard magnification setting of 1.8 magnification/fine focus, and also at 1.8 magnification/broad focus and 3.0 magnification/fine focus; the latter two arrangements would normally give rise to unacceptable geometric blurring. Measured point-spread functions were used in conjunction with the MEM image processing to de-blur these images. The results are presented as comparative images of phantom test features and as observer scores for the raw and processed images. Visualization of high-resolution features and the total image scores for the test phantom were improved by the application of the MEM processing. It is argued that this successful demonstration of image de-blurring in noisy radiological images offers the possibility of weakening the link between focal spot size and geometric blurring in radiology, thus opening up new approaches to system optimization.
    Comment: 18 pages, 10 figures
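    The MEM optimization underlying this approach can be illustrated with a toy one-dimensional sketch: plain gradient descent on a chi-squared-plus-entropy objective with a flat default model. This is our invented illustration, not the paper's Bayesian implementation, and all parameter values are arbitrary demo choices:

```python
import numpy as np

def mem_deconvolve(data, psf, alpha=0.005, step=0.2, n_iter=3000):
    """Gradient descent on  Q(f) = 0.5*||psf (*) f - data||^2 - alpha*S(f),
    with entropy S(f) = -sum_i f_i ln(f_i / m_i) and a flat default model m;
    (*) denotes circular convolution, evaluated via the FFT."""
    n = len(data)
    m = np.full(n, max(data.sum(), 1e-8) / n)      # flat default model
    H = np.fft.fft(psf)
    f = m.copy()
    for _ in range(n_iter):
        resid = np.real(np.fft.ifft(H * np.fft.fft(f))) - data
        grad = np.real(np.fft.ifft(np.conj(H) * np.fft.fft(resid)))
        grad += alpha * (np.log(f / m) + 1.0)      # gradient of -alpha*S
        f = np.clip(f - step * grad, 1e-8, None)   # keep the image positive
    return f

# Blur a point source with a Gaussian "focal spot", then deconvolve.
n = 100
truth = np.zeros(n)
truth[40] = 1.0
psf = np.exp(-0.5 * ((np.arange(n) - n // 2) / 2.0) ** 2)
psf = np.roll(psf / psf.sum(), -n // 2)            # centre the PSF at index 0
data = np.real(np.fft.ifft(np.fft.fft(psf) * np.fft.fft(truth)))
restored = mem_deconvolve(data, psf)
```

    The entropy term keeps the restored image positive and pulls unconstrained pixels toward the default model, which is what lets MEM sharpen features without amplifying noise into negative artefacts.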

    The application of compressive sampling to radio astronomy I: Deconvolution

    Compressive sampling is a new paradigm for sampling, based on the sparseness of signals or signal representations. It is much less restrictive than Nyquist-Shannon sampling theory and thus explains and systematises the widespread experience that methods such as Högbom CLEAN can violate the Nyquist-Shannon sampling requirements. In this paper, a CS-based deconvolution method for extended sources is introduced. This method can reconstruct both point sources and extended sources, using the isotropic undecimated wavelet transform as the basis for the reconstruction step. We compare this CS-based deconvolution method with two CLEAN-based deconvolution methods: Högbom CLEAN and multiscale CLEAN. The new method shows the best performance in deconvolving extended sources for both uniform and natural weighting of the sampled visibilities. Both visual and numerical results of the comparison are provided.
    Comment: Published by A&A. Matlab code can be found at: http://code.google.com/p/csra/download
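    To make the CS flavour concrete, here is a minimal, hypothetical sketch of sparse deconvolution by iterative soft thresholding (ISTA) in one dimension. It stands in for, but is not, the paper's method: it uses a point-source (identity) basis rather than the isotropic undecimated wavelet transform, and a toy Gaussian PSF:

```python
import numpy as np

def ista_deconvolve(dirty, psf, lam=0.01, n_iter=500):
    """Approximately solve  min_x 0.5*||psf (*) x - dirty||^2 + lam*||x||_1,
    where (*) is circular convolution, by iterative soft thresholding."""
    H = np.fft.fft(psf)
    step = 1.0 / np.max(np.abs(H)) ** 2              # 1 / Lipschitz constant
    x = np.zeros_like(dirty)
    for _ in range(n_iter):
        resid = np.real(np.fft.ifft(H * np.fft.fft(x))) - dirty
        grad = np.real(np.fft.ifft(np.conj(H) * np.fft.fft(resid)))
        x = x - step * grad
        x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)  # soft threshold
    return x

# Two point sources blurred by a Gaussian PSF (circular convolution).
n = 128
truth = np.zeros(n)
truth[30], truth[90] = 1.0, 0.6
psf = np.exp(-0.5 * ((np.arange(n) - n // 2) / 2.0) ** 2)
psf = np.roll(psf / psf.sum(), -n // 2)              # centre the PSF at index 0
dirty = np.real(np.fft.ifft(np.fft.fft(psf) * np.fft.fft(truth)))
model = ista_deconvolve(dirty, psf)
```

    With noiseless data and well-separated sources, the l1-regularized solution is two slightly shrunk spikes at the true positions; swapping the identity basis for a wavelet synthesis operator is what extends this idea to extended emission.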

    Smear fitting: a new deconvolution method for interferometric data

    A new technique is presented for producing images from interferometric data. The method, "smear fitting", makes the constraints necessary for interferometric imaging double as a model, with uncertainties, of the sky brightness distribution. It does this by modelling the sky with a set of functions and then convolving each component with its own elliptical Gaussian to account for the uncertainty in its shape and location that arises from noise. This yields much sharper resolution than CLEAN for significantly detected features, without sacrificing any sensitivity. Using appropriate functional forms for the components provides both a scientifically interesting model and imaging constraints that tend to be better than those used by traditional deconvolution methods. This allows it to avoid the most serious problems that limit the imaging quality of those methods. Comparisons of smear fitting to CLEAN and maximum entropy are given, using both real and simulated observations. It is also shown that the famous Rayleigh criterion (resolution = wavelength / baseline) is inappropriate for interferometers, as it does not consider the reliability of the measurements.
    Comment: 16 pages, 38 figures (some have been lossily compressed for astro-ph). Uses the hyperref LaTeX package. Accepted for publication by the Monthly Notices of the Royal Astronomical Society.

    Jump-sparse and sparse recovery using Potts functionals

    We recover jump-sparse and sparse signals from blurred, incomplete data corrupted by (possibly non-Gaussian) noise using inverse Potts energy functionals. We obtain analytical results (existence of minimizers, complexity) on inverse Potts functionals and provide relations to sparsity problems. We then propose a new optimization method for these functionals based on dynamic programming and the alternating direction method of multipliers (ADMM). A series of experiments shows that the proposed method yields very satisfactory jump-sparse and sparse reconstructions, respectively. We highlight the capability of the method by comparing it with classical and recent approaches such as TV minimization (jump-sparse signals), orthogonal matching pursuit, iterative hard thresholding, and iteratively reweighted ℓ1 minimization (sparse signals).
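    The jump-sparse Potts functional can be made concrete with a small sketch. For pure denoising (identity measurement operator, L2 data term), the 1-D Potts problem is solvable exactly by the classical O(n^2) dynamic program over segment boundaries; the paper's contribution is the harder blurred, incomplete-data, non-Gaussian case via ADMM, which this toy omits. Names and parameter values are ours:

```python
import numpy as np

def potts_1d(y, gamma):
    """Exact minimizer of the 1-D inverse Potts functional
         gamma * (number of jumps of x) + ||x - y||^2
    via dynamic programming over the last segment boundary."""
    n = len(y)
    s1 = np.concatenate(([0.0], np.cumsum(y)))       # prefix sums of y
    s2 = np.concatenate(([0.0], np.cumsum(y ** 2)))  # prefix sums of y^2

    def dev(l, r):
        # Sum of squared deviations of y[l:r] from its mean.
        return s2[r] - s2[l] - (s1[r] - s1[l]) ** 2 / (r - l)

    best = np.full(n + 1, np.inf)   # best[r] = optimal energy for y[:r]
    best[0] = -gamma                # so the first segment incurs no jump penalty
    prev = np.zeros(n + 1, dtype=int)
    for r in range(1, n + 1):
        for l in range(r):
            cand = best[l] + gamma + dev(l, r)
            if cand < best[r]:
                best[r], prev[r] = cand, l
    x, r = np.empty(n), n           # backtrack; fill each segment with its mean
    while r > 0:
        l = prev[r]
        x[l:r] = (s1[r] - s1[l]) / (r - l)
        r = l
    return x

# A noisy piecewise-constant signal with two jumps.
rng = np.random.default_rng(1)
y = np.concatenate([np.zeros(20), np.ones(20), 0.3 * np.ones(20)])
y = y + 0.05 * rng.standard_normal(60)
x = potts_1d(y, gamma=0.5)
```

    Unlike TV minimization, the Potts penalty charges per jump rather than per jump height, so large steps are not shrunk; the price is a non-convex problem, which is why the 1-D dynamic program is such a useful building block inside ADMM.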