
    Time-Regularized Blind Deconvolution Approach for Radio Interferometry


    Image Reconstruction in Optical Interferometry

    This tutorial paper describes the problem of image reconstruction from interferometric data, with a particular focus on the specific problems encountered at optical (visible/IR) wavelengths. The challenging issues in image reconstruction from interferometric data are introduced in the general framework of the inverse-problem approach. This framework is then used to describe existing image reconstruction algorithms in radio interferometry and the new methods specifically developed for optical interferometry. Comment: accepted for publication in IEEE Signal Processing Magazine.
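    A minimal sketch of the regularized inverse-problem formulation that such tutorials build on, assuming a toy 1-D image, a forward operator given by a sampled discrete Fourier transform, and a simple quadratic smoothness regularizer (all names and parameter values below are illustrative, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

n = 64                      # number of image pixels (toy 1-D example)
m = 20                      # number of sampled spatial frequencies
x_true = np.zeros(n)
x_true[[10, 30, 45]] = [1.0, 0.5, 0.8]   # sparse toy "sky"

# Forward operator: visibilities are samples of the Fourier transform of the image.
freqs = rng.choice(n, size=m, replace=False)
F = np.exp(-2j * np.pi * np.outer(freqs, np.arange(n)) / n)   # sampled DFT matrix
y = F @ x_true + 0.01 * (rng.normal(size=m) + 1j * rng.normal(size=m))

# Regularized least squares: minimize ||F x - y||^2 + mu ||D x||^2,
# where D is a finite-difference (smoothness) operator.
mu = 0.1
D = np.eye(n) - np.roll(np.eye(n), 1, axis=1)

# Normal equations of this convex quadratic problem (for a real-valued image).
A = (F.conj().T @ F).real + mu * D.T @ D
b = (F.conj().T @ y).real
x_hat = np.linalg.solve(A, b)
print("reconstruction error:", np.linalg.norm(x_hat - x_true))
```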

    Blind Demixing for Low-Latency Communication

    In next-generation wireless networks, low-latency communication is critical to support emerging diversified applications, e.g., the Tactile Internet and Virtual Reality. In this paper, a novel blind demixing approach is developed to reduce the channel signaling overhead, thereby supporting low-latency communication. Specifically, we develop a low-rank approach to recover the original information based only on a single observed vector, without any channel estimation. Unfortunately, this problem turns out to be a highly intractable non-convex optimization problem due to the multiple non-convex rank-one constraints. To address these unique challenges, the quotient manifold geometry of the product of complex asymmetric rank-one matrices is exploited by equivalently reformulating the original complex asymmetric matrices as Hermitian positive semidefinite matrices. We further generalize the geometric concepts of the complex product manifolds via element-wise extension of the geometric concepts of the individual manifolds. A scalable Riemannian trust-region algorithm is then developed to solve the blind demixing problem efficiently, with fast convergence rates and low iteration cost. Numerical results demonstrate the algorithmic advantages and admirable performance of the proposed algorithm compared with state-of-the-art methods. Comment: 14 pages, accepted by IEEE Transactions on Wireless Communications.
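    An illustrative numerical check of the lifting idea behind such rank-one reformulations, assuming a toy measurement model y = sum_k (b^H h_k)(x_k^H a) over users k (the paper's exact model and notation may differ): each bilinear measurement becomes linear in the rank-one matrices M_k = h_k x_k^H.

```python
import numpy as np

rng = np.random.default_rng(1)
K, N, S = 8, 16, 3          # channel length, signal length, number of users (toy sizes)

h = [rng.normal(size=K) + 1j * rng.normal(size=K) for _ in range(S)]   # unknown channels
x = [rng.normal(size=N) + 1j * rng.normal(size=N) for _ in range(S)]   # unknown signals
b = rng.normal(size=K) + 1j * rng.normal(size=K)                       # known sensing vector
a = rng.normal(size=N) + 1j * rng.normal(size=N)                       # known sensing vector

# Bilinear measurement: sum_k (b^H h_k)(x_k^H a)
y_bilinear = sum((b.conj() @ h[k]) * (x[k].conj() @ a) for k in range(S))

# Lifted form: the same measurement is linear in the rank-one matrices M_k = h_k x_k^H,
# via a Frobenius inner product with the known matrix b a^H.
M = [np.outer(h[k], x[k].conj()) for k in range(S)]
y_lifted = sum(np.vdot(np.outer(b, a.conj()), M[k]) for k in range(S))

assert np.allclose(y_bilinear, y_lifted)
```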

    A non-convex perspective on calibration and imaging in radio interferometry


    Deep Radio Interferometric Imaging with POLISH: DSA-2000 and weak lensing

    Radio interferometry allows astronomers to probe small spatial scales that are often inaccessible with single-dish instruments. However, recovering the radio sky from an interferometer is an ill-posed deconvolution problem that astronomers have worked on for half a century. More challenging still is achieving resolution below the array's diffraction limit, known as super-resolution imaging. To this end, we have developed a new learning-based approach for radio interferometric imaging, leveraging recent advances in the classical computer vision problems of single-image super-resolution (SISR) and deconvolution. We have developed and trained a high dynamic range residual neural network to learn the mapping between the dirty image and the true radio sky. We call this procedure POLISH, in contrast to the traditional CLEAN algorithm. The feed-forward nature of learning-based approaches like POLISH is critical for analyzing data from the upcoming Deep Synoptic Array (DSA-2000). We show that POLISH achieves super-resolution, and we demonstrate its ability to deconvolve real observations from the Very Large Array (VLA). Super-resolution on DSA-2000 will allow us to measure the shapes and orientations of several hundred million star-forming radio galaxies (SFGs), making it a powerful cosmological weak lensing survey and probe of dark energy. We forecast its ability to constrain the lensing power spectrum, finding that it will be complementary to next-generation optical surveys such as Euclid.
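    A minimal sketch of the forward problem that a network like POLISH is trained to invert, assuming an idealized 2-D model in which the dirty image is the inverse transform of incompletely sampled visibilities (array geometry, gridding and noise are ignored; all names below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 128

# Toy "true sky": a few point sources.
sky = np.zeros((n, n))
for _ in range(20):
    i, j = rng.integers(0, n, size=2)
    sky[i, j] = rng.uniform(0.1, 1.0)

# Incomplete uv-coverage: keep only a random subset of spatial frequencies.
mask = rng.random((n, n)) < 0.15

# Dirty image = inverse FFT of the masked visibilities.
vis = np.fft.fft2(sky) * mask
dirty = np.fft.ifft2(vis).real

# Dirty beam (PSF): response to a single point source at the phase center.
psf = np.fft.ifft2(mask.astype(float)).real

# A learning-based approach such as POLISH would be trained on many (dirty, sky)
# pairs to approximate the inverse mapping dirty -> sky.
```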

    Identifying synergies between VLBI and STIX imaging

    Reconstructing an image from sparsely sampled Fourier data is an ill-posed inverse problem that occurs in a variety of subjects within science, including the data analysis for Very Long Baseline Interferometry (VLBI) and the Spectrometer/Telescope for Imaging X-rays (STIX) for solar observations. Despite ongoing parallel developments of novel imaging algorithms, synergies remain unexplored. We study the synergies between the data analysis for the STIX instrument and VLBI, compare the methodologies, and evaluate their potential. In this way, we identify key trends in the performance of several algorithmic ideas and draw recommendations for the future. To this end, we organized a semi-blind imaging challenge with data sets and source structures that are typical for sparse VLBI, specifically in the context of the Event Horizon Telescope (EHT), and for STIX observations. 17 different algorithms from both communities, from 6 different imaging frameworks, participated in the challenge, making this the largest-scale code comparison for STIX and VLBI to date. Strong synergies between the two communities have been identified, as demonstrated by the success of the imaging methods proposed for STIX in imaging VLBI data sets and vice versa. Novel imaging methods outperform the standard CLEAN algorithm significantly in every test case. These improvements over CLEAN call for deeper updates to the inverse modeling pipeline, or ultimately for replacing inverse modeling with forward modeling. Entropy-based and Bayesian methods perform best on STIX data. The more complex imaging algorithms utilizing multiple regularization terms (recently proposed for VLBI) add little to no additional improvement for STIX, but outperform the other methods on EHT data. This work demonstrates the great synergy between the STIX and VLBI imaging efforts and the great potential for common developments. Comment: accepted for publication in A&A.
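    For reference, a compact sketch of the standard Högbom CLEAN loop that the novel methods are compared against, assuming a dirty image and dirty beam from an idealized masked-Fourier forward model (a simplified illustration, not one of the production implementations used in the challenge):

```python
import numpy as np

def hogbom_clean(dirty, psf, gain=0.1, n_iter=500, threshold=1e-3):
    """Very simplified Hogbom CLEAN: iteratively subtract scaled, shifted copies
    of the dirty beam at the location of the current residual peak."""
    residual = dirty.copy()
    model = np.zeros_like(dirty)
    peak_psf = psf[0, 0]                     # assumes the PSF peak sits at pixel (0, 0)
    for _ in range(n_iter):
        i, j = np.unravel_index(np.argmax(np.abs(residual)), residual.shape)
        peak = residual[i, j]
        if np.abs(peak) < threshold:
            break
        model[i, j] += gain * peak
        # Subtract the shifted, scaled dirty beam from the residual.
        shifted_psf = np.roll(np.roll(psf, i, axis=0), j, axis=1)
        residual -= gain * peak * shifted_psf / peak_psf
    return model, residual
```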

    Direct 3D Tomographic Reconstruction and Phase-Retrieval of Far-Field Coherent Diffraction Patterns

    We present an alternative numerical reconstruction algorithm for direct tomographic reconstruction of a sample's refractive indices from the measured intensities of its far-field coherent diffraction patterns. We formulate the well-known phase-retrieval problem in ptychography in a tomographic framework, which allows for simultaneous reconstruction of the illumination function and the sample refractive indices in three dimensions. Our iterative reconstruction algorithm is based on the Levenberg-Marquardt algorithm. We demonstrate the performance of the proposed method with simulation studies.
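    A minimal sketch of the Levenberg-Marquardt iteration underlying such a reconstruction, shown for a generic real-valued least-squares problem (the actual algorithm operates on the ptychographic forward model and complex refractive indices; the damping strategy and names below are illustrative):

```python
import numpy as np

def levenberg_marquardt(residual, jacobian, x0, lam=1e-2, n_iter=50):
    """Minimize 0.5 * ||r(x)||^2 with damped Gauss-Newton (Levenberg-Marquardt) steps."""
    x = x0.copy()
    for _ in range(n_iter):
        r = residual(x)
        J = jacobian(x)
        # Solve the damped normal equations (J^T J + lam I) dx = -J^T r.
        A = J.T @ J + lam * np.eye(x.size)
        dx = np.linalg.solve(A, -J.T @ r)
        if 0.5 * np.sum(residual(x + dx) ** 2) < 0.5 * np.sum(r ** 2):
            x = x + dx
            lam *= 0.7          # accept step, relax damping
        else:
            lam *= 2.0          # reject step, increase damping
    return x

# Example: fit y = exp(-a * t) to noisy data (a is the unknown parameter).
t = np.linspace(0, 4, 40)
y = np.exp(-1.3 * t) + 0.01 * np.random.default_rng(3).normal(size=t.size)
res = lambda p: np.exp(-p[0] * t) - y
jac = lambda p: (-t * np.exp(-p[0] * t)).reshape(-1, 1)
print(levenberg_marquardt(res, jac, np.array([0.5])))
```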

    High Quality 3D Shape Reconstruction via Digital Refocusing and Pupil Apodization in Multi-wavelength Holographic Interferometry.

    Multi-wavelength holographic interferometry (MWHI) has good potential for evolving into a high-quality 3D shape reconstruction technique. Several challenges remain, including 1) the depth-of-field limitation, which leads to axial inaccuracy for out-of-focus objects; and 2) smearing from shiny smooth objects onto their dark dull neighbors, which generates fake measurements within the dark area. This research is motivated by the goal of developing an advanced optical metrology system that provides accurate 3D profiles for target objects whose axial dimension is larger than the depth of field, and for objects with dramatically different surface conditions. Digital refocusing in MWHI is proposed as a solution to the depth-of-field limitation. On the one hand, the traditional single-wavelength refocusing formula is revised to reduce sensitivity to wavelength error; investigation of a real example demonstrates promising accuracy and repeatability of the reconstructed 3D profiles. On the other hand, a phase-contrast-based focus detection criterion is developed specifically for MWHI, which overcomes the problem of phase unwrapping. The combination of these two innovations yields a systematic strategy for acquiring high-quality 3D profiles: after the phase-contrast-based focus detection step, interferometric distance measurement by MWHI is used to perform relative focus detection with high accuracy. This strategy produces ±100 mm 3D profiles with micron-level axial accuracy, which is not available in traditional extended focus image (EFI) solutions. Pupil apodization is implemented to address the second challenge, smearing. The process of inspecting reflective rough surfaces is mathematically modeled, which explains the origin of the stray light and the necessity of replacing the hard-edged pupil with one of gradually attenuating transmission (apodization). Metrics to optimize pupil types and parameters are chosen specifically for MWHI. A Gaussian apodized pupil has been installed and tested, and a reduction of smearing in the measurement results has been experimentally demonstrated. Ph.D. dissertation, Mechanical Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/91461/1/xulium_1.pd
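    A schematic sketch of the two ingredients discussed above, digital refocusing and pupil apodization, assuming a standard angular-spectrum propagation model for a sampled complex field (a generic illustration; the dissertation's revised multi-wavelength formula and optimized pupil parameters are not reproduced here):

```python
import numpy as np

def angular_spectrum_refocus(field, wavelength, pixel_size, dz):
    """Numerically propagate a complex field by a distance dz (angular spectrum method)."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel_size)
    fy = np.fft.fftfreq(ny, d=pixel_size)
    FX, FY = np.meshgrid(fx, fy)
    # Free-space transfer function; evanescent components are suppressed.
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    H = np.exp(1j * 2 * np.pi / wavelength * dz * np.sqrt(np.maximum(arg, 0.0)))
    H[arg < 0] = 0.0
    return np.fft.ifft2(np.fft.fft2(field) * H)

def gaussian_apodized_pupil(n, radius_px, sigma_px):
    """Pupil with gradually attenuating transmission instead of a hard edge."""
    y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
    r = np.hypot(x, y)
    return np.exp(-0.5 * (r / sigma_px) ** 2) * (r <= radius_px)
```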

    Advanced VLBI Imaging

    Very Long Baseline Interferometry (VLBI) is an observational technique developed in astronomy for combining multiple radio telescopes into a single virtual instrument with an effective aperture reaching up to many thousands of kilometers, enabling measurements at the highest angular resolutions. Celebrated examples of applying VLBI to astrophysical studies include detailed, high-resolution images of the innermost parts of relativistic outflows (jets) in active galactic nuclei (AGN) and the recent pioneering observations of the shadows of supermassive black holes (SMBH) in the center of our Galaxy and in the galaxy M87. Despite these and many other proven successes of VLBI, analysis and imaging of VLBI data remain difficult, owing in part to the fact that VLBI imaging inherently constitutes an ill-posed inverse problem. Historically, this problem has been addressed in radio interferometry by the CLEAN algorithm, a matching-pursuit inverse modeling method developed in the early 1970s and since then established as the de facto standard approach for imaging VLBI data. In recent years, the constantly increasing demand for improving the quality and fidelity of interferometric image reconstruction has resulted in several attempts to employ new approaches, such as forward modeling and Bayesian estimation, for VLBI imaging. While the current state-of-the-art forward modeling and Bayesian techniques may outperform CLEAN in terms of accuracy, resolution, robustness, and adaptability, they also tend to require more complex structures and longer computation times, and rely on extensive fine-tuning of a larger number of non-trivial hyperparameters. This leaves ample room for further searches for potentially more effective imaging approaches and provides the main motivation for this dissertation and its particular focus on the need to unify algorithmic frameworks and to study VLBI imaging from the perspective of inverse problems in general. In pursuit of this goal, and based on an extensive qualitative comparison of the existing methods, this dissertation comprises the development, testing, and first implementations of two novel concepts for improved interferometric image reconstruction. The concepts combine the known benefits of current forward modeling techniques, develop more automatic and less supervised algorithms for image reconstruction, and realize them within two different frameworks. The first framework unites multiscale imaging algorithms in the spirit of compressive sensing with a dictionary adapted to the uv-coverage and its defects (DoG-HiT, DoB-CLEAN). We extend this approach to dynamical imaging and polarimetric imaging. The core components of this framework are realized in the multidisciplinary, multipurpose software package MrBeam, developed as part of this dissertation. The second framework employs a multiobjective genetic evolutionary algorithm (MOEA/D) to achieve fully unsupervised image reconstruction and hyperparameter optimization. These new methods are shown to outperform existing methods in various metrics such as angular resolution, structural sensitivity, and degree of supervision. We demonstrate the great potential of these new techniques with selected applications to frontline VLBI observations of AGN jets and SMBH.
    In addition to improving the quality and robustness of image reconstruction, DoG-HiT, DoB-CLEAN and MOEA/D also provide novel capabilities such as dynamic reconstruction of polarimetric images on minute time-scales, or near-real-time and unsupervised data analysis (useful in particular for application to large imaging surveys). The techniques and software developed in this dissertation are of interest for a wider range of inverse problems as well. This includes such diverse fields as Ly-alpha tomography (where we improve estimates of the thermal state of the intergalactic medium), the cosmographic search for dark matter (where we improve forecasted bounds on ultralight dilatons), medical imaging, and solar spectroscopy.
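    A simplified sketch in the spirit of the multiscale difference-of-Gaussian dictionary used by DoG-HiT, assuming a fixed set of isotropic scales and plain hard thresholding of the multiscale coefficients (the actual method adapts the dictionary to the uv-coverage and is considerably more elaborate):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def dog_decompose(image, sigmas=(1, 2, 4, 8, 16)):
    """Decompose an image into difference-of-Gaussian (DoG) scales plus a smooth residual."""
    smoothed = [image] + [gaussian_filter(image, s) for s in sigmas]
    scales = [smoothed[i] - smoothed[i + 1] for i in range(len(sigmas))]
    return scales, smoothed[-1]

def dog_hard_threshold(image, threshold, sigmas=(1, 2, 4, 8, 16)):
    """Keep only significant multiscale coefficients and re-synthesize the image."""
    scales, residual = dog_decompose(image, sigmas)
    kept = [np.where(np.abs(s) > threshold, s, 0.0) for s in scales]
    return sum(kept) + residual
```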

    Advanced sparse optimization algorithms for interferometric imaging inverse problems in astronomy

    In the quest to produce images of the sky at unprecedented resolution and high sensitivity, a new generation of astronomical interferometers has been designed. To meet the sensing capabilities of these instruments, techniques to recover the sought images from incompletely sampled Fourier-domain measurements need to be reinvented. This goes hand in hand with the necessity to calibrate the unknown effects modulating the measurements, which adversely affect image quality and limit its dynamic range. The contribution of this thesis consists of the development of advanced optimization techniques tailored to address these issues, ranging from radio interferometry (RI) to optical interferometry (OI). In the context of RI, we propose a novel convex optimization approach for full polarization imaging relying on sparsity-promoting regularizations. Unlike standard RI imaging algorithms, our method jointly solves for the Stokes images by enforcing the polarization constraint, which imposes a physical dependency between the images. These priors are shown to enhance the imaging quality in various numerical studies. The proposed imaging approach also benefits from its scalability to handle the huge amounts of data expected from the new instruments. To deal with the critical and challenging issue of calibrating direction-dependent effects, we further propose a non-convex optimization technique that unifies the calibration and imaging steps in a global framework, in which we adapt the earlier developed imaging method for the imaging step. In contrast to existing RI calibration modalities, our method benefits from well-established convergence guarantees even in the non-convex setting considered in this work, and its efficiency is demonstrated through several numerical experiments. Last but not least, inspired by the performance of these methodologies and drawing ideas from them, we aim to solve the image recovery problem in OI, which poses its own set of challenges, primarily due to the partial loss of phase information. To this end, we propose a sparsity-regularized non-convex optimization algorithm that is equipped with convergence guarantees and is adaptable to both monochromatic and hyperspectral OI imaging. We validate it with simulation results.
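    A minimal sketch of the kind of sparsity-promoting solver described above, assuming the simplest setting of monochromatic intensity imaging from masked Fourier measurements solved with iterative soft-thresholding (the thesis develops far more elaborate primal-dual and non-convex algorithms with polarization constraints and calibration; everything below is illustrative):

```python
import numpy as np

def ista_fourier(vis, mask, mu=0.01, n_iter=200):
    """l1-regularized imaging from masked Fourier measurements:
       minimize 0.5 * ||mask * FFT(x) - vis||^2 + mu * ||x||_1,  subject to x >= 0."""
    x = np.zeros(mask.shape)
    step = 1.0          # the masked orthonormal-FFT operator has spectral norm <= 1
    for _ in range(n_iter):
        # Gradient of the data-fidelity term: A^H (A x - vis).
        grad = np.fft.ifft2(mask * (mask * np.fft.fft2(x, norm="ortho") - vis),
                            norm="ortho").real
        x = x - step * grad
        x = np.sign(x) * np.maximum(np.abs(x) - step * mu, 0.0)   # soft threshold (l1 prox)
        x = np.maximum(x, 0.0)                                    # positivity constraint
    return x
```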