
    CHARA/MIRC observations of two M supergiants in Perseus OB1: temperature, Bayesian modeling, and compressed sensing imaging

    Two red supergiants of the Per OB1 association, RS Per and T Per, have been observed in the H band with the MIRC instrument at the CHARA array. The data show clear evidence of departure from circular symmetry. We present here new techniques specially developed to analyze such cases, based on state-of-the-art statistical frameworks. The stellar surfaces are first modeled as limb-darkened discs based on SATLAS models that fit both the MIRC interferometric data and publicly available spectrophotometric data. Bayesian model selection is then used to determine the most probable number of spots. The effective surface temperatures are also determined and give further support to the recently derived hotter temperature scales of red supergiants. The stellar surfaces are reconstructed by our model-independent imaging code SQUEEZE, making use of its novel regularizer based on Compressed Sensing theory. We find excellent agreement between the model-selection results and the reconstructions. Our results provide evidence for the presence of near-infrared spots representing about 3-5% of the stellar flux.
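    The Bayesian model-selection step can be illustrated with a minimal sketch: fit models with increasing numbers of spots and compare them with the Bayesian Information Criterion, a crude asymptotic proxy for the model evidence. The parameter counts, data size, and likelihood values below are invented for illustration; the paper's actual analysis uses a full Bayesian framework, not the BIC shortcut shown here.

```python
import numpy as np

def bic(log_likelihood, n_params, n_data):
    """Bayesian Information Criterion: -2*logL + k*log(N), an asymptotic
    stand-in for -2*log(evidence); lower is better."""
    return -2.0 * log_likelihood + n_params * np.log(n_data)

def select_n_spots(log_likelihoods, params_per_spot=4, base_params=3, n_data=500):
    """Pick the spot count whose model minimizes the BIC.

    log_likelihoods[k] is the best-fit log-likelihood of a model with k
    spots; each hypothetical spot adds e.g. (x, y, radius, flux) = 4
    parameters on top of the limb-darkened-disc parameters (base_params).
    """
    scores = [bic(ll, base_params + k * params_per_spot, n_data)
              for k, ll in enumerate(log_likelihoods)]
    return int(np.argmin(scores)), scores

# Toy numbers: the fit improves with each added spot but saturates after
# two, so the complexity penalty makes the two-spot model most probable.
lls = [-1200.0, -1100.0, -1050.0, -1048.0, -1047.5]
best_k, scores = select_n_spots(lls)
```

    The same comparison carries over to any nested family of surface models; only the likelihood evaluation (here left abstract) depends on the interferometric data.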

    Bayesian Image Quality Transfer with CNNs: Exploring Uncertainty in dMRI Super-Resolution

    In this work, we investigate the value of uncertainty modeling in 3D super-resolution with convolutional neural networks (CNNs). Deep learning has shown success in a plethora of medical image transformation problems, such as super-resolution (SR) and image synthesis. However, the highly ill-posed nature of such problems results in inevitable ambiguity in the learning of networks. We propose to account for intrinsic uncertainty through a per-patch heteroscedastic noise model and for parameter uncertainty through approximate Bayesian inference in the form of variational dropout. We show that the combined benefits of both lead to state-of-the-art performance in SR of diffusion MR brain images in terms of errors compared to ground truth. We further show that the reduced error scores produce tangible benefits in downstream tractography. In addition, the probabilistic nature of the methods naturally confers a mechanism to quantify uncertainty over the super-resolved output. We demonstrate through experiments on both healthy and pathological brains the potential utility of such an uncertainty measure in the risk assessment of the super-resolved images for subsequent clinical use. Comment: Accepted paper at MICCAI 201
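    The intrinsic-uncertainty component can be sketched as a per-patch heteroscedastic Gaussian likelihood: the network predicts a mean and a log-variance for each patch, and the loss rewards admitting large variance where the prediction is poor. This NumPy toy only shows the loss's shape; the names and values are illustrative, and the paper's parameterization (and its variational-dropout component for parameter uncertainty) is more involved.

```python
import numpy as np

def heteroscedastic_nll(y_true, y_pred, log_var):
    """Gaussian negative log-likelihood with a predicted per-patch variance.

    The model outputs both a mean y_pred and a log-variance log_var, so
    patches where super-resolution is intrinsically ambiguous can be given
    a larger variance instead of being forced to one confident answer.
    """
    precision = np.exp(-log_var)
    return np.mean(0.5 * precision * (y_true - y_pred) ** 2 + 0.5 * log_var)

y = np.array([1.0, 2.0, 3.0])
pred = np.array([1.1, 1.9, 4.0])  # large error on the third patch

# Uniformly confident (small variance everywhere):
loss_confident = heteroscedastic_nll(y, pred, np.full(3, -2.0))
# Admitting more uncertainty on the badly predicted third patch:
loss_hedged = heteroscedastic_nll(y, pred, np.array([-2.0, -2.0, 0.0]))
```

    Because the hedged variance pattern matches the actual error pattern, it attains the lower loss; this is the mechanism that lets the trained log-variance map serve as an uncertainty estimate.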

    Image reconstruction in optical interferometry: Benchmarking the regularization

    With the advent of infrared long-baseline interferometers with more than two telescopes, both the size and the completeness of interferometric data sets have significantly increased, allowing images based on models with no a priori assumptions to be reconstructed. Our main objective is to analyze the multiple parameters of the image reconstruction process, with particular attention to the regularization term and the study of their behavior in different situations. The secondary goal is to derive practical rules for the users. Using the Multi-aperture image Reconstruction Algorithm (MiRA), we performed multiple systematic tests, analyzing 11 commonly used regularization terms. The tests are made on different astrophysical objects, different (u,v) plane coverages, and several signal-to-noise ratios to determine the minimal configuration needed to reconstruct an image. We establish a methodology and introduce the mean-square error (MSE) to discuss the results. From the ~24000 simulations performed for the benchmarking of image reconstruction with MiRA, we are able to classify the different regularizations in the context of the observations. We find typical values of the regularization weight. A minimal (u,v) coverage is required to reconstruct an acceptable image, whereas no limits are found for the studied values of the signal-to-noise ratio. We also show that super-resolution can be achieved, with performance increasing as the (u,v) coverage fills. Image reconstruction with a sufficient (u,v) coverage is shown to be reliable. The choice of the main parameters of the reconstruction is tightly constrained. We recommend that efforts to develop interferometric infrastructures should concentrate first on the number of telescopes to combine, and second on improving the accuracy and sensitivity of the arrays. Comment: 15 pages, 16 figures; accepted in A&A
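    The MSE comparison used for benchmarking reconstructions can be sketched as below. Normalizing both images to unit total flux before comparing is one common convention in interferometric imaging (absolute flux is often poorly constrained); the paper's exact metric may differ in detail.

```python
import numpy as np

def mse_score(reconstruction, truth):
    """Mean-square error between a reconstruction and the truth image,
    after normalizing both to unit total flux so the score ignores an
    overall flux scale. Lower is better."""
    r = reconstruction / reconstruction.sum()
    t = truth / truth.sum()
    return float(np.mean((r - t) ** 2))

truth = np.array([[0.0, 4.0], [4.0, 0.0]])
good = np.array([[0.5, 3.5], [3.5, 0.5]])   # close to the truth
bad = np.array([[4.0, 0.0], [0.0, 4.0]])    # flux in the wrong pixels
```

    Ranking many reconstructions by such a score, over many objects and (u,v) coverages, is what allows the regularizers to be compared systematically.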

    The 2010 Interferometric Imaging Beauty Contest

    We present the results of the fourth Optical/IR Interferometry Imaging Beauty Contest. The contest consists of blind imaging of test data sets derived from model sources and distributed in the OI-FITS format. The test data consist of spectral data sets on an object "observed" in the infrared with spectral resolution. There were 4 different algorithms competing this time: BSMEM, the Bispectrum Maximum Entropy Method by Young, Baron & Buscher; RPR, the Recursive Phase Reconstruction by Rengaswamy; SQUEEZE, a Markov Chain Monte Carlo algorithm by Baron, Monnier & Kloppenborg; and WISARD, the Weak-phase Interferometric Sample Alternating Reconstruction Device by Vannier & Mugnier. The contest model image, the data delivered to the contestants, and the rules are described, as well as the results of the image reconstruction obtained by each method. These results are discussed, as well as the strengths and limitations of each algorithm. Comment: To be published in SPIE 2010, "Optical and infrared interferometry II"
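    The Markov Chain Monte Carlo style of imaging used by SQUEEZE can be caricatured in a few lines: represent the image as a cloud of discrete flux elements on a pixel grid, and at each step teleport one element to a random pixel, accepting the move with the Metropolis rule on the resulting chi-squared. Everything here is a toy: `forward` stands in for the real Fourier-domain interferometric forward model, the grid is one-dimensional, and the sizes are arbitrary.

```python
import numpy as np

def mcmc_image(data, forward, n_flux=50, n_pix=64, n_steps=20000, seed=0):
    """Toy Metropolis-Hastings imager: a cloud of n_flux unit flux
    elements random-walks over n_pix pixels, sampling images with
    probability ~ exp(-chi2/2) against the observed data."""
    rng = np.random.default_rng(seed)
    pos = rng.integers(0, n_pix, n_flux)            # pixel of each flux element
    img = np.bincount(pos, minlength=n_pix).astype(float)
    chi2 = float(np.sum((forward(img) - data) ** 2))
    for _ in range(n_steps):
        i = rng.integers(n_flux)                    # pick one flux element...
        new = rng.integers(n_pix)                   # ...and a proposed new pixel
        trial = img.copy()
        trial[pos[i]] -= 1.0
        trial[new] += 1.0
        trial_chi2 = float(np.sum((forward(trial) - data) ** 2))
        # Metropolis rule: always accept improvements, sometimes accept worsenings
        if rng.random() < np.exp(min(0.0, 0.5 * (chi2 - trial_chi2))):
            pos[i], img, chi2 = new, trial, trial_chi2
    return img

# Toy usage: identity "forward model", target image with two point sources.
target = np.zeros(64)
target[10], target[40] = 30.0, 20.0
img = mcmc_image(target, forward=lambda x: x)
```

    Note that total flux is conserved by construction, a convenient property of element-based samplers; regularization enters a real sampler as an extra term in the exponent, omitted here.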

    Quantum-inspired computational imaging

    Computational imaging combines measurement and computational methods with the aim of forming images even when the measurement conditions are weak, few in number, or highly indirect. The recent surge in quantum-inspired imaging sensors, together with a new wave of algorithms allowing on-chip, scalable, and robust data processing, has spurred notable activity in low-light-flux imaging and sensing. We provide an overview of the major challenges encountered in low-illumination (e.g., ultrafast) imaging and of how these problems have recently been addressed for imaging applications in extreme conditions. These methods provide examples of the future imaging solutions to be developed, for which the best results are expected to arise from an efficient codesign of the sensors and the data analysis tools. Y.A. acknowledges support from the UK Royal Academy of Engineering under the Research Fellowship Scheme (RF201617/16/31). S.McL. acknowledges financial support from the UK Engineering and Physical Sciences Research Council (grant EP/J015180/1). V.G. acknowledges support from the U.S. Defense Advanced Research Projects Agency (DARPA) InPho program through U.S. Army Research Office award W911NF-10-1-0404, the U.S. DARPA REVEAL program through contract HR0011-16-C-0030, and the U.S. National Science Foundation through grants 1161413 and 1422034. A.H. acknowledges support from U.S. Army Research Office award W911NF-15-1-0479, U.S. Department of the Air Force grant FA8650-15-D-1845, and U.S. Department of Energy National Nuclear Security Administration grant DE-NA0002534. D.F. acknowledges financial support from the UK Engineering and Physical Sciences Research Council (grants EP/M006514/1 and EP/M01326X/1).

    Joint-SRVDNet: Joint Super Resolution and Vehicle Detection Network

    In many domestic and military applications, aerial vehicle detection and super-resolution algorithms are frequently developed and applied independently. However, aerial vehicle detection on super-resolved images remains a challenging task due to the lack of discriminative information in the super-resolved images. To address this problem, we propose a Joint Super-Resolution and Vehicle Detection Network (Joint-SRVDNet) that tries to generate discriminative, high-resolution images of vehicles from low-resolution aerial images. First, aerial images are up-scaled by a factor of 4x using a Multi-scale Generative Adversarial Network (MsGAN), which has multiple intermediate outputs with increasing resolutions. Second, a detector is trained on super-resolved images that are upscaled by a factor of 4x using the MsGAN architecture, and finally, the detection loss is minimized jointly with the super-resolution loss to encourage the target detector to be sensitive to the subsequent super-resolution training. The network jointly learns hierarchical and discriminative features of targets and produces optimal super-resolution results. We perform both quantitative and qualitative evaluation of our proposed network on the VEDAI, xView, and DOTA datasets. The experimental results show that our proposed framework achieves better visual quality than the state-of-the-art methods for aerial super-resolution with a 4x up-scaling factor and improves the accuracy of aerial vehicle detection.
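    The shape of the joint objective, a super-resolution term and a detection term minimized together, can be sketched as a single scalar combination. This is a heavy simplification: the actual network uses an adversarial MsGAN loss and a full detector, and `det_loss` and `alpha` here are hypothetical placeholders.

```python
import numpy as np

def joint_loss(sr_image, hr_target, det_loss, alpha=0.001):
    """Joint objective sketch: a pixel-wise SR reconstruction term plus a
    weighted detection term, so minimizing the total pushes the SR output
    toward images the detector can use. det_loss is assumed to be the
    detector's loss evaluated on sr_image; alpha balances the terms."""
    sr_loss = np.mean((sr_image - hr_target) ** 2)
    return sr_loss + alpha * det_loss

# Toy values: the detection term raises the total when detection is poor.
sr = np.array([0.5, 0.2])
hr = np.array([0.4, 0.3])
total_no_det = joint_loss(sr, hr, det_loss=0.0)
total_with_det = joint_loss(sr, hr, det_loss=10.0)
```

    The key design choice, shared gradients flowing from the detection term back into the super-resolution generator, is what makes the learned SR features discriminative rather than merely photometrically accurate.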

    Super-resolving multiresolution images with band-independent geometry of multispectral pixels

    A new resolution enhancement method is presented for multispectral and multi-resolution images, such as those provided by the Sentinel-2 satellites. Starting from the highest-resolution bands, band-dependent information (reflectance) is separated from information that is common to all bands (the geometry of scene elements). This model is then applied to unmix the low-resolution bands, preserving their reflectance while propagating the band-independent information to preserve sub-pixel details. A reference implementation is provided, with an application example for super-resolving Sentinel-2 data. Comment: Source code with a ready-to-use script for super-resolving Sentinel-2 data is available at http://nicolas.brodu.net/recherche/superres
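    The core unmixing idea can be sketched as follows: each coarse pixel of the low-resolution band is split among its sub-pixels in proportion to a high-resolution guide band, so sub-pixel geometry comes from the guide while the coarse pixel's mean reflectance is preserved. This is only the skeleton of the idea under a simple proportional-weights assumption; the actual method's geometric model is more sophisticated.

```python
import numpy as np

def superresolve_band(low, guide, factor=2, eps=1e-9):
    """Distribute each low-resolution pixel's reflectance over its
    factor x factor sub-pixels using unit-mean weights taken from a
    high-resolution guide band. The mean of each output block equals
    the corresponding low-resolution pixel value (reflectance preserved)."""
    h, w = low.shape
    out = np.zeros((h * factor, w * factor))
    for i in range(h):
        for j in range(w):
            block = guide[i*factor:(i+1)*factor, j*factor:(j+1)*factor]
            weights = block / (block.mean() + eps)   # unit-mean sub-pixel weights
            out[i*factor:(i+1)*factor, j*factor:(j+1)*factor] = low[i, j] * weights
    return out

# One coarse pixel of value 2.0, split using a 2x2 high-resolution guide.
low = np.array([[2.0]])
guide = np.array([[1.0, 3.0], [2.0, 2.0]])
sr = superresolve_band(low, guide, factor=2)
```

    When the low-resolution value happens to equal the guide's block mean, as here, the output reproduces the guide's sub-pixel pattern exactly; in general it rescales that pattern to the low-resolution band's own reflectance.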