    Minimum entropy restoration using FPGAs and high-level techniques

    One of the greatest perceived barriers to the widespread use of FPGAs in image processing is the difficulty that application specialists face in developing algorithms on reconfigurable hardware. Minimum entropy deconvolution (MED) techniques have been shown to be effective in the restoration of star-field images. This paper reports on an attempt to implement a MED algorithm using simulated annealing, first on a microprocessor and then on an FPGA. The FPGA implementation uses DIME-C, a C-to-gates compiler, coupled with a low-level core library to simplify the design task. Analysis of the C code and of the output from the DIME-C compiler guided the code optimisation. The paper reports on the design effort that this entailed and on the resultant performance improvements.
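
    As a rough CPU-level sketch of the approach described above, the following Python loop anneals a star-field estimate under a cost that combines quadratic data fidelity with an entropy penalty favouring sparse, point-like solutions. The perturbation scheme, the weighting lam and the geometric cooling schedule are illustrative assumptions; this is not the paper's DIME-C or FPGA code.

```python
# Minimum entropy restoration by simulated annealing: a minimal sketch.
# Assumed cost: ||estimate (*) psf - observed||^2 + lam * entropy(estimate).
import numpy as np
from scipy.signal import fftconvolve

def cost(estimate, observed, psf, lam=0.1, eps=1e-12):
    resid = fftconvolve(estimate, psf, mode="same") - observed
    p = estimate.ravel() / (estimate.sum() + eps)
    p = p[p > eps]
    return np.sum(resid ** 2) - lam * np.sum(p * np.log(p))

def anneal_restore(observed, psf, n_steps=20000, t0=1.0, cooling=0.9995, seed=0):
    rng = np.random.default_rng(seed)
    img = observed.copy()
    c, temp = cost(img, observed, psf), t0
    for _ in range(n_steps):
        y, x = rng.integers(img.shape[0]), rng.integers(img.shape[1])
        old = img[y, x]
        img[y, x] = max(0.0, old + rng.normal(scale=0.05 * observed.max()))
        c_new = cost(img, observed, psf)
        # Metropolis rule: accept improvements; occasionally accept worse moves
        if c_new <= c or rng.random() < np.exp((c - c_new) / temp):
            c = c_new
        else:
            img[y, x] = old                  # reject the move
        temp *= cooling
    return img
```

    Recomputing the full convolution at every step is wasteful; a practical implementation would update the residual incrementally after each single-pixel move, which is also the kind of regular data path that maps naturally onto FPGA logic.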

    Restoration of star-field images using high-level languages and core libraries

    Research into the use of FPGAs in image processing began in earnest at the beginning of the 1990s. Since then, many thousands of publications have pointed to the computational capabilities of FPGAs. During this time, the application space to which FPGAs are suited has grown in tandem with their logic densities. When investigating a particular application, researchers compare FPGAs with alternative technologies such as Digital Signal Processors (DSPs), Application-Specific Integrated Circuits (ASICs), microprocessors and vector processors. The metrics for comparison depend on the needs of the application, and include such measurements as raw performance, power consumption, unit cost, board footprint, non-recurring engineering cost, design time and design cost. The key metrics for a particular application may also include ratios of these metrics, e.g. power/performance or performance/unit cost. The work detailed in this paper compares a 90nm-process commodity microprocessor with a platform based around a 90nm-process FPGA, focussing on design time and raw performance. The application chosen for implementation was a minimum entropy restoration of star-field images (see [1] for an introduction), with simulated annealing used to converge towards the globally-optimum solution. This application was not chosen in the belief that it would particularly suit one technology over another, but was instead selected as being representative of a computationally intense image-processing application.

    Deconvolution with correct sampling

    A new method for improving the resolution of astronomical images is presented. It is based on the principle that sampled data cannot be fully deconvolved without violating the sampling theorem. Thus, the sampled image should not be deconvolved by the total Point Spread Function, but by a narrower function chosen so that the resolution of the deconvolved image is compatible with the adopted sampling. Our deconvolution method gives results which are, in at least some cases, superior to those of other commonly used techniques: in particular, it does not produce ringing around point sources superimposed on a smooth background. Moreover, it allows accurate astrometry and photometry of crowded fields. These improvements are a consequence of both the correct treatment of sampling and the recognition that the most probable astronomical image is not a flat one. The method is also well adapted to the optimal combination of different images of the same object, as can be obtained, e.g., from infrared observations or via adaptive optics techniques. Comment: 22 pages, LaTeX file + 10 color JPEG and PostScript figures. To be published in ApJ, Vol. 484 (1997 Feb.)
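
    The principle lends itself to a compact sketch: given the total PSF t and a narrower target PSF r that the sampling can represent, solve t = r (*) s for a partial kernel s and deconvolve by s alone, so the restored image retains PSF r. The plain regularised inverse filter below is an illustrative stand-in; the paper's actual algorithm performs a constrained fit rather than direct Fourier division.

```python
# Deconvolution by a narrower-than-total PSF: a minimal Fourier-space sketch.
# Assumes image, t and r share one shape and kernels are origin-centred
# (apply np.fft.ifftshift first to kernels centred in the array).
import numpy as np

def partial_kernel(t, r, eps=1e-8):
    """Solve t = r (*) s for s, i.e. S = T / R, with mild regularisation."""
    T, R = np.fft.fft2(t), np.fft.fft2(r)
    return np.real(np.fft.ifft2(T * np.conj(R) / (np.abs(R) ** 2 + eps)))

def deconvolve_correctly_sampled(image, t, r, eps=1e-3):
    """Inverse-filter by s alone, so the result still carries the target PSF r."""
    s = partial_kernel(t, r)
    I, S = np.fft.fft2(image), np.fft.fft2(s)
    return np.real(np.fft.ifft2(I * np.conj(S) / (np.abs(S) ** 2 + eps)))
```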

    Weak Lensing Mass Reconstruction using Wavelets

    This paper presents a new method for the reconstruction of weak lensing mass maps. It uses the multiscale entropy concept, which is based on wavelets, and the False Discovery Rate, which allows us to derive robust detection levels in wavelet space. We show that this new restoration approach outperforms several standard techniques currently used for weak shear mass reconstruction. This method can also be used to separate E and B modes in the shear field, and thus to test for the presence of residual systematic effects. We concentrate on large blind cosmic shear surveys, and illustrate our results using simulated shear maps derived from N-body Lambda-CDM simulations, with added noise corresponding to both ground-based and space-based observations. Comment: The accepted manuscript with all figures can be downloaded at http://jstarck.free.fr/aa_wlens05.pdf and the software at http://jstarck.free.fr/mrlens.htm
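
    To make the False Discovery Rate ingredient concrete, the sketch below hard-thresholds the wavelet coefficients of a noisy convergence map at a Benjamini-Hochberg level. The released MRLens software implements the full multiscale-entropy restoration; this standalone FDR denoiser is only an approximation of that pipeline, and the wavelet, decomposition level and alpha choices are assumptions.

```python
# FDR-controlled wavelet thresholding: a simplified stand-in for the method.
# For an orthonormal wavelet, white image noise of std sigma keeps std sigma
# in every coefficient band, so one threshold routine serves all scales.
import numpy as np
import pywt
from scipy.stats import norm

def fdr_threshold(coeff, sigma, alpha=0.05):
    """Benjamini-Hochberg: largest rank k with p_(k) <= alpha * k / n."""
    p_sorted = np.sort(2.0 * norm.sf(np.abs(coeff.ravel()) / sigma))
    n = p_sorted.size
    below = np.nonzero(p_sorted <= alpha * np.arange(1, n + 1) / n)[0]
    if below.size == 0:
        return np.inf                        # nothing significant at this scale
    return sigma * norm.isf(p_sorted[below[-1]] / 2.0)

def fdr_denoise(kappa_map, sigma, wavelet="db4", level=4, alpha=0.05):
    coeffs = pywt.wavedec2(kappa_map, wavelet, level=level)
    out = [coeffs[0]]                        # keep the coarse scale untouched
    for detail in coeffs[1:]:                # (horizontal, vertical, diagonal)
        thr = [fdr_threshold(c, sigma, alpha) for c in detail]
        out.append(tuple(np.where(np.abs(c) >= t, c, 0.0)
                         for c, t in zip(detail, thr)))
    return pywt.waverec2(out, wavelet)
```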

    Why Chromatic Imaging Matters

    During the last two decades, the first generation of beam combiners at the Very Large Telescope Interferometer has proved the importance of optical interferometry for high-angular-resolution astrophysical studies in the near- and mid-infrared. With the advent of 4-beam combiners at the VLTI, the u-v coverage per pointing increases significantly, providing an opportunity to use reconstructed images as powerful scientific tools. Interferometric imaging is therefore already a key feature of the new generation of VLTI instruments, as well as of other interferometric facilities like CHARA and JWST. It is thus imperative to take stock of current image reconstruction capabilities and their expected evolution in the coming years. Here, we present a general overview of the current situation of optical interferometric image reconstruction, with a focus on new wavelength-dependent information, highlighting its main advantages and limitations. As an Appendix we include several cookbooks describing the usage and installation of several state-of-the-art image reconstruction packages. To illustrate the current capabilities of the software available to the community, we recovered chromatic images from simulated MATISSE data using the MCMC software SQUEEZE. With these images, we aim to show the importance of selecting good regularization functions and their impact on the reconstruction. Comment: Accepted for publication in Experimental Astronomy as part of the topical collection: Future of Optical-infrared Interferometry in Europe
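
    To make the closing point concrete: reconstruction codes of this kind minimise (or, for MCMC codes like SQUEEZE, sample) a data term plus weighted regularisation, and the choice of regulariser and its weight mu shapes the recovered image. The total-variation penalty below is just one common choice shown for illustration, not SQUEEZE's actual regulariser set, and the linear forward map is an assumed stand-in for the interferometric measurement model.

```python
# Generic regularised imaging objective: chi^2 against observed visibilities
# plus mu times an (assumed) total-variation penalty on the image.
import numpy as np

def total_variation(img):
    return (np.sum(np.abs(np.diff(img, axis=0))) +
            np.sum(np.abs(np.diff(img, axis=1))))

def objective(img, fwd, vis_obs, vis_err, mu):
    """fwd: complex matrix mapping flattened image pixels to model visibilities."""
    resid = (fwd @ img.ravel() - vis_obs) / vis_err
    return np.sum(np.abs(resid) ** 2) + mu * total_variation(img)
```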

    Morphological analysis of the cm-wave continuum in the dark cloud LDN1622

    The spectral energy distribution of the dark cloud LDN1622, as measured by Finkbeiner using WMAP data, drops above 30 GHz and is suggestive of a Boltzmann cutoff in grain rotation frequencies, characteristic of spinning dust emission. LDN1622 is conspicuous in the 31 GHz image we obtained with the Cosmic Background Imager, which is the first cm-wave resolved image of a dark cloud. The 31 GHz emission follows the emission traced by the four IRAS bands. The normalised cross-correlation of the 31 GHz image with the IRAS images is higher by 6.6σ for the 12 µm and 25 µm bands than for the 60 µm and 100 µm bands: C(12+25) = 0.76 +/- 0.02 and C(60+100) = 0.64 +/- 0.01. The mid-IR to cm-wave correlation in LDN1622 is evidence for very small grain (VSG) or continuum emission at 26-36 GHz from a hot molecular phase. In dark clouds and their photon-dominated regions (PDRs), the 12 µm and 25 µm emission is attributed to stochastic heating of the VSGs. The mid-IR and cm-wave dust emissions arise in a limb-brightened shell coincident with the PDR of LDN1622, where the incident UV radiation from the Ori OB1b association heats and charges the grains, as required for spinning dust. Comment: accepted for publication in ApJ - the complete article with uncompressed figures may be downloaded from http://www.das.uchile.cl/~simon/ftp/l1622.pd
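
    The correlation statistic quoted above is, in its standard zero-lag Pearson form, straightforward to compute; the paper's exact estimator and its error treatment may differ from this textbook version.

```python
# Normalised cross-correlation of two equal-sized maps (e.g. the 31 GHz image
# against an IRAS band), in the standard zero-lag Pearson form.
import numpy as np

def normalised_cross_correlation(a, b):
    a = a - a.mean()
    b = b - b.mean()
    return np.sum(a * b) / np.sqrt(np.sum(a * a) * np.sum(b * b))
```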

    Atmospheric PSF Interpolation for Weak Lensing in Short Exposure Imaging Data

    A main science goal for the Large Synoptic Survey Telescope (LSST) is to measure the cosmic shear signal from weak lensing to extreme accuracy. One difficulty, however, is that with the short exposure time (≃15 seconds) proposed, the spatial variation of the Point Spread Function (PSF) shapes may be dominated by the atmosphere, in addition to optics errors. While optics errors mainly cause the PSF to vary on angular scales similar to or larger than a single CCD sensor, the atmosphere generates stochastic structures on a wide range of angular scales. It thus becomes a challenge to infer the multi-scale, complex atmospheric PSF patterns by interpolating the sparsely sampled stars in the field. In this paper we present a new method, PSFent, for interpolating the PSF shape parameters, based on reconstructing underlying shape-parameter maps with a multi-scale maximum entropy algorithm. We demonstrate, using images from the LSST Photon Simulator, the performance of our approach relative to a 5th-order polynomial fit (representing the current standard) and a simple boxcar smoothing technique. Quantitatively, PSFent predicts more accurate PSF models in all scenarios, and the residual PSF errors are spatially less correlated. This improvement in PSF interpolation leads to a factor of 3.5 lower systematic errors in the shear power spectrum on scales smaller than ∼13′, compared to polynomial fitting. We estimate that with PSFent, and for stellar densities greater than ≃1/arcmin², the spurious shear correlation from PSF interpolation, after combining a complete 10-year dataset from LSST, is lower than the corresponding statistical uncertainties on the cosmic shear power spectrum, even under a conservative scenario. Comment: 18 pages, 12 figures, accepted by MNRAS
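
    For context, the polynomial baseline that PSFent is compared against can be written as an ordinary least-squares fit of each PSF shape parameter (e.g. an ellipticity component) over field coordinates, evaluated afterwards at galaxy positions. The function names and setup below are illustrative, not the LSST pipeline's actual code.

```python
# 5th-order 2-D polynomial interpolation of a PSF shape parameter: the
# "current standard" baseline, sketched with plain least squares.
import numpy as np

def design_matrix(x, y, order=5):
    """All monomials x**i * y**j with i + j <= order (21 terms for order 5)."""
    cols = [x ** i * y ** j for i in range(order + 1)
                            for j in range(order + 1 - i)]
    return np.column_stack(cols)

def fit_psf_poly(star_x, star_y, shape_param, order=5):
    coef, *_ = np.linalg.lstsq(design_matrix(star_x, star_y, order),
                               shape_param, rcond=None)
    return coef

def eval_psf_poly(coef, x, y, order=5):
    return design_matrix(x, y, order) @ coef
```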

    Improvement of Spatial Resolution with Staggered Arrays as Used in the Airborne Optical Sensor ADS40

    Using pushbroom sensors onboard aircraft or satellites requires, especially for photogrammetric applications, wide image swaths with a high geometric resolution. One approach to satisfying both demands is to use staggered line arrays, which are constructed from two identical CCD lines shifted against each other by half a pixel in the line direction. Practical applications of such arrays in remote sensing include SPOT and, in the commercial environment, the Airborne Digital Sensor ADS40 from Leica Geosystems. Theoretically, the usefulness of staggered arrays depends on spatial resolution, which is defined by the total point spread function of the imaging system and Shannon's sampling theorem. Owing to the two shifted sensor lines, staggering yields twice the number of sampling points perpendicular to the flight direction. In order to simultaneously double the number of samples in the flight direction, the line readout rate, or integration time, has to produce half a pixel spacing on the ground. Staggering in combination with a high-resolution optical system can be used to fulfil the sampling condition, which means that no spectral components above the critical spatial frequency 2/D are present. Theoretically, the resolution is then as good as that of a non-staggered line with half the pixel size D/2, but the radiometric dynamics should be twice as high. In practice, the slightly different viewing angles of the two lines of a staggered array can result in a deterioration of image quality due to aircraft motion, attitude fluctuations or terrain undulation. Fulfilling the sampling condition further means that no aliasing occurs. This is essential for image quality in quasi-periodically textured image areas and for photogrammetric sub-pixel accuracy. Furthermore, image restoration methods for enhancing the image quality can be applied more efficiently. The panchromatic resolution of the ADS40 optics is optimised for image collection by a staggered array; that is, it transfers spatial frequencies up to twice the Nyquist frequency of its 12k sensors. First experiments, carried out some years ago, already indicated a spatial-resolution improvement from applying image restitution to the ADS40 staggered 12k pairs. The results of the restitution algorithm, which is integrated in the ADS image-processing flow, have now been analysed quantitatively. This paper presents the theory of high-resolution image restitution from staggered lines, together with practical results for ADS40 high-resolution panchromatic images and for high-resolution colour images created by sharpening 12k colour images with high-resolution panchromatic ones.
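
    A toy numeric illustration of the staggering argument, assuming idealised point sampling (the pixel-aperture and optics MTFs are ignored): two identical lines at pitch D, offset by D/2, interleave into a single stream sampled at D/2, which doubles the Nyquist frequency.

```python
# Interleaving two half-pixel-staggered sample streams doubles the sampling
# rate across track; the numbers and test signal are made up for illustration.
import numpy as np

D = 1.0                                  # pixel pitch of one CCD line
x_a = np.arange(0.0, 64.0, D)            # samples from line A
x_b = x_a + D / 2.0                      # line B, staggered by half a pixel
f = 0.7                                  # cycles/pixel: above the single-line
                                         # Nyquist 0.5/D, below the staggered 1/D
signal = lambda x: np.sin(2.0 * np.pi * f * x)

x = np.empty(2 * x_a.size)               # merge into one stream at pitch D/2
x[0::2], x[1::2] = x_a, x_b
samples = signal(x)
# A single line at pitch D would alias f = 0.7 down to 0.3 cycles/pixel;
# the interleaved stream at pitch D/2 represents it without aliasing.
```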