36 research outputs found

    Method and system for enabling real-time speckle processing using hardware platforms

    An accelerator for the speckle atmospheric compensation algorithm may enable real-time speckle processing of video feeds, allowing the speckle algorithm to be applied in numerous real-time applications. The accelerator may be implemented in various forms, including hardware, software, and/or machine-readable media.

    Robustness of speckle imaging techniques applied to horizontal imaging scenarios

    Atmospheric turbulence near the ground severely limits the quality of imagery acquired over long horizontal paths. In defense, surveillance, and border security applications, there is interest in deploying man-portable, embedded systems incorporating image reconstruction methods to compensate for turbulence effects. While many image reconstruction methods have been proposed, their suitability for use in man-portable embedded systems is uncertain. To be effective, these systems must operate over significant variations in turbulence conditions while subject to other variations due to operation by novice users. Systems that meet these requirements and are otherwise designed to be immune to the factors that cause variation in performance are considered robust. In addition to robustness in design, the portable nature of these systems implies a preference for systems with a minimum level of computational complexity. Speckle imaging methods have recently been proposed as being well suited for use in man-portable horizontal imagers. In this work, the robustness of speckle imaging methods is established by identifying a subset of design parameters that provide immunity to the expected variations in operating conditions while minimizing the computation time necessary for image recovery. Design parameters are selected by parametric evaluation of system performance as factors external to the system are varied. The precise control necessary for such an evaluation is made possible using image sets of turbulence degraded imagery developed using a novel technique for simulating anisoplanatic image formation over long horizontal paths. System performance is statistically evaluated over multiple reconstructions using the Mean Squared Error (MSE) to assess reconstruction quality. In addition to more general design parameters, the relative performance of the bispectrum and the Knox-Thompson phase recovery methods is also compared. As an outcome of this work it can be concluded that speckle imaging techniques are robust to the variation in turbulence conditions and user-controlled parameters expected when operating during the day over long horizontal paths. Speckle imaging systems that incorporate 15 or more image frames and 4 estimates of the object phase per reconstruction provide up to a 45% reduction in MSE and a 68% reduction in its deviation. In addition, the Knox-Thompson phase recovery method is shown to produce images in half the time required by the bispectrum. The quality of images reconstructed using the Knox-Thompson and bispectrum methods is also found to be nearly identical. Finally, it is shown that certain blind image quality metrics can be used in place of the MSE to evaluate quality in field scenarios. Using blind metrics rather than depending on user estimates allows for reconstruction quality that differs from the minimum MSE by as little as 1%, significantly reducing the deviation in performance due to user action.
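
    As a rough illustration of the evaluation protocol described above, the sketch below scores a set of reconstructions against a known reference object using the MSE and summarizes the mean and deviation across trials. The array shapes, names, and synthetic data are assumptions for illustration only, not the dissertation's actual pipeline.

```python
import numpy as np

def mse(reference, reconstruction):
    """Mean squared error between a reference object and one reconstruction."""
    ref = reference.astype(np.float64)
    rec = reconstruction.astype(np.float64)
    return np.mean((ref - rec) ** 2)

def summarize_quality(reference, reconstructions):
    """Return the mean MSE and its standard deviation over multiple reconstructions."""
    scores = np.array([mse(reference, r) for r in reconstructions])
    return scores.mean(), scores.std()

# Example: 20 noisy stand-ins for reconstructions of a 128x128 object.
rng = np.random.default_rng(0)
truth = rng.random((128, 128))
recons = [truth + 0.05 * rng.standard_normal(truth.shape) for _ in range(20)]
print(summarize_quality(truth, recons))
```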

    Application of MFBD Algorithms to Image Reconstruction Under Anisoplanatic Conditions

    All optical systems that operate in or through the atmosphere suffer from turbulence-induced image blur. Both military and civilian surveillance, gun-sighting, and target identification systems are interested in terrestrial imaging over very long horizontal paths, but atmospheric turbulence can blur the resulting images beyond usefulness. My dissertation explores the performance of a multi-frame blind deconvolution technique applied under anisoplanatic conditions for both Gaussian and Poisson noise model assumptions. The technique is evaluated for use in reconstructing images of scenes corrupted by turbulence in long horizontal-path imaging scenarios and compared to other speckle imaging techniques. Performance is evaluated via the reconstruction of a common object from three sets of simulated turbulence degraded imagery representing low, moderate, and severe turbulence conditions. Each set consisted of 1000 simulated, turbulence degraded images. The MSE performance of the estimator is evaluated as a function of the number of images and the number of Zernike polynomial terms used to characterize the point spread function. I compare the mean-square-error (MSE) performance of speckle imaging methods and a maximum-likelihood, multi-frame blind deconvolution (MFBD) method applied to long-path horizontal imaging scenarios. Both methods are used to reconstruct a scene from simulated imagery featuring anisoplanatic turbulence-induced aberrations. This comparison is performed over three sets of 1000 simulated images each for low, moderate, and severe turbulence-induced image degradation. The comparison shows that speckle imaging techniques reduce the MSE by 46 percent, 42 percent, and 47 percent on average for the low, moderate, and severe cases, respectively, using 15 input frames under daytime conditions and moderate frame rates. Similarly, the MFBD method provides 40 percent, 29 percent, and 36 percent improvements in MSE on average under the same conditions. The comparison is repeated under low-light conditions (less than 100 photons per pixel), where improvements of 39 percent, 29 percent, and 27 percent are obtained using speckle imaging methods with 25 input frames, and 38 percent, 34 percent, and 33 percent, respectively, for the MFBD method with 150 input frames. The MFBD estimator is applied to three sets of field data and the results are presented. Finally, a combined Bispectrum-MFBD Hybrid estimator is proposed and investigated. This technique consistently provides a lower MSE and smaller variance in the estimate under all three simulated turbulence conditions.
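
    For readers unfamiliar with maximum-likelihood MFBD, the following is a generic statement of the two noise-model objectives mentioned in the abstract, with each frame's PSF parameterized by Zernike coefficients over the pupil. The notation is assumed for illustration and may differ from the dissertation's exact formulation.

```latex
% Generic maximum-likelihood MFBD objectives (illustrative notation).
% d_k: observed frame k,  o: object,  h_k: PSF of frame k,  *: convolution,
% a_{kj}: Zernike coefficients of frame k over the pupil function P.
\[
  h_k(\mathbf{x}) \;=\;
  \Bigl|\,\mathcal{F}^{-1}\Bigl\{P(\mathbf{u})\,
      e^{\,i\sum_j a_{kj} Z_j(\mathbf{u})}\Bigr\}(\mathbf{x})\Bigr|^2
\]
\[
  J_{\mathrm{Gauss}}(o,a) \;=\; \sum_k \bigl\lVert d_k - h_k * o \bigr\rVert_2^2,
  \qquad
  J_{\mathrm{Poisson}}(o,a) \;=\; \sum_k \sum_{\mathbf{x}}
      \Bigl[(h_k * o)(\mathbf{x}) - d_k(\mathbf{x})\,\ln (h_k * o)(\mathbf{x})\Bigr]
\]
```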

    Restoration of Atmospheric Turbulence Degraded Video using Kurtosis Minimization and Motion Compensation

    In this thesis work, the background of atmospheric turbulence degradation in imaging was reviewed and two aspects are highlighted: blurring and geometric distortion. The turbulence blurring parameter is determined by the atmospheric turbulence condition, which is often unknown; therefore, a blur identification technique was developed that is based on higher-order statistics (HOS). It was observed that the kurtosis generally increases as an image becomes blurred (smoothed). Such an observation was interpreted in the frequency domain in terms of phase correlation. Kurtosis-minimization-based blur identification is built upon this observation. It was shown that kurtosis minimization is effective in identifying the blurring parameter directly from the degraded image. Kurtosis minimization is a general method for blur identification. It has been tested on a variety of blurs such as Gaussian blur, out-of-focus blur, as well as motion blur. To compensate for the geometric distortion, earlier work on turbulent motion compensation was extended to deal with situations in which there is camera/object motion. Trajectory smoothing is used to suppress the turbulent motion while preserving the real motion. Though the scintillation effect of atmospheric turbulence is not considered separately, it can be handled the same way as multiple-frame denoising while motion trajectories are built.
    Ph.D. Committee Chair: Mersereau, Russell; Committee Co-Chair: Smith, Mark; Committee Member: Lanterman, Aaron; Committee Member: Wang, May; Committee Member: Tannenbaum, Allen; Committee Member: Williams, Douglas
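
    A minimal sketch of the kurtosis-minimization idea follows, assuming a Gaussian blur family and a simple Wiener restorer with a flat noise-to-signal ratio; the function names and constants are illustrative and not taken from the thesis. Each candidate blur parameter is used to restore the degraded image, and the candidate whose restoration has the smallest kurtosis is selected.

```python
import numpy as np
from scipy.stats import kurtosis

def gaussian_otf(shape, sigma):
    """Frequency response of a Gaussian blur with standard deviation sigma (pixels)."""
    fy = np.fft.fftfreq(shape[0])[:, None]
    fx = np.fft.fftfreq(shape[1])[None, :]
    return np.exp(-2.0 * (np.pi ** 2) * (sigma ** 2) * (fx ** 2 + fy ** 2))

def wiener_restore(blurred, sigma, nsr=1e-2):
    """Wiener deconvolution assuming a Gaussian PSF and a flat noise-to-signal ratio."""
    H = gaussian_otf(blurred.shape, sigma)
    G = np.fft.fft2(blurred)
    F_hat = np.conj(H) * G / (np.abs(H) ** 2 + nsr)
    return np.real(np.fft.ifft2(F_hat))

def identify_blur(blurred, candidate_sigmas):
    """Pick the candidate blur parameter whose restoration has minimum kurtosis."""
    scores = [kurtosis(wiener_restore(blurred, s).ravel()) for s in candidate_sigmas]
    return candidate_sigmas[int(np.argmin(scores))]
```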

    Astronomical Optical Interferometry. I. Methods and Instrumentation

    The previous decade has seen the achievement of large interferometric projects including 8-10m telescopes and 100m class baselines. Modern computer and control technology has enabled the interferometric combination of light from separate telescopes also in the visible and infrared regimes. Imaging with milli-arcsecond (mas) resolution and astrometry with micro-arcsecond (μas) precision have thus become reality. Here, I review the methods and instrumentation corresponding to the current state of the field of astronomical optical interferometry. First, this review summarizes the development from the pioneering works of Fizeau and Michelson. Next, the fundamental observables are described, followed by the discussion of the basic design principles of modern interferometers. The basic interferometric techniques such as speckle and aperture masking interferometry, aperture synthesis and nulling interferometry are discussed as well. Using the experience of past and existing facilities to illustrate important points, I consider particularly the new generation of large interferometers that has been recently commissioned (most notably, the CHARA, Keck, VLT and LBT Interferometers). Finally, I discuss the longer-term future of optical interferometry, including the possibilities of new large-scale ground-based projects and prospects for space interferometry.
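
    For context on the "fundamental observables" mentioned above, the standard textbook relations (not specific to this review) are the complex fringe visibility given by the van Cittert-Zernike theorem and the angular resolution set by the baseline:

```latex
% Van Cittert-Zernike theorem: normalized complex visibility at spatial frequency
% (u,v) = B_perp / lambda, for a source brightness distribution I(l,m):
\[
  V(u,v) \;=\; \frac{\iint I(l,m)\, e^{-2\pi i (ul + vm)}\, dl\, dm}
                    {\iint I(l,m)\, dl\, dm}
\]
% Angular resolution of a baseline B at wavelength lambda (order of magnitude):
\[
  \theta_{\min} \;\sim\; \frac{\lambda}{2B}
\]
```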

    Differential Tilt Variance Effects of Turbulence in Imagery: Comparing Simulation with Theory

    Differential tilt variance is a useful metric for interpreting the distorting effects of turbulence in incoherent imaging systems. In this paper, we compare the theoretical model of differential tilt variance to simulations. The simulation is based on a Monte Carlo wave optics approach with split-step propagation. Results show that the simulation closely matches theory. The results also show that care must be taken when selecting a method to estimate tilts.
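
    As a bare-bones sketch of the split-step (phase-screen) propagation underlying such Monte Carlo wave-optics simulations, the code below applies thin phase screens interleaved with Fresnel free-space steps. The grid parameters, screen statistics, and sign convention are illustrative assumptions; the paper's simulations would use properly sampled Kolmogorov screens.

```python
import numpy as np

def fresnel_transfer(n, dx, wavelength, dz):
    """Angular-spectrum (Fresnel) transfer function for one free-space step of length dz."""
    f = np.fft.fftfreq(n, d=dx)
    fx, fy = np.meshgrid(f, f)
    return np.exp(-1j * np.pi * wavelength * dz * (fx ** 2 + fy ** 2))

def split_step_propagate(field, phase_screens, dx, wavelength, dz):
    """Propagate a complex field through thin phase screens separated by Fresnel steps
    (constant-grid split-step sketch)."""
    H = fresnel_transfer(field.shape[0], dx, wavelength, dz)
    for screen in phase_screens:
        field = field * np.exp(1j * screen)              # apply thin phase screen
        field = np.fft.ifft2(np.fft.fft2(field) * H)     # free-space step of length dz
    return field

# Example: a plane wave through 10 random (non-Kolmogorov, illustrative) screens.
n, dx, wl, dz = 256, 5e-3, 1e-6, 500.0
rng = np.random.default_rng(1)
screens = [0.5 * rng.standard_normal((n, n)) for _ in range(10)]
out = split_step_propagate(np.ones((n, n), complex), screens, dx, wl, dz)
```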

    High-resolution observations of the solar photosphere and chromosphere

    Observations of the Sun are almost always impaired by the turbulent motion of air in Earth's atmosphere. Without additional tools, this turbulence would limit the effective resolution of modern large telescopes to that of amateur telescopes. Today, however, high-resolution data of the Sun are necessary to investigate its small-scale structure. This structure is likely connected to the radially outward increasing temperature distribution of the solar atmosphere. An introduction to further details of this topic, which has also been the motivation for this work, is presented in Chapt. 1. A theory of atmospheric turbulence that forms the basis for several results of this work is described in Chapt. 2. Two modern tools for enhancing the resolution of ground-based observations are then reviewed: adaptive optics (AO) systems on the one hand and speckle interferometry on the other. Until recently, these two techniques were only used separately. In Chapt. 3 the necessary modifications to analytical models of transfer functions are developed that include the changes made by an AO system to the incoming wave front, thus making a combination of AO systems and speckle interferometry possible.
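
    To put the opening statement in quantitative terms (standard relations, not taken from the thesis itself): diffraction limits a telescope of aperture D to roughly 1.22 λ/D, while uncompensated turbulence limits long-exposure resolution to roughly λ/r0, with the Fried parameter r0 typically of order 10 cm at visible wavelengths, comparable to an amateur telescope's aperture.

```latex
\[
  \theta_{\mathrm{diff}} \;\approx\; 1.22\,\frac{\lambda}{D},
  \qquad
  \theta_{\mathrm{seeing}} \;\approx\; \frac{\lambda}{r_0},
  \qquad
  r_0 \sim 10\ \mathrm{cm}\ \text{(visible wavelengths, good site)}
\]
```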

    Inverse problems in astronomical and general imaging

    The resolution and the quality of an imaged object are limited by four contributing factors. Firstly, the primary resolution limit of a system is imposed by the aperture of an instrument due to the effects of diffraction. Secondly, the finite sampling frequency, the finite measurement time and the mechanical limitations of the equipment also affect the resolution of the images captured. Thirdly, the images are corrupted by noise, a process inherent to all imaging systems. Finally, a turbulent imaging medium introduces random degradations to the signals before they are measured. In astronomical imaging, it is the atmosphere which distorts the wavefronts of the objects, severely limiting the resolution of the images captured by ground-based telescopes. These four factors affect all real imaging systems to varying degrees. All the limitations imposed on an imaging system result in the need to deduce or reconstruct the underlying object distribution from the distorted measured data. This class of problems is called inverse problems. The key to the success of solving an inverse problem is the correct modelling of the physical processes which give rise to the corresponding forward problem. However, the physical processes contain an infinite amount of information, while only a finite number of parameters can be used in the model. Information loss is therefore inevitable. As a result, the solution to many inverse problems requires additional information or prior knowledge. The application of prior information to inverse problems is a recurrent theme throughout this thesis.

    An inverse problem that has been an active research area for many years is interpolation, and there exist numerous techniques for solving this problem. However, many of these techniques neither account for the sampling process of the instrument nor include prior information in the reconstruction. These factors are taken into account in the proposed optimal Bayesian interpolator. The process of interpolation is also examined from the point of view of superresolution, as these processes can be viewed as being complementary.

    Since the principal effect of atmospheric turbulence on an incoming wavefront is a phase distortion, most of the inverse problem techniques devised for this seek to either estimate or compensate for this phase component. These techniques are classified into computer post-processing methods, adaptive optics (AO) and hybrid techniques. Blind deconvolution is a post-processing technique which uses the speckle images to estimate both the object distribution and the point spread function (PSF), the latter of which is directly related to the phase. The most successful approaches are based on characterising the PSF as the aberrations over the aperture. Since the PSF is also dependent on the atmosphere, it is possible to constrain the solution using the statistics of the atmosphere. An investigation shows the feasibility of this approach. The bispectrum is also a post-processing method which reconstructs the spectrum of the object. The key component for phase preservation is the property of phase closure, and its application as prior information for blind deconvolution is examined.

    Blind deconvolution techniques utilise only information in the image channel to estimate the phase, which is difficult. An alternative method for phase estimation is from a Shack-Hartmann (SH) wavefront sensing channel. However, since phase information is present in both the wavefront sensing and the image channels simultaneously, both of these approaches suffer from the problem that phase information from only one channel is used. An improved estimate of the phase is achieved by a combination of these methods, ensuring that the phase estimation is made jointly from the data in both the image and the wavefront sensing measurements. This formulation, posed as a blind deconvolution framework, is investigated in this thesis. An additional advantage of this approach is that since speckle images are imaged in a narrowband, while wavefront sensing images are captured by a charge-coupled device (CCD) camera at all wavelengths, the splitting of the light does not compromise the light level for either channel. This provides a further incentive for using simultaneous data sets.

    The effectiveness of using Shack-Hartmann wavefront sensing data for phase estimation relies on the accuracy of locating the data spots. The commonly used method, which calculates the centre of gravity of the image, is in fact prone to noise and is suboptimal. An improved method for spot location based on blind deconvolution is demonstrated.

    Ground-based adaptive optics (AO) technologies aim to correct for atmospheric turbulence in real time. Although much success has been achieved, the space- and time-varying nature of the atmosphere renders the accurate measurement of atmospheric properties difficult. It is therefore usual to perform additional post-processing on the AO data. As a result, some of the techniques developed in this thesis are applicable to adaptive optics. One of the methods which utilise elements of both adaptive optics and post-processing is the hybrid technique of deconvolution from wavefront sensing (DWFS). Here, both the speckle images and the SH wavefront sensing data are used. The original proposal of DWFS is simple to implement but suffers from the problem that the magnitude of the object spectrum cannot be reconstructed accurately. The solution proposed for overcoming this is to use an additional set of reference star measurements. This, however, does not completely remove the original problem; in addition, it introduces other difficulties associated with reference star measurements such as anisoplanatism and reduction of valuable observing time. In this thesis a parameterised solution is examined which removes the need for a reference star, as well as offering the potential to overcome the problem of estimating the magnitude of the object.
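
    For reference, the phase-closure property mentioned above can be stated compactly using standard speckle-interferometry relations (the notation is generic, not the thesis's): the bispectrum of an image spectrum I accumulates object phase while a translation-induced linear phase cancels, allowing recursive recovery of the object phase.

```latex
% Bispectrum of a single frame and its phase (closure phase):
\[
  B(\mathbf{u}_1,\mathbf{u}_2) \;=\; I(\mathbf{u}_1)\,I(\mathbf{u}_2)\,
      I^{*}(\mathbf{u}_1+\mathbf{u}_2),
  \qquad
  \beta(\mathbf{u}_1,\mathbf{u}_2) \;=\; \phi(\mathbf{u}_1) + \phi(\mathbf{u}_2)
      - \phi(\mathbf{u}_1+\mathbf{u}_2)
\]
% A shift of the image by x_0 adds a linear phase -2\pi\,\mathbf{u}\cdot\mathbf{x}_0
% that cancels in beta, so the averaged bispectrum phase preserves the object phase,
% which can then be unwrapped recursively:
\[
  \phi(\mathbf{u}_1+\mathbf{u}_2) \;=\; \phi(\mathbf{u}_1) + \phi(\mathbf{u}_2)
      - \beta(\mathbf{u}_1,\mathbf{u}_2)
\]
```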

    Optical interferometry in astronomy

    Here I review the current state of the field of optical stellar interferometry, concentrating on ground-based work, although a brief report of space interferometry missions is included. We pause both to reflect on decades of immense progress in the field and to prepare for a new generation of large interferometers just now being commissioned (most notably, the CHARA, Keck and VLT Interferometers). First, this review summarizes the basic principles behind stellar interferometry needed by the lay physicist and general astronomer to understand the scientific potential as well as the technical challenges of interferometry. Next, the basic design principles of practical interferometers are discussed, using the experience of past and existing facilities to illustrate important points. Here there is significant discussion of current trends in the field, including the new facilities under construction and advanced technologies being debuted. This decade has seen the influence of stellar interferometry extend beyond the classical regimes of stellar diameters and binary orbits to new areas such as mapping the accretion discs around young stars, novel calibration of the Cepheid period-luminosity relation, and imaging of stellar surfaces. The third section is devoted to the major scientific results from interferometry, grouped into natural categories reflecting these current developments. Lastly, I consider the future of interferometry, highlighting the kinds of new science promised by the interferometers coming on-line in the next few years. I also discuss the longer-term future of optical interferometry, including the prospects for space interferometry and the possibilities of large-scale ground-based projects. Critical technological developments are still needed to make these projects attractive and affordable.
    Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/48845/2/r30503.pd