467 research outputs found

    Convolutional Deblurring for Natural Imaging

    In this paper, we propose a novel design of image deblurring in the form of one-shot convolution filtering that can be directly convolved with naturally blurred images for restoration. Optical blur is a common drawback in many imaging applications that suffer from optical imperfections. Numerous deconvolution methods blindly estimate the blur in either inclusive or exclusive forms, but they remain practically challenging due to high computational cost and low reconstruction quality. Both high accuracy and high speed are prerequisites for high-throughput imaging platforms in digital archiving, where deblurring is required after image acquisition, before images are stored, previewed, or processed for high-level interpretation. On-the-fly correction of such images is therefore important to avoid time delays, mitigate computational expense, and increase perceived image quality. We bridge this gap by synthesizing a deconvolution kernel as a linear combination of Finite Impulse Response (FIR) even-derivative filters that can be directly convolved with blurry input images to boost the frequency fall-off of the Point Spread Function (PSF) associated with the optical blur. We employ a Gaussian low-pass filter to decouple the image denoising problem from edge deblurring. Furthermore, we propose a blind approach to estimate the PSF statistics for both Gaussian and Laplacian models, which are common in many imaging pipelines. Thorough experiments are designed to test and validate the efficiency of the proposed method on 2054 naturally blurred images across six imaging applications, against seven state-of-the-art deconvolution methods.
    Comment: 15 pages, for publication in IEEE Transactions on Image Processing
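    The kernel-synthesis idea can be illustrated with a short sketch. Assuming a Gaussian PSF of standard deviation sigma, the inverse frequency response exp(sigma^2 |w|^2 / 2) can be truncated to its first even terms and assembled from a discrete Laplacian (second-derivative) filter and a biharmonic (fourth-derivative) filter. This is a minimal, hedged illustration of the general approach, not the authors' exact filter design or coefficients.

        import numpy as np
        from scipy.signal import convolve2d

        def even_derivative_deblur_kernel(sigma):
            """Truncated inverse of a Gaussian PSF (std sigma, in pixels) built from
            even-derivative FIR filters:
                1/exp(-sigma^2 |w|^2 / 2) ~ 1 + (sigma^2/2)|w|^2 + (sigma^4/8)|w|^4,
            where |w|^2 corresponds to -Laplacian and |w|^4 to the biharmonic operator."""
            lap = np.array([[0, 1, 0],
                            [1, -4, 1],
                            [0, 1, 0]], dtype=float)      # discrete Laplacian (2nd derivative)
            bih = convolve2d(lap, lap)                    # biharmonic (4th derivative), 5x5
            delta = np.zeros((5, 5)); delta[2, 2] = 1.0   # identity (Dirac) kernel
            return delta - (sigma**2 / 2) * np.pad(lap, 1) + (sigma**4 / 8) * bih

        def deblur(blurry, sigma):
            """One-shot deblurring: a single convolution of the blurry image with the kernel."""
            return convolve2d(blurry, even_derivative_deblur_kernel(sigma),
                              mode="same", boundary="symm")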

    An Examination of Some Significant Approaches to Statistical Deconvolution

    We examine statistical approaches to two significant areas of deconvolution: Blind Deconvolution (BD) and Robust Deconvolution (RD) for stochastic stationary signals. For BD, we review some major classical and new methods in a unified framework of non-Gaussian signals. The first class of algorithms we look at falls into the class of Minimum Entropy Deconvolution (MED) algorithms. We discuss the similarities between them despite differences in origins and motivations. We give new theoretical results concerning the behaviour and generality of these algorithms and give evidence of scenarios where they may fail. In some cases, we present new modifications to the algorithms to overcome these shortfalls. Following our discussion of the MED algorithms, we next look at a recently proposed BD algorithm based on the correntropy function, a function defined as a combination of the autocorrelation and the entropy functions. We examine its BD performance compared with the MED algorithms. We find that BD carried out via correntropy matching cannot be straightforwardly interpreted as simultaneous moment matching, due to the breakdown of the correntropy expansion in terms of moments. Other issues, such as maximum/minimum phase ambiguity and computational complexity, suggest that careful attention is required before establishing the correntropy algorithm as a superior alternative to existing BD techniques. For the problem of RD, we give a categorisation of the different kinds of uncertainties encountered in estimation and discuss the techniques required to solve each individual case. Primarily, we tackle the overlooked cases of robustification of deconvolution filters based on an estimated blurring response or an estimated signal spectrum. We do this by utilising existing methods derived from criteria such as minimax MSE with imposed uncertainty bands and penalised MSE. In particular, we revisit the Modified Wiener Filter (MWF), which offers simplicity and flexibility in providing improved RD relative to the standard plug-in Wiener Filter (WF).
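    As a concrete reference point, the two quantities at the heart of this discussion can be written down in a few lines: the normalized-kurtosis (varimax-type) objective that classic MED algorithms maximize, and a sample estimate of the autocorrentropy function with a Gaussian kernel. This is only a sketch of the objective functions, not of any complete deconvolution algorithm from the thesis.

        import numpy as np

        def normalized_kurtosis(y):
            """Varimax-type objective maximized by classic MED: sum(y^4) / (sum(y^2))^2."""
            y = np.asarray(y, dtype=float)
            return np.sum(y**4) / np.sum(y**2) ** 2

        def autocorrentropy(x, lags, sigma=1.0):
            """Sample autocorrentropy V(tau) = E[ k_sigma(x_t, x_{t+tau}) ] with a Gaussian
            kernel k_sigma(a, b) = exp(-(a - b)^2 / (2 sigma^2)); normalization omitted."""
            x = np.asarray(x, dtype=float)
            return np.array([
                np.mean(np.exp(-(x[: len(x) - t] - x[t:]) ** 2 / (2 * sigma**2)))
                for t in lags
            ])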

    Independent Component Analysis in a convoluted world


    Convolutive Blind Source Separation Methods

    In this chapter, we provide an overview of existing algorithms for blind source separation of convolutive audio mixtures. We provide a taxonomy wherein many of the existing algorithms can be organized, and we present published results from those algorithms that have been applied to real-world audio separation tasks.
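    For orientation, the convolutive mixing model that these algorithms try to invert is x_j(t) = sum_k sum_tau h_jk(tau) s_k(t - tau). A small synthetic two-source, two-microphone example (hypothetical sources and mixing filters, not taken from the chapter) can be generated as follows.

        import numpy as np

        rng = np.random.default_rng(0)
        T, L = 16000, 64                                   # signal length and filter length
        s = rng.laplace(size=(2, T))                       # two super-Gaussian "speech-like" sources
        decay = np.exp(-np.arange(L) / 10.0)
        h = rng.normal(size=(2, 2, L)) * decay             # decaying mixing impulse responses h[j, k]

        # Observed convolutive mixtures: x_j(t) = sum_k (h_jk * s_k)(t)
        x = np.stack([
            sum(np.convolve(s[k], h[j, k])[:T] for k in range(2))
            for j in range(2)
        ])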

    Comparative power spectral analysis of simultaneous electroencephalographic and magnetoencephalographic recordings in humans suggests non-resistive extracellular media

    The resistive or non-resistive nature of the extracellular space in the brain is still debated and is an important issue for correctly modeling extracellular potentials. Here, we first show theoretically that if the medium is resistive, the frequency scaling should be the same for electroencephalogram (EEG) and magnetoencephalogram (MEG) signals at low frequencies (<10 Hz). To test this prediction, we analyzed the spectra of simultaneous EEG and MEG measurements in four human subjects. The frequency scaling of the EEG displays coherent variations across the brain, in general between 1/f and 1/f^2, and tends to be smaller in parietal/temporal regions. In a given region, although the variability of the frequency-scaling exponent was higher for MEG than for EEG, the two signals consistently scale with different exponents. In some cases the scaling was similar, but only when the signal-to-noise ratio of the MEG was low. Several methods of correcting for environmental and instrumental noise were tested, and they all increased the difference between EEG and MEG scaling. In conclusion, there is a significant difference in frequency scaling between EEG and MEG, which can be explained if the extracellular medium (including other layers such as the dura mater and skull) is globally non-resistive.
    Comment: Submitted to Journal of Computational Neuroscience
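    The frequency-scaling comparison reduces to fitting a power law S(f) ~ 1/f^alpha to the low-frequency part of each spectrum. A minimal sketch follows, using a Welch spectrum and a log-log linear fit over an assumed 0.1-10 Hz band; the band limits and window length are illustrative, not the paper's exact analysis settings.

        import numpy as np
        from scipy.signal import welch

        def scaling_exponent(signal, fs, fmin=0.1, fmax=10.0):
            """Estimate alpha in S(f) ~ 1/f^alpha from a log-log linear fit of the
            Welch power spectrum over [fmin, fmax] Hz (alpha = 1 for 1/f, 2 for 1/f^2)."""
            f, Pxx = welch(signal, fs=fs, nperseg=min(len(signal), int(20 * fs)))
            band = (f >= fmin) & (f <= fmax)
            slope, _ = np.polyfit(np.log10(f[band]), np.log10(Pxx[band]), 1)
            return -slope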

    Restoration of Atmospheric Turbulence Degraded Video using Kurtosis Minimization and Motion Compensation

    In this thesis, the background of atmospheric turbulence degradation in imaging is reviewed and two aspects are highlighted: blurring and geometric distortion. The turbulence blurring parameter is determined by the atmospheric turbulence condition, which is often unknown; therefore, a blur identification technique was developed based on higher-order statistics (HOS). It was observed that kurtosis generally increases as an image becomes blurred (smoothed). This observation was interpreted in the frequency domain in terms of phase correlation, and kurtosis-minimization blur identification is built upon it. It was shown that kurtosis minimization is effective in identifying the blurring parameter directly from the degraded image. Kurtosis minimization is a general method for blur identification: it has been tested on a variety of blurs such as Gaussian blur, out-of-focus blur, and motion blur. To compensate for the geometric distortion, earlier work on turbulent motion compensation was extended to deal with situations in which there is camera/object motion. Trajectory smoothing is used to suppress the turbulent motion while preserving the real motion. Though the scintillation effect of atmospheric turbulence is not treated separately, it can be handled in the same way as multiple-frame denoising once the motion trajectories are built.
    Ph.D. thesis. Committee Chair: Mersereau, Russell; Committee Co-Chair: Smith, Mark; Committee Members: Lanterman, Aaron; Wang, May; Tannenbaum, Allen; Williams, Douglas
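    A plausible minimal sketch of kurtosis-minimization blur identification for the Gaussian case: restore the degraded image with each candidate blur width using a simple Wiener-type inverse and keep the candidate whose restoration has the lowest kurtosis. The PSF model, restoration filter, and search grid here are illustrative assumptions, not the thesis's exact procedure.

        import numpy as np
        from scipy.stats import kurtosis

        def wiener_deblur(img, sigma, nsr=1e-2):
            """Frequency-domain Wiener-type restoration assuming a Gaussian PSF of std sigma."""
            fy = np.fft.fftfreq(img.shape[0])[:, None]
            fx = np.fft.fftfreq(img.shape[1])[None, :]
            H = np.exp(-2 * np.pi**2 * sigma**2 * (fx**2 + fy**2))   # Gaussian PSF transfer function
            G = np.conj(H) / (np.abs(H) ** 2 + nsr)                  # regularized inverse filter
            return np.real(np.fft.ifft2(np.fft.fft2(img) * G))

        def identify_blur_sigma(degraded, candidates=np.linspace(0.5, 4.0, 15)):
            """Pick the candidate blur width whose restoration has minimum kurtosis,
            exploiting the observation that blurring tends to increase kurtosis."""
            scores = [kurtosis(wiener_deblur(degraded, s).ravel()) for s in candidates]
            return candidates[int(np.argmin(scores))]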

    Computational Inverse Problems

    Inverse problems typically deal with the identification of unknown quantities from indirect measurements and arise in many areas of technology, medicine, biology, finance, and econometrics. The computational solution of such problems is a very active, interdisciplinary field with close connections to optimization, control theory, differential equations, asymptotic analysis, statistics, and probability. The focus of this workshop was on hybrid methods, model reduction, regularization in Banach spaces, and statistical approaches.
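    As a generic baseline for the linear inverse problems discussed here (not tied to any particular contribution of the workshop), Tikhonov regularization trades data fidelity against a penalty on the solution norm:

        import numpy as np

        def tikhonov(A, y, lam=1e-2):
            """Solve min_x ||A x - y||^2 + lam ||x||^2 via the regularized
            normal equations (A^T A + lam I) x = A^T y."""
            n = A.shape[1]
            return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)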