
    BLADE: Filter Learning for General Purpose Computational Photography

    The Rapid and Accurate Image Super Resolution (RAISR) method of Romano, Isidoro, and Milanfar is a computationally efficient image upscaling method using a trained set of filters. We describe a generalization of RAISR, which we name Best Linear Adaptive Enhancement (BLADE). This approach is a trainable edge-adaptive filtering framework that is general, simple, computationally efficient, and useful for a wide range of problems in computational photography. We show applications to operations that may appear in a camera pipeline, including denoising, demosaicing, and stylization.
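    The core idea described above can be illustrated with a minimal sketch: each pixel is assigned to a bucket based on a hash of its local gradient features (here simplified to orientation only), and a per-bucket learned linear filter is applied. The filter bank here is assumed to come from an offline training step; the bucketing below is far simpler than the actual RAISR/BLADE feature hash, which also uses gradient strength and coherence.

    ```python
    import numpy as np

    def blade_filter(image, filters, n_orientations=8):
        """Sketch of RAISR/BLADE-style adaptive filtering: each pixel
        selects a learned linear filter from `filters` (assumed trained)
        based on a quantized local gradient orientation."""
        gy, gx = np.gradient(image.astype(float))
        theta = np.arctan2(gy, gx) % np.pi
        bucket = (theta / np.pi * n_orientations).astype(int) % n_orientations

        pad = filters[0].shape[0] // 2
        padded = np.pad(image.astype(float), pad, mode="reflect")
        out = np.empty(image.shape, dtype=float)
        for i in range(image.shape[0]):
            for j in range(image.shape[1]):
                patch = padded[i:i + 2 * pad + 1, j:j + 2 * pad + 1]
                # Apply the linear filter chosen by this pixel's bucket.
                out[i, j] = np.sum(patch * filters[bucket[i, j]])
        return out
    ```

    Because filter selection is a cheap per-pixel lookup and each filter is linear, the method stays computationally efficient while still adapting to edge structure.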

    Parameter optimization for local polynomial approximation based intersection confidence interval filter using genetic algorithm: an application for brain MRI image de-noising

    Magnetic resonance imaging (MRI) is extensively exploited for the accurate detection of pathological changes as well as for diagnosis. However, MRI suffers from various shortcomings such as ambient noise from the environment, acquisition noise from the equipment, the presence of background tissue, breathing motion, body fat, etc. Consequently, noise reduction is critical, as the diverse types of generated noise limit the efficiency of medical image diagnosis. The local polynomial approximation based intersection of confidence intervals (LPA-ICI) filter is one of the effective de-noising filters. This filter requires an adjustment of the ICI parameters for efficient window size selection. Given the wide range of ICI parameter values, finding the best set of tuned values is itself an optimization problem. The present study proposes a novel technique for parameter optimization of the LPA-ICI filter using a genetic algorithm (GA) for brain MR image de-noising. The experimental results show that the proposed method outperforms the baseline LPA-ICI method in terms of various performance metrics for different noise variance levels. The obtained results indicate that the ICI parameter values depend on the noise variance and on the image under test.
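    The GA-driven tuning loop can be sketched in miniature. The sketch below substitutes a simple moving-average filter for the actual LPA-ICI filter (whose window selection is what the ICI parameter controls) and tunes a single scalar parameter `gamma` by maximizing PSNR against a clean reference; the population size, mutation scale, and bounds are illustrative assumptions, not values from the paper.

    ```python
    import numpy as np

    def psnr(ref, est):
        """Peak signal-to-noise ratio in dB (peak value assumed 1.0)."""
        mse = np.mean((ref - est) ** 2)
        return 10 * np.log10(1.0 / mse) if mse > 0 else np.inf

    def denoise(noisy, gamma):
        # Stand-in for the LPA-ICI filter: a 1-D moving average whose
        # window size plays the role of the ICI-controlled window.
        w = max(1, int(round(gamma)))
        return np.convolve(noisy, np.ones(w) / w, mode="same")

    def ga_tune(clean, noisy, pop_size=20, generations=30, rng=None):
        """Toy GA over one scalar parameter: fitness = PSNR, truncation
        selection, blend crossover, Gaussian mutation."""
        rng = np.random.default_rng(rng)
        pop = rng.uniform(1.0, 15.0, pop_size)
        for _ in range(generations):
            fitness = np.array([psnr(clean, denoise(noisy, g)) for g in pop])
            parents = pop[np.argsort(fitness)[-pop_size // 2:]]  # best half
            children = []
            while len(children) < pop_size - len(parents):
                a, b = rng.choice(parents, 2)
                child = 0.5 * (a + b) + rng.normal(0, 0.5)  # crossover + mutation
                children.append(np.clip(child, 1.0, 15.0))
            pop = np.concatenate([parents, children])
        fitness = np.array([psnr(clean, denoise(noisy, g)) for g in pop])
        return pop[np.argmax(fitness)]
    ```

    In practice a ground-truth clean image is only available for training/validation data; the tuned parameters are then applied to unseen noisy images.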

    Plausible fluorescent Ly-alpha emitters around the z=3.1 QSO0420-388

    We report the results of a survey for fluorescent Ly-alpha emission carried out in the field surrounding the z=3.1 quasar QSO0420-388 using the FORS2 instrument on the VLT. We first review the properties expected for fluorescent Ly-alpha emitters, compared with those of other, non-fluorescent Ly-alpha emitters. Our observational search detected 13 Ly-alpha sources sparsely sampling a volume of ~14000 comoving Mpc^3 around the quasar. Their properties, in terms of i) the line equivalent width, ii) the line profile, and iii) the surface brightness as a function of distance from the quasar, all suggest that several of them may plausibly be fluorescent. Moreover, their number is in good agreement with the expectation from theoretical models. One of the best candidates for fluorescence lies sufficiently far behind QSO0420-388 to imply that the quasar has been active for (at least) ~60 Myr. Further studies of such objects will give information about proto-galactic clouds and about the radiative history (and beaming) of high-redshift quasars. Comment: 10 pages, 4 figures. Updated to match the version published in ApJ 657, 135, March 2007.

    Gravitational wave radiometry: Mapping a stochastic gravitational wave background

    The problem of detecting and mapping a stochastic gravitational wave background (SGWB), whether of cosmological or astrophysical origin, bears a strong resemblance to the analysis of CMB anisotropy and polarization. The basic statistic we use is the cross-correlation between the data from a pair of detectors. In order to `point' the pair of detectors at a given sky location, one must delay one signal by the time it takes the gravitational waves (GW) from that direction to travel between the two detectors. The raw (observed) sky map of the SGWB is then the true signal convolved with a beam response function that varies with location on the sky. We first present a thorough analytic understanding of the structure of the beam response function using the stationary phase approximation. The true sky map is obtained by numerically deconvolving the beam function in the integral (convolution) equation. We adopt the maximum likelihood framework, used successfully in the broadly similar and well-studied CMB map-making problem, to estimate the true sky map. We numerically implement and demonstrate the method on a simulated (unpolarized) SGWB for the radiometer consisting of the LIGO pair of detectors at Hanford and Livingston. We include `realistic' additive Gaussian noise in each data stream based on the LIGO-I noise power spectral density. The extension of the method to multiple baselines and to a polarized GWB is outlined. In the near future, the network of GW detectors, including the Advanced LIGO and Virgo detectors that will be sensitive to sources within a thousand times larger spatial volume, could provide promising data sets for GW radiometry. Comment: 24 pages, 19 figures, pdflatex. Matches the version published in Phys. Rev. D; minor changes.
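    The delay-and-correlate "pointing" idea can be sketched schematically: for each sky direction, shift one detector's stream by the direction-dependent GW travel-time delay between the sites, then cross-correlate. This toy version works in delay samples, omits the antenna-pattern weighting, overlap reduction, and noise-weighted filtering that the actual analysis requires, and produces only the raw (dirty) map, not the deconvolved one.

    ```python
    import numpy as np

    def point_radiometer(d1, d2, delay_samples):
        """Cross-correlation statistic for one sky direction: shift the
        second detector's stream by that direction's travel-time delay,
        then average the product (noise weighting omitted)."""
        return np.mean(d1 * np.roll(d2, delay_samples))

    def sky_map(d1, d2, delays):
        # Raw (dirty) map: one cross-correlation value per sky pixel,
        # indexed here by its delay; the true map would be obtained by
        # deconvolving the beam response from this map.
        return np.array([point_radiometer(d1, d2, k) for k in delays])
    ```

    A common signal buried in independent noise in both streams produces a peak in the map at the delay matching its sky direction, which is exactly the beam-response behavior the abstract analyzes.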

    Error propagation in pattern recognition systems: Impact of quality on fingerprint categorization

    The aspect of quality in pattern classification has recently been explored in the context of biometric identification and authentication systems. The results presented in the literature indicate that incorporating information about the quality of the input pattern leads to improved classification performance. Quality itself, however, can be defined in a number of ways, and its role in the various stages of pattern classification is often ambiguous or ad hoc. In this dissertation, a more systematic approach to the incorporation of localized quality metrics into the pattern recognition process is developed for the specific task of fingerprint categorization. Quality is defined not as an intrinsic property of the image, but rather in terms of a set of defects introduced to it. A number of fingerprint images have been examined, and the important quality defects have been identified and modeled in a mathematically tractable way. The models are flexible and can be used to generate synthetic images that facilitate algorithm development and large-scale, less time-consuming performance testing. The effect of quality defects on the various stages of the fingerprint recognition process is examined both analytically and empirically. For these defect models, it is shown that the uncertainty of parameter estimates, i.e. extracted fingerprint features, is the key quantity that can be calculated and propagated forward through the stages of the fingerprint classification process. Modified image processing techniques that explicitly utilize local quality metrics in the extraction of features useful in fingerprint classification, such as the ridge orientation flow field, are presented and their performance is investigated.
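    To make the ridge-orientation example concrete, here is a standard gradient-tensor estimate of a blockwise orientation field, with a coherence score per block that can serve as a local quality/uncertainty metric. This is a common textbook construction, not the dissertation's specific method: low coherence flags blocks where the orientation estimate is uncertain and should be down-weighted downstream.

    ```python
    import numpy as np

    def orientation_field(image, block=8):
        """Blockwise ridge orientation via the gradient structure tensor,
        plus a coherence score in [0, 1] usable as a local quality metric
        (1 = strongly oriented block, 0 = no dominant orientation)."""
        gy, gx = np.gradient(image.astype(float))
        h, w = image.shape
        angles, coherence = [], []
        for i in range(0, h - block + 1, block):
            for j in range(0, w - block + 1, block):
                sx = gx[i:i + block, j:j + block].ravel()
                sy = gy[i:i + block, j:j + block].ravel()
                gxx, gyy, gxy = np.sum(sx * sx), np.sum(sy * sy), np.sum(sx * sy)
                # Ridge orientation is perpendicular to the mean gradient.
                angles.append(0.5 * np.arctan2(2 * gxy, gxx - gyy) + np.pi / 2)
                den = gxx + gyy
                coherence.append(np.hypot(gxx - gyy, 2 * gxy) / den if den > 0 else 0.0)
        return np.array(angles), np.array(coherence)
    ```

    Propagating the per-block coherence alongside the orientation estimate is one simple way to carry feature uncertainty forward into classification, in the spirit of the approach described above.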

    Multi-Scale Edge Detection Algorithms and Their Information-Theoretic Analysis in the Context of Visual Communication

    The unrealistic assumption that noise can be modeled as independent, additive, and uniform can lead to problems when edge detection methods are applied to low signal-to-noise ratio (SNR) images. The main reason is that the filter scale and the gradient threshold are difficult to determine at a regional or local scale when the noise estimate is global. In this dissertation we therefore attempt to solve these problems by using more than one filter to detect edges and by discarding global thresholding in the edge discrimination. The proposed multi-scale edge detection algorithms use the multi-scale description to detect and localize edges. Furthermore, instead of a single default global threshold, a local dynamic threshold is introduced to discriminate between edges and non-edges. The proposed algorithms also perform connectivity analysis on edge maps to ensure that small, disconnected edges are removed. Experiments in which the methods are applied to a sequence of images of the same scene at different SNRs show the methods to be robust to noise. Additionally, a new noise reduction algorithm based on the multi-scale edge analysis is proposed. In general, an edge, being high-frequency information in an image, would be filtered or suppressed by image smoothing. With the help of the multi-scale edge detection algorithms, the overall edge structure of the original image can be preserved while only the isolated edge responses that represent noise are filtered out. Experimental results show that this method is robust to high levels of noise and correctly preserves the edges. We also propose a new method for evaluating the performance of edge detection algorithms, based on an information-theoretic analysis of edge detection in the context of an end-to-end visual communication channel. We use the mutual information, in the sense of Shannon, between the scene and the output of the edge-detection algorithm to evaluate performance. An edge detection algorithm is considered to have high performance only if the information rate from the scene to the edge map approaches the maximum possible. This information-theoretic analysis thus provides a new way to compare different edge detection operators for a given end-to-end image processing system.
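    The information-theoretic evaluation criterion can be sketched for the simplest case of binary edge maps: estimate the empirical mutual information between a ground-truth edge map and a detector's output. This is a minimal illustration of the scoring idea, assuming a binary ground truth is available; the dissertation's channel model is more general.

    ```python
    import numpy as np

    def mutual_information(true_edges, detected_edges):
        """Empirical mutual information (bits) between two binary edge
        maps. Higher values mean the detector's output carries more
        information about the true edge locations."""
        t = np.asarray(true_edges).ravel().astype(int)
        d = np.asarray(detected_edges).ravel().astype(int)
        mi = 0.0
        for a in (0, 1):
            for b in (0, 1):
                p_ab = np.mean((t == a) & (d == b))
                p_a, p_b = np.mean(t == a), np.mean(d == b)
                if p_ab > 0:
                    mi += p_ab * np.log2(p_ab / (p_a * p_b))
        return mi
    ```

    A perfect detector attains the entropy of the true edge map, while a detector whose output is independent of the scene scores near zero, so the measure ranks operators by how much scene information survives to the edge map.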