
    High-ISO long-exposure image denoising based on quantitative blob characterization

    Blob detection and image denoising are fundamental, and sometimes related, tasks in computer vision. In this paper, we present a computational method to quantitatively measure blob characteristics using normalized unilateral second-order Gaussian kernels. This method suppresses non-blob structures while yielding a quantitative measurement of the position, prominence and scale of blobs, which can facilitate the tasks of blob reconstruction and blob reduction. Subsequently, we propose a denoising scheme to address high-ISO long-exposure noise, which in some cases spatially resembles blobs, employing a blob reduction procedure as an inexpensive preprocessing step for conventional denoising methods. We apply the proposed denoising methods to real-world noisy images as well as standard images corrupted by real noise. The experimental results demonstrate the superiority of the proposed methods over state-of-the-art denoising methods.
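    For orientation, the following is a minimal sketch of scale-normalized blob detection in the spirit described above, using the classical Laplacian-of-Gaussian response as a stand-in for the paper's normalized unilateral second-order Gaussian kernels (which are not reproduced here); the scales and the prominence threshold are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace, maximum_filter

def detect_blobs(image, sigmas=(1, 2, 4, 8), threshold=0.02):
    """Return (row, col, sigma, response) tuples for local maxima of the
    scale-normalized blob response; 'response' serves as a crude
    prominence measure and 'sigma' as the blob scale."""
    img = image.astype(np.float64)
    # Scale-normalized response stack: sigma^2 * (-LoG) peaks at the
    # characteristic scale of bright blobs.
    stack = np.stack([s ** 2 * -gaussian_laplace(img, s) for s in sigmas])
    # Keep points that are local maxima over space and scale and that
    # exceed the prominence threshold.
    peaks = (stack == maximum_filter(stack, size=3)) & (stack > threshold)
    return [(r, c, sigmas[k], stack[k, r, c])
            for k, r, c in zip(*np.nonzero(peaks))]
```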

    Adaptive Nonlocal Filtering: A Fast Alternative to Anisotropic Diffusion for Image Enhancement

    The goal of many early visual filtering processes is to remove noise while at the same time sharpening contrast. A historical succession of approaches to this problem, starting with the use of simple derivative and smoothing operators, and the subsequent realization of the relationship between scale-space and the isotropic diffusion equation, has recently resulted in the development of "geometry-driven" diffusion. Nonlinear and anisotropic diffusion methods, as well as image-driven nonlinear filtering, have provided improved performance relative to the older isotropic and linear diffusion techniques. These techniques, which either explicitly or implicitly make use of kernels whose shape and center are functions of local image structure, are too computationally expensive for use in real-time vision applications. In this paper, we show that results which are largely equivalent to those obtained from geometry-driven diffusion can be achieved by a process which is conceptually separated into two very different functions. The first involves the construction of a vector field of "offsets", defined on a subset of the original image, at which to apply a filter. The offsets are used to displace filters away from boundaries to prevent edge blurring and destruction. The second is the (straightforward) application of the filter itself. The former function is a kind of generalized image skeletonization; the latter is conventional image filtering. This formulation leads to results which are qualitatively similar to contemporary nonlinear diffusion methods, but at computation times that are roughly two orders of magnitude faster, allowing applications of this technique to real-time imaging. An additional advantage of this formulation is that it allows existing filter hardware and software implementations to be applied with no modification, since the offset step reduces to an image pixel permutation, or look-up table operation, after application of the filter.
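    A minimal sketch of the offset-then-filter idea, under stated assumptions: build a vector field of offsets that displaces each pixel's sampling point away from strong edges, then apply a conventional smoothing filter at the displaced positions. The offset heuristic and the parameters (edge_scale, max_offset) are illustrative and do not reproduce the paper's construction.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, sobel, map_coordinates

def offset_filter(image, smooth_sigma=2.0, edge_scale=50.0, max_offset=3.0):
    """Filter at displaced positions: pixels near strong edges sample the
    smoothed image from a point pushed away from the boundary, onto the
    side of the edge they already belong to."""
    img = image.astype(np.float64)
    gy, gx = sobel(img, axis=0), sobel(img, axis=1)
    mag = np.hypot(gx, gy) + 1e-12
    # Offset length grows with edge strength, capped at max_offset pixels;
    # its sign sends bright-side pixels uphill and dark-side pixels downhill.
    side = np.sign(img - gaussian_filter(img, smooth_sigma))
    step = max_offset * (1.0 - np.exp(-mag / edge_scale)) * side
    rows, cols = np.indices(img.shape)
    coords = [rows + step * gy / mag, cols + step * gx / mag]
    # The filter itself is conventional; only where it is sampled changes,
    # so the offset step amounts to a resampling of the filtered image.
    return map_coordinates(gaussian_filter(img, smooth_sigma), coords,
                           order=1, mode='nearest')
```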

    Finding faint HI structure in and around galaxies: scraping the barrel

    Soon-to-be-operational HI survey instruments such as APERTIF and ASKAP will produce large datasets. These surveys will provide information about the HI in and around hundreds of galaxies with a typical signal-to-noise ratio of ∼10 in the inner regions and ∼1 in the outer regions. In addition, such surveys will make it possible to probe faint HI structures, typically located in the vicinity of galaxies, such as extra-planar gas, tails and filaments. These structures are crucial for understanding galaxy evolution, particularly when they are studied in relation to the local environment. Our aim is to find optimized kernels for the discovery of faint and morphologically complex HI structures. Therefore, using HI data from a variety of galaxies, we explore state-of-the-art filtering algorithms. We show that the intensity-driven gradient filter, due to its adaptive characteristics, is the optimal choice. In fact, this filter requires only minimal tuning of the input parameters to enhance the signal-to-noise ratio of faint components. In addition, it does not degrade the resolution of the high signal-to-noise component of a source. The filtering process must be fast and be embedded in an interactive visualization tool in order to support fast inspection of a large number of sources. To achieve such interactive exploration, we implemented a multi-core CPU (OpenMP) and a GPU (OpenGL) version of this filter in a 3D visualization environment (SlicerAstro). Comment: 17 pages, 9 figures, 4 tables. Astronomy and Computing, accepted.
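    As a rough illustration (not the SlicerAstro implementation), the sketch below applies intensity-driven adaptive smoothing to a data cube: faint, low signal-to-noise voxels are blended toward a heavily smoothed version while bright structure keeps its original resolution. The blending rule and the parameters (noise_rms, k) are assumptions for illustration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def intensity_driven_smooth(cube, sigma=2.0, noise_rms=1.0, k=3.0):
    """Adaptive smoothing of a 3D data cube driven by local intensity."""
    data = cube.astype(np.float64)
    smoothed = gaussian_filter(data, sigma)
    # Weight -> 1 where the local signal is faint (|I| << k * noise rms),
    # -> 0 where it is bright, so bright structure keeps full resolution.
    weight = np.exp(-(np.abs(smoothed) / (k * noise_rms)) ** 2)
    return weight * smoothed + (1.0 - weight) * data
```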

    A network for multiscale image segmentation

    Detecting edges of objects in their images is a basic problem in computational vision. The scale-space technique introduced by Witkin [11] provides a means of using local and global reasoning in locating edges. This approach has a major drawback: it is difficult to obtain accurately the locations of the 'semantically meaningful' edges. We have refined the definition of scale-space, and introduced a class of algorithms for implementing it based on anisotropic diffusion [9]. The algorithms involve simple, local operations replicated over the image, making parallel hardware implementation feasible. In this paper we present the major ideas behind the use of scale space and anisotropic diffusion for edge detection, show that anisotropic diffusion can enhance edges, suggest a network implementation of anisotropic diffusion, and provide design criteria for obtaining networks that perform scale-space filtering and edge detection. The results of a software implementation are shown.
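    The simple, local, replicated update referred to above can be illustrated with the standard explicit Perona-Malik diffusion scheme; the exponential conductance function and the parameters (kappa, dt, number of iterations) below are conventional choices rather than the paper's exact design.

```python
import numpy as np

def anisotropic_diffusion(image, n_iter=20, kappa=20.0, dt=0.2):
    """Explicit Perona-Malik iteration: each step is a purely local update
    over the four nearest neighbours, repeated across the whole image."""
    img = image.astype(np.float64)
    g = lambda d: np.exp(-(d / kappa) ** 2)  # edge-stopping conductance
    for _ in range(n_iter):
        # Nearest-neighbour differences (periodic boundaries for brevity).
        dn = np.roll(img, 1, axis=0) - img
        ds = np.roll(img, -1, axis=0) - img
        de = np.roll(img, -1, axis=1) - img
        dw = np.roll(img, 1, axis=1) - img
        # Conductance is small across strong edges, so diffusion smooths
        # within regions while preserving (and relatively enhancing) edges.
        img += dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return img
```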

    A multiresolution framework for local similarity based image denoising

    In this paper, we present a generic framework for denoising images corrupted by additive white Gaussian noise, based on the idea of regional similarity. The proposed framework employs a similarity function using the distance between pixels in a multidimensional feature space, whereby multiple feature maps describing various local regional characteristics can be utilized, giving higher weight to pixels with similar regional characteristics. An extension of the proposed framework into a multiresolution setting using wavelets and scale space is presented. It is shown that the resulting multiresolution multilateral (MRM) filtering algorithm not only eliminates coarse-grain noise but can also faithfully reconstruct anisotropic features, particularly in the presence of high levels of noise.
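    A minimal sketch of regional-similarity weighting, under assumptions: a bilateral-style filter whose range weight is computed from a small feature vector (intensity, local mean, local standard deviation) instead of intensity alone. The feature maps and bandwidths are illustrative and do not reproduce the paper's multiresolution (MRM) construction.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def multilateral_filter(image, radius=3, sigma_s=2.0, sigma_f=0.1):
    """Weighted average where weights combine spatial closeness with
    similarity of a multidimensional regional feature vector."""
    img = image.astype(np.float64)
    mean = uniform_filter(img, 2 * radius + 1)
    var = uniform_filter(img ** 2, 2 * radius + 1) - mean ** 2
    feats = np.stack([img, mean, np.sqrt(np.maximum(var, 0.0))], axis=-1)
    out = np.zeros_like(img)
    norm = np.zeros_like(img)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            f_shift = np.roll(np.roll(feats, dy, axis=0), dx, axis=1)
            # Spatial closeness times similarity of regional features.
            w_s = np.exp(-(dy * dy + dx * dx) / (2 * sigma_s ** 2))
            w_f = np.exp(-np.sum((feats - f_shift) ** 2, axis=-1)
                         / (2 * sigma_f ** 2))
            out += w_s * w_f * shifted
            norm += w_s * w_f
    return out / norm
```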