105 research outputs found
A multiresolution framework for local similarity based image denoising
In this paper, we present a generic framework for denoising of images corrupted with additive white Gaussian noise based on the idea of regional similarity. The proposed framework employs a similarity function using the distance between pixels in a multidimensional feature space, whereby multiple feature maps describing various local regional characteristics can be utilized, giving higher weight to pixels having similar regional characteristics. An extension of the proposed framework into a multiresolution setting using wavelets and scale space is presented. It is shown that the resulting multiresolution multilateral (MRM) filtering algorithm not only eliminates coarse-grain noise but can also faithfully reconstruct anisotropic features, particularly in the presence of high levels of noise.
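The similarity-weighted averaging described above can be sketched in a few lines: each output pixel is a weighted mean of its neighbours, with weights combining spatial distance and distance in a multidimensional feature space, so pixels with similar regional characteristics contribute more. This is an illustrative single-scale sketch (function names and parameters are my own), not the paper's multiresolution MRM algorithm.

```python
import numpy as np

def multilateral_denoise(img, feats, radius=2, sigma_s=2.0, sigma_f=0.3):
    """Similarity-weighted averaging: replace each pixel by a weighted mean
    of its neighbours.  Weights combine spatial distance with distance in a
    multidimensional feature space (one feature map per channel of `feats`).
    Illustrative single-scale sketch, not the paper's MRM filter."""
    h, w = img.shape
    out = np.zeros_like(img, dtype=float)
    pad = radius
    ip = np.pad(img, pad, mode='reflect')
    fp = np.pad(feats, ((0, 0), (pad, pad), (pad, pad)), mode='reflect')
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(ys**2 + xs**2) / (2 * sigma_s**2))   # spatial kernel
    for y in range(h):
        for x in range(w):
            win = ip[y:y + 2 * pad + 1, x:x + 2 * pad + 1]
            fwin = fp[:, y:y + 2 * pad + 1, x:x + 2 * pad + 1]
            fc = fp[:, y + pad, x + pad][:, None, None]      # centre features
            d2 = ((fwin - fc) ** 2).sum(axis=0)              # feature distance
            wgt = spatial * np.exp(-d2 / (2 * sigma_f**2))
            out[y, x] = (wgt * win).sum() / wgt.sum()
    return out
```

With the intensity itself as the single feature map this reduces to a bilateral filter; stacking further maps (e.g. local means or gradients) gives the multilateral behaviour described above.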
Graph Spectral Image Processing
The recent advent of graph signal processing (GSP) has spurred intensive studies
of signals that live naturally on irregular data kernels described by graphs
(e.g., social networks, wireless sensor networks). Though a digital image
contains pixels that reside on a regularly sampled 2D grid, if one can design
an appropriate underlying graph connecting pixels with weights that reflect the
image structure, then one can interpret the image (or image patch) as a signal
on a graph, and apply GSP tools for processing and analysis of the signal in
graph spectral domain. In this article, we overview recent graph spectral
techniques in GSP specifically for image/video processing. The topics covered
include image compression, image restoration, image filtering, and image
segmentation.
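As a minimal illustration of this graph-spectral viewpoint, one can connect 4-neighbour pixels of a patch with intensity-dependent edge weights, take the eigendecomposition of the graph Laplacian as a graph Fourier basis, and attenuate high graph frequencies. The names and the choice of spectral kernel h(λ) = exp(−τλ) are assumptions of this sketch, not a specific method from the article.

```python
import numpy as np

def graph_lowpass(patch, sigma=0.2, tau=2.0):
    """Interpret a patch as a signal on a 4-connected pixel graph with
    intensity-dependent weights, then low-pass filter it in the graph
    spectral domain via the Laplacian eigenbasis (a small GSP sketch)."""
    h, w = patch.shape
    n = h * w
    x = patch.ravel().astype(float)
    W = np.zeros((n, n))
    for i in range(h):
        for j in range(w):
            u = i * w + j
            for di, dj in ((0, 1), (1, 0)):      # right and down neighbours
                ii, jj = i + di, j + dj
                if ii < h and jj < w:
                    v = ii * w + jj
                    wgt = np.exp(-(x[u] - x[v]) ** 2 / (2 * sigma**2))
                    W[u, v] = W[v, u] = wgt
    L = np.diag(W.sum(axis=1)) - W               # combinatorial Laplacian
    lam, U = np.linalg.eigh(L)                   # graph frequencies / basis
    xhat = U.T @ x                               # graph Fourier transform
    out = U @ (np.exp(-tau * lam) * xhat)        # spectral kernel h(lam)
    return out.reshape(h, w)
```

Because the edge weights shrink across intensity discontinuities, the graph (and hence the smoothing) adapts to the image structure, which is the point made in the abstract.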
Flash Photography Enhancement via Intrinsic Relighting
We enhance photographs shot in dark environments by combining a picture taken with the available light and one taken with the flash. We preserve the ambiance of the original lighting and insert the sharpness from the flash image. We use the bilateral filter to decompose the images into detail and large scale. We reconstruct the image using the large scale of the available lighting and the detail of the flash. We detect and correct flash shadows. This combines the advantages of available illumination and flash photography.
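A sketch of the decompose-and-recombine step (shadow detection and correction omitted): both images are split with a bilateral filter into a large-scale layer and a detail layer, and the ambient large scale is recombined with the flash detail. The function names, parameters, and the ratio-based definition of detail are assumptions of this sketch.

```python
import numpy as np

def bilateral(img, radius=2, sigma_s=2.0, sigma_r=0.2):
    """Plain single-channel bilateral filter; its output serves as the
    'large scale' layer of the decomposition."""
    h, w = img.shape
    out = np.empty_like(img, dtype=float)
    ip = np.pad(img, radius, mode='reflect')
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(ys**2 + xs**2) / (2 * sigma_s**2))
    for y in range(h):
        for x in range(w):
            win = ip[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            wgt = spatial * np.exp(-(win - img[y, x]) ** 2 / (2 * sigma_r**2))
            out[y, x] = (wgt * win).sum() / wgt.sum()
    return out

def fuse_flash(ambient, flash, eps=1e-3):
    """Combine the large scale of the ambient shot with the detail layer
    (image-to-large-scale ratio) of the flash shot, in the spirit of the
    decomposition described above.  Toy sketch on single-channel images."""
    base_a = bilateral(ambient)
    base_f = bilateral(flash)
    detail_f = (flash + eps) / (base_f + eps)   # ratio detail of the flash
    return base_a * detail_f
```

Using a ratio (rather than a difference) for the detail layer keeps the transferred texture proportional to the ambient lighting, which preserves the original ambiance.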
Edge-preserving Multiscale Image Decomposition based on Local Extrema
We propose a new model for detail that inherently captures oscillations, a key property that distinguishes textures from individual edges. Inspired by techniques in empirical data analysis and morphological image analysis, we use the local extrema of the input image to extract information about oscillations: We define detail as oscillations between local minima and maxima. Building on the key observation that the spatial scale of oscillations is characterized by the density of local extrema, we develop an algorithm for decomposing images into multiple scales of superposed oscillations.
Current edge-preserving image decompositions assume image detail to be low-contrast variation. Consequently, they apply filters that extract features of increasing contrast as successive layers of detail. As a result, they are unable to distinguish between high-contrast, fine-scale features and edges of similar contrast that are to be preserved. We compare our results with existing edge-preserving image decomposition algorithms and demonstrate exciting applications that are made possible by our new notion of detail.
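A 1-D analogue of this extrema-based notion of detail, in the spirit of empirical mode decomposition: interpolate envelopes through the local minima and maxima, take their mean as the smooth layer, and call the remaining oscillation the detail. This toy sketch does not implement the paper's 2-D algorithm.

```python
import numpy as np

def extrema_smooth(sig):
    """One level of an extrema-based decomposition of a 1-D signal: the
    detail is the oscillation between interpolated min/max envelopes, the
    smooth layer is their mean.  A 1-D sketch of the idea, not the paper's
    2-D algorithm."""
    n = len(sig)
    idx = np.arange(n)
    maxi = [0] + [i for i in range(1, n - 1)
                  if sig[i] >= sig[i - 1] and sig[i] >= sig[i + 1]] + [n - 1]
    mini = [0] + [i for i in range(1, n - 1)
                  if sig[i] <= sig[i - 1] and sig[i] <= sig[i + 1]] + [n - 1]
    upper = np.interp(idx, maxi, sig[maxi])   # envelope through the maxima
    lower = np.interp(idx, mini, sig[mini])   # envelope through the minima
    smooth = 0.5 * (upper + lower)
    detail = sig - smooth
    return smooth, detail
```

Note that a high-contrast but fine-scale texture has densely packed extrema and therefore lands in the detail layer regardless of its amplitude, which is exactly the contrast-independence argued for above.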
Surface Denoising based on Normal Filtering in a Robust Statistics Framework
During a surface acquisition process using 3D scanners, noise is inevitable,
and an important step in geometry processing is to remove this noise from the
acquired surfaces (given as point sets or triangulated meshes). The
noise-removal process (denoising) can be performed by first filtering the
surface normals and then adjusting the vertex positions according to the
filtered normals. Therefore, in many available denoising algorithms, the
computation of noise-free normals is a key factor. A variety of filters have
been introduced for noise removal from normals, with different focus points
such as robustness against outliers or large noise amplitudes. Although these
filters perform well in different respects, a unified framework has been
missing that establishes the relations between them and provides a theoretical
analysis beyond the empirical performance of each method.
In this paper, we introduce such a framework to establish relations between a
number of widely-used nonlinear filters for face normals in mesh denoising and
vertex normals in point set denoising. We cover robust statistical estimation
with M-smoothers and their application to linear and nonlinear normal
filtering. Although these methods originate in different mathematical
theories, including diffusion-, bilateral-, and directional curvature-based
algorithms, we demonstrate that all of them can be cast into a unified
framework of robust statistics using robust error norms and their corresponding
influence functions. This unification contributes to a better understanding of
the individual methods and their relations with each other. Furthermore, the
presented framework provides a platform for new techniques to combine the
advantages of known filters and to compare them with available methods.
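The M-smoother view can be illustrated on a toy set of unit normals: a robust error norm (here the Welsch norm, one common choice; the framework covers several) yields an influence-function weight that suppresses outlier normals during iterative weighted averaging. All names, parameters, and the full-neighbourhood setting are assumptions of this sketch, not the paper's mesh-aware algorithm.

```python
import numpy as np

def welsch_weight(d, sigma):
    """Influence-function weight psi(d)/d for the Welsch (Gaussian) error
    norm rho(d) = (sigma^2/2)(1 - exp(-d^2/sigma^2)); large residuals get
    vanishing weight, which gives robustness against outliers."""
    return np.exp(-d**2 / sigma**2)

def robust_normal_filter(normals, sigma=0.5, iters=3):
    """M-smoother on a set of unit normals: each normal is replaced by a
    weighted mean of all normals, the weights coming from the influence
    function of a robust error norm applied to normal differences, then
    renormalized to unit length.  Toy sketch without mesh connectivity."""
    N = normals.astype(float).copy()
    for _ in range(iters):
        out = np.zeros_like(N)
        for i in range(len(N)):
            d = np.linalg.norm(N - N[i], axis=1)   # residuals to neighbours
            w = welsch_weight(d, sigma)
            m = (w[:, None] * N).sum(axis=0)
            out[i] = m / np.linalg.norm(m)          # back to the unit sphere
        N = out
    return N
```

Swapping `welsch_weight` for the influence function of another error norm (e.g. a quadratic norm, whose weight is constant) recovers plain averaging, which is how the framework relates robust and non-robust filters.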