
    Moments-Based Fast Wedgelet Transform

    This paper presents a moments-based fast wedgelet transform. To perform the classical wedgelet transform, one searches the whole wedgelet dictionary for the best match, whereas in the proposed method the wedgelet parameters are computed directly from the image via moment computation. These parameters describe a wedgelet reflecting the edge present in the image; however, this wedgelet is not necessarily the best one in the mean-square-error sense. To overcome that drawback, a method that improves the match is also proposed: the better the match one requires, the longer it takes to obtain. The proposed transform works in linear time with respect to the number of pixels of the full quadtree decomposition of an image. More precisely, for an image of size N × N pixels the time complexity of the proposed wedgelet transform is O(N² log₂ N).
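
    One standard way to read edge geometry off image moments (not necessarily the exact parametrization used in the paper) is to take a block's centroid and principal-axis orientation from its low-order central moments. A minimal NumPy sketch:

```python
import numpy as np

def edge_orientation_from_moments(block):
    """Estimate a dominant orientation of a grayscale block from its
    low-order geometric moments (standard central-moment formula; the
    paper's exact wedgelet parametrization may differ)."""
    h, w = block.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    m00 = block.sum()
    if m00 == 0:
        return None  # flat block, no usable moment information
    # centroid of the intensity distribution
    cx = (xs * block).sum() / m00
    cy = (ys * block).sum() / m00
    # second-order central moments
    mu20 = ((xs - cx) ** 2 * block).sum() / m00
    mu02 = ((ys - cy) ** 2 * block).sum() / m00
    mu11 = ((xs - cx) * (ys - cy) * block).sum() / m00
    # orientation of the principal axis
    theta = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)
    return (cx, cy), theta
```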

    Locally adaptive image denoising by a statistical multiresolution criterion

    We demonstrate how one can choose the smoothing parameter in image denoising by a statistical multiresolution criterion, both globally and locally. Using inhomogeneous diffusion and total variation regularization as examples of localized regularization schemes, we present an efficient method for locally adaptive image denoising. As expected, the smoothing parameter serves as an edge detector in this framework. Numerical examples illustrate the usefulness of our approach. We also present an application in confocal microscopy.
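
    As a rough illustration of the role the smoothing parameter plays in total variation regularization (using scikit-image's TV denoiser as a stand-in; the statistical multiresolution criterion itself is not implemented here), one can simply compare a few fixed regularization weights:

```python
import numpy as np
from skimage import data, util
from skimage.restoration import denoise_tv_chambolle

# Noisy test image: the regularization weight plays the role of the
# smoothing parameter; the paper chooses it (globally or locally) by a
# statistical multiresolution criterion, here we just sweep fixed values.
image = util.img_as_float(data.camera())
noisy = image + 0.1 * np.random.standard_normal(image.shape)

for weight in (0.02, 0.1, 0.5):  # small weight -> little smoothing
    denoised = denoise_tv_chambolle(noisy, weight=weight)
    mse = np.mean((denoised - image) ** 2)
    print(f"weight={weight}: MSE={mse:.5f}")
```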

    The Multiplicative Zak Transform, Dimension Reduction, and Wavelet Analysis of LIDAR Data

    This thesis broadly introduces several techniques within the context of time-scale analysis. The representation, compression, and reconstruction of DEM and LIDAR data types are studied with directional wavelet methods and the wedgelet decomposition. The optimality of the contourlet transform, and then of the wedgelet transform, is evaluated with a valuable new structural similarity index. Dimension reduction for material classification is conducted with a frame-based kernel pipeline and a spectral-spatial method using wavelet packets. It is shown that these techniques can improve on baseline material classification methods while significantly reducing the amount of data. Finally, the multiplicative Zak transform is modified to allow the study and partial characterization of wavelet frames.

    Image coding using wavelets, interval wavelets and multi-layered wedgelets

    Ph.D. (Doctor of Philosophy)

    A Review of Adaptive Image Representations


    The L1-Potts functional for robust jump-sparse reconstruction

    We investigate the non-smooth and non-convex L¹-Potts functional in discrete and continuous time. We show Γ-convergence of discrete L¹-Potts functionals towards their continuous counterpart and obtain a convergence statement for the corresponding minimizers as the discretization gets finer. For the discrete L¹-Potts problem, we introduce an O(n²) time and O(n) space algorithm to compute an exact minimizer. We apply L¹-Potts minimization to the problem of recovering piecewise constant signals from noisy measurements f. It turns out that the L¹-Potts functional has a quite interesting blind deconvolution property. In fact, we show that mildly blurred jump-sparse signals are reconstructed by minimizing the L¹-Potts functional. Furthermore, for strongly blurred signals and known blurring operator, we derive an iterative reconstruction algorithm.
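
    To illustrate the dynamic-programming backbone behind such Potts minimization, the following sketch computes an exact minimizer of the classical L²-Potts functional in O(n²) time using prefix sums; the paper's algorithm treats the more robust L¹ data term, which is not reproduced here:

```python
import numpy as np

def potts_l2(y, gamma):
    """Exact minimizer of the classical L2-Potts functional
        gamma * (#segments) + sum_i (u_i - y_i)^2
    (which differs from penalizing jumps only by a constant offset),
    via the standard O(n^2)-time dynamic program."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    s1 = np.concatenate(([0.0], np.cumsum(y)))               # prefix sums
    s2 = np.concatenate(([0.0], np.cumsum(np.square(y))))    # prefix sums of squares

    def dev(l, r):
        # squared deviation of y[l..r] (0-based, inclusive) from its mean
        m = r - l + 1
        s = s1[r + 1] - s1[l]
        return s2[r + 1] - s2[l] - s * s / m

    best = np.zeros(n + 1)              # best[r] = optimal value on y[0..r-1]
    jump = np.zeros(n + 1, dtype=int)   # left boundary of the last segment
    for r in range(1, n + 1):
        cands = [best[l] + gamma + dev(l, r - 1) for l in range(r)]
        jump[r] = int(np.argmin(cands))
        best[r] = cands[jump[r]]

    # backtrack: replace each segment by its mean
    u = np.empty(n)
    r = n
    while r > 0:
        l = jump[r]
        u[l:r] = np.mean(y[l:r])
        r = l
    return u

# usage: recover a piecewise constant signal from noisy samples
noisy = np.concatenate([np.full(50, 1.0), np.full(50, 3.0)]) + 0.2 * np.random.randn(100)
print(potts_l2(noisy, gamma=2.0))
```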

    Approximating Data with Weighted Smoothing Splines

    Given a data set (t_i, y_i), i = 1, ..., n with the t_i in [0,1], non-parametric regression is concerned with the problem of specifying a suitable function f_n: [0,1] -> R such that the data can be reasonably approximated by the points (t_i, f_n(t_i)), i = 1, ..., n. If a data set exhibits large variations in local behaviour, for example large peaks as in spectroscopy data, then the method must be able to adapt to the local changes in smoothness. Whilst many methods are able to accomplish this, they are less successful at adapting derivatives. In this paper we show how the goal of local adaptivity of the function and its first and second derivatives can be attained in a simple manner using weighted smoothing splines. A residual-based concept of approximation is used which forces local adaptivity of the regression function, together with a global regularization which makes the function as smooth as possible subject to the approximation constraints.
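
    A minimal sketch of how weights steer a smoothing spline, using SciPy's UnivariateSpline with hand-chosen weights (the paper's residual-based rule for deriving the weights is not implemented here):

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Synthetic data: a sharp local peak on a smooth background.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
signal = np.sin(2 * np.pi * t) + 4.0 * np.exp(-((t - 0.5) / 0.01) ** 2)
y = signal + 0.1 * rng.standard_normal(t.size)

# Uniform weights: one global smoothing level for the whole curve.
uniform = UnivariateSpline(t, y, w=np.ones_like(t), s=t.size * 0.1 ** 2)

# Larger weights near the peak demand a closer fit there while the rest
# of the curve stays smooth (weights set by hand for illustration).
w = np.ones_like(t)
w[np.abs(t - 0.5) < 0.05] = 25.0
adaptive = UnivariateSpline(t, y, w=w, s=np.sum(w) * 0.1 ** 2)

near_peak = np.abs(t - 0.5) < 0.05
print("max abs error near the peak (uniform, adaptive):",
      np.max(np.abs(uniform(t) - signal)[near_peak]),
      np.max(np.abs(adaptive(t) - signal)[near_peak]))
```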

    Information selection and fusion in vision systems

    Handling the enormous amounts of data produced by data-intensive imaging systems, such as multi-camera surveillance systems and microscopes, is technically challenging. While image and video compression help to manage the data volumes, they do not address the basic problem of information overflow. In this PhD we tackle the problem in a more drastic way: we select information of interest to a specific vision task and discard the rest. We also combine data from different sources into a single output product, which presents the information of interest to end users in a suitable, summarized format. We treat two types of vision systems. The first type is conventional light microscopes. During this PhD, we have exploited for the first time the potential of the curvelet transform for image fusion for depth-of-field extension, allowing us to combine the advantages of multi-resolution image analysis for image fusion with increased directional sensitivity. As a result, the proposed technique clearly outperforms state-of-the-art methods, both on real microscopy data and on artificially generated images. The second type is camera networks with overlapping fields of view. To enable joint processing in such networks, inter-camera communication is essential. Because of infrastructure costs, power consumption for wireless transmission, etc., transmitting high-bandwidth video streams between cameras should be avoided. Fortunately, recently designed 'smart cameras', which have on-board processing and communication hardware, allow distributing the required image processing over the cameras. This permits compactly representing useful information from each camera. We focus on representing information for people localization and observation, which are important tools for statistical analysis of room usage, quick localization of people in case of building fires, etc. To further save bandwidth, we select which cameras should be involved in a vision task and transmit observations only from the selected cameras. We provide an information-theoretically founded framework for general-purpose camera selection based on the Dempster-Shafer theory of evidence. Applied to tracking, it allows tracking people using a dynamic selection of as few as three cameras with the same accuracy as when using up to ten cameras.
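
    The evidence-fusion step named in the abstract, Dempster's rule of combination, can be sketched generically as follows; the thesis's camera-selection criterion built on top of it is not reproduced, and the zone labels are made up for the example:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions given as
    dicts mapping frozenset hypotheses to masses (generic illustration)."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass assigned to contradictory hypotheses
    if conflict >= 1.0:
        raise ValueError("total conflict: evidence cannot be combined")
    # renormalize the non-conflicting mass
    return {h: w / (1.0 - conflict) for h, w in combined.items()}

# Two cameras reporting evidence about which zone ('A', 'B', 'C') a person occupies.
cam1 = {frozenset({'A'}): 0.6, frozenset({'A', 'B'}): 0.3, frozenset({'A', 'B', 'C'}): 0.1}
cam2 = {frozenset({'B'}): 0.5, frozenset({'A', 'B'}): 0.4, frozenset({'A', 'B', 'C'}): 0.1}
print(dempster_combine(cam1, cam2))
```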

    Efficient Moment Computation over Polygonal Domains with an Application to Rapid Wedgelet Approximation

    Abstract. Many algorithms in image processing rely on the computation of sums of pixel values over a large variety of subsets of the image domain. This includes the computation of image moments for pattern recognition purposes, or adaptive smoothing and regression methods, such as wedgelets. In the first part of the paper, we present a general method which allows the fast computation of sums over a large class of polygonal domains. The approach relies on the idea of considering polygonal domains with a fixed angular resolution, combined with an efficient implementation of a discrete version of Green's theorem. The second part deals with the application of the new methodology to a particular computational problem, namely wedgelet approximation. Our technique results in a speedup on the order of 10³ compared to preexisting implementations. A further attractive feature of our implementation is instantaneous access to the full scale of wedgelet minimizers. We introduce a new scheme that replaces the locally constant regression underlying wedgelets by essentially arbitrary local regression models. Due to the speedup obtained by the techniques explained in the first part, this scheme is computationally efficient and at the same time much more flexible than previously suggested methods such as wedgelets or platelets. In the final section we present numerical experiments showing the increase in speed and flexibility. Key words: wedgelets, platelets, image approximation, image moments, polygonal domains, discrete Green's theorem, digital lines. AMS subject classifications: 68U10, 65K10, 26B20, 52C99.
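
    The simplest instance of trading area sums for cumulative boundary data is the summed-area table for axis-aligned rectangles; the paper's discrete Green's theorem generalizes this idea to polygonal domains with a fixed angular resolution. A minimal sketch of the rectangular case:

```python
import numpy as np

def integral_image(img):
    """Cumulative sums with a zero row/column prepended, so that any
    axis-aligned rectangular sum is obtained from four table lookups."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.float64)
    ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
    return ii

def rect_sum(ii, top, left, bottom, right):
    """Sum of img[top:bottom, left:right] in O(1) via inclusion-exclusion."""
    return ii[bottom, right] - ii[top, right] - ii[bottom, left] + ii[top, left]

img = np.arange(16, dtype=float).reshape(4, 4)
ii = integral_image(img)
assert rect_sum(ii, 1, 1, 3, 4) == img[1:3, 1:4].sum()
```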