    Optimality of Operator-Like Wavelets for Representing Sparse AR(1) Processes

    Sparse Modeling for Image and Vision Processing

    In recent years, a large amount of multi-disciplinary research has been conducted on sparse models and their applications. In statistics and machine learning, the sparsity principle is used to perform model selection, that is, automatically selecting a simple model among a large collection of them. In signal processing, sparse coding consists of representing data with linear combinations of a few dictionary elements. Subsequently, the corresponding tools have been widely adopted by several scientific communities such as neuroscience, bioinformatics, and computer vision. The goal of this monograph is to offer a self-contained view of sparse modeling for visual recognition and image processing. More specifically, we focus on applications where the dictionary is learned and adapted to data, yielding a compact representation that has been successful in various contexts. Comment: 205 pages, to appear in Foundations and Trends in Computer Graphics and Vision.
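    To make the sparse-coding idea concrete, here is a minimal sketch of one standard way to approximate a signal with a few dictionary atoms, using orthogonal matching pursuit in NumPy. The dictionary, sparsity level, and toy data below are assumptions for illustration only; the monograph itself covers many other formulations (e.g., l_1-regularized ones) as well.

```python
import numpy as np

def sparse_code_omp(x, D, n_nonzero):
    """Greedy orthogonal matching pursuit: approximate x as D @ alpha
    with at most n_nonzero active dictionary atoms."""
    residual = x.copy()
    support = []
    alpha = np.zeros(D.shape[1])
    for _ in range(n_nonzero):
        # pick the atom most correlated with the current residual
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        # re-fit the coefficients on the current support by least squares
        coeffs, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        alpha[:] = 0.0
        alpha[support] = coeffs
        residual = x - D @ alpha
    return alpha

# toy usage: a random dictionary with unit-norm atoms and a 3-sparse signal
rng = np.random.default_rng(0)
D = rng.standard_normal((64, 128))
D /= np.linalg.norm(D, axis=0)
alpha_true = np.zeros(128)
alpha_true[[3, 40, 90]] = [1.0, -2.0, 0.5]
x = D @ alpha_true
alpha_hat = sparse_code_omp(x, D, n_nonzero=3)
```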

    On stable reconstructions from nonuniform Fourier measurements

    We consider the problem of recovering a compactly supported function from a finite collection of pointwise samples of its Fourier transform taken nonuniformly. First, we show that under suitable conditions on the sampling frequencies, specifically their density and bandwidth, it is possible to recover any such function f in a stable and accurate manner in any given finite-dimensional subspace; in particular, one which is well suited for approximating f. In practice, this is carried out using so-called nonuniform generalized sampling (NUGS). Second, we consider approximation spaces in one dimension consisting of compactly supported wavelets. We prove that a linear scaling of the dimension of the space with the sampling bandwidth is both necessary and sufficient for stable and accurate recovery. Thus wavelets are, up to constant factors, optimal spaces for reconstruction.
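    The NUGS reconstruction mentioned above amounts, in essence, to a least-squares fit of the nonuniform Fourier samples within a chosen finite-dimensional reconstruction space. The sketch below is a simplified illustration under that reading, not the authors' algorithm: it uses plain unweighted least squares and a space of Haar scaling (box) functions on [0, 1], and the function names and toy setup are invented for the example.

```python
import numpy as np

def box_ft(omega, a, b):
    """Fourier transform of the indicator of [a, b] at frequency omega,
    with the convention F(omega) = integral of f(t) * exp(-2j*pi*omega*t) dt."""
    width, center = b - a, 0.5 * (a + b)
    return width * np.sinc(omega * width) * np.exp(-2j * np.pi * omega * center)

def nugs_like_reconstruct(freqs, samples, n_basis):
    """Least-squares fit of the Fourier samples in a space of n_basis box
    (Haar scaling) functions on [0, 1]; omits the density/bandwidth weights
    analysed in the paper."""
    edges = np.linspace(0.0, 1.0, n_basis + 1)
    A = np.stack([box_ft(freqs, edges[k], edges[k + 1]) for k in range(n_basis)],
                 axis=1)
    coeffs, *_ = np.linalg.lstsq(A, samples, rcond=None)
    return edges, coeffs

# toy usage: nonuniform frequencies, exact samples of the FT of the
# indicator function of [0.25, 0.5]
rng = np.random.default_rng(1)
freqs = np.sort(rng.uniform(-20, 20, size=200))
samples = box_ft(freqs, 0.25, 0.5)
edges, coeffs = nugs_like_reconstruct(freqs, samples, n_basis=16)
```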

    Wavelets: mathematics and applications

    The notion of wavelets is defined. It is briefly described what wavelets are, how to use them, when we need them, why they are preferred, and where they have been applied. Then one proceeds to multiresolution analysis and the fast wavelet transform as a standard procedure for dealing with discrete wavelets. It is shown which specific features of signals (functions) can be revealed by this analysis but cannot be found by other methods (e.g., by the Fourier expansion). Finally, some examples of practical application are given (in particular, to the analysis of multiparticle production). Rigorous proofs of mathematical statements are omitted, and the reader is referred to the corresponding literature. Comment: 16 pages, 5 figures, LaTeX, Phys. Atom. Nucl.
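    As a concrete companion to the "standard procedure" mentioned above: the fast wavelet transform repeatedly splits a discrete signal into local averages and local differences. A minimal Haar-filter sketch in NumPy follows; the Haar wavelet and the toy step signal are chosen purely for illustration and are not taken from the survey.

```python
import numpy as np

def haar_fwt(signal):
    """One-scale-at-a-time Haar fast wavelet transform of a length-2^J array.
    Returns the coarsest approximation followed by the detail coefficients,
    coarsest scale first (the usual multiresolution ordering)."""
    x = np.asarray(signal, dtype=float)
    coeffs = []
    while x.size > 1:
        approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # local averages
        detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # local differences
        coeffs.append(detail)
        x = approx
    return np.concatenate([x] + coeffs[::-1])

# toy usage: a step signal -- the nonzero detail coefficients cluster around
# the jump, a local feature that a global Fourier expansion spreads over all
# frequencies
sig = np.concatenate([np.zeros(5), np.ones(11)])
print(haar_fwt(sig))
```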

    Convex and Network Flow Optimization for Structured Sparsity

    We consider a class of learning problems regularized by a structured sparsity-inducing norm defined as the sum of l_2- or l_infinity-norms over groups of variables. Whereas much effort has been put into developing fast optimization techniques when the groups are disjoint or embedded in a hierarchy, we address here the case of general overlapping groups. To this end, we present two different strategies: on the one hand, we show that the proximal operator associated with a sum of l_infinity-norms can be computed exactly in polynomial time by solving a quadratic min-cost flow problem, allowing the use of accelerated proximal gradient methods. On the other hand, we use proximal splitting techniques and address an equivalent formulation with non-overlapping groups, but in higher dimension and with additional constraints. We propose efficient and scalable algorithms exploiting these two strategies, which are significantly faster than alternative approaches. We illustrate these methods with several problems such as CUR matrix factorization, multi-task learning of tree-structured dictionaries, background subtraction in video sequences, image denoising with wavelets, and topographic dictionary learning of natural image patches. Comment: to appear in the Journal of Machine Learning Research (JMLR).
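    The proximal gradient strategy referred to above alternates a gradient step on the smooth loss with the proximal operator of the sparsity-inducing norm; the paper's contribution is computing that operator for overlapping l_infinity-groups via a quadratic min-cost flow problem. The sketch below shows only the much simpler textbook case, assumed here for illustration: plain ISTA with disjoint l_2-groups (block soft-thresholding), with made-up data and parameters.

```python
import numpy as np

def group_soft_threshold(w, groups, thresh):
    """Proximal operator of thresh * sum of l2 norms over disjoint groups
    (block soft-thresholding)."""
    out = w.copy()
    for g in groups:
        norm = np.linalg.norm(w[g])
        out[g] = 0.0 if norm <= thresh else w[g] * (1.0 - thresh / norm)
    return out

def group_lasso_ista(X, y, groups, lam, n_iter=500):
    """Proximal gradient (ISTA) for 0.5*||Xw - y||^2 + lam * sum_g ||w_g||_2."""
    step = 1.0 / np.linalg.norm(X, 2) ** 2   # 1 / Lipschitz constant of the gradient
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)                              # gradient step
        w = group_soft_threshold(w - step * grad, groups, step * lam)  # prox step
    return w

# toy usage: 3 disjoint groups of 4 features, only the middle group is active
rng = np.random.default_rng(2)
X = rng.standard_normal((100, 12))
w_true = np.zeros(12)
w_true[4:8] = [1.0, -1.0, 0.5, 2.0]
y = X @ w_true + 0.01 * rng.standard_normal(100)
groups = [slice(0, 4), slice(4, 8), slice(8, 12)]
w_hat = group_lasso_ista(X, y, groups, lam=1.0)
```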

    Graph Signal Processing: Overview, Challenges and Applications

    Research in Graph Signal Processing (GSP) aims to develop tools for processing data defined on irregular graph domains. In this paper we first provide an overview of core ideas in GSP and their connection to conventional digital signal processing. We then summarize recent progress in developing basic GSP tools, including methods for sampling, filtering, and graph learning. Next, we review progress in several application areas using GSP, including processing and analysis of sensor network data, biological data, and applications to image processing and machine learning. We finish by providing a brief historical perspective to highlight how concepts recently developed in GSP build on prior research in other areas. Comment: To appear in Proceedings of the IEEE.
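    To ground the "filtering" vocabulary used above: in GSP, the eigenvectors of the graph Laplacian play the role of a Fourier basis, and filtering reshapes a signal's graph spectrum. The NumPy sketch below, an ideal low-pass filter on a toy path graph with invented parameters, is meant only as an illustration of the terminology, not as a method from the paper.

```python
import numpy as np

def graph_lowpass(adjacency, signal, cutoff):
    """Filter a graph signal by keeping only the graph-Fourier components
    associated with Laplacian eigenvalues below `cutoff` (an ideal low-pass
    graph filter; low eigenvalues correspond to slow variation across edges)."""
    degrees = adjacency.sum(axis=1)
    laplacian = np.diag(degrees) - adjacency
    eigvals, eigvecs = np.linalg.eigh(laplacian)   # graph Fourier basis
    spectrum = eigvecs.T @ signal                  # graph Fourier transform
    spectrum[eigvals >= cutoff] = 0.0              # discard high "frequencies"
    return eigvecs @ spectrum                      # inverse transform

# toy usage: a 5-node path graph and a noisy, slowly varying vertex signal
A = np.zeros((5, 5))
for i in range(4):
    A[i, i + 1] = A[i + 1, i] = 1.0
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0]) + 0.3 * np.random.default_rng(3).standard_normal(5)
x_smooth = graph_lowpass(A, x, cutoff=1.0)
```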