
    End-to-End Kernel Learning with Supervised Convolutional Kernel Networks

    In this paper, we introduce a new image representation based on a multilayer kernel machine. Unlike traditional kernel methods where data representation is decoupled from the prediction task, we learn how to shape the kernel with supervision. We proceed by first proposing improvements of the recently-introduced convolutional kernel networks (CKNs) in the context of unsupervised learning; then, we derive backpropagation rules to take advantage of labeled training data. The resulting model is a new type of convolutional neural network, where optimizing the filters at each layer is equivalent to learning a linear subspace in a reproducing kernel Hilbert space (RKHS). We show that our method achieves reasonably competitive performance for image classification on some standard "deep learning" datasets such as CIFAR-10 and SVHN, and also for image super-resolution, demonstrating the applicability of our approach to a large variety of image-related tasks.
    Comment: to appear in Advances in Neural Information Processing Systems (NIPS)
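    To make the subspace-learning view concrete, here is a minimal, illustrative sketch (not the authors' code) of the kernel-approximation map that a single CKN-style layer applies to image patches: patches and filters are normalized, compared through a Gaussian-like kernel, and projected onto the subspace spanned by the filters. The filter matrix `W`, the bandwidth `alpha`, and the Nystrom-style projection are assumptions chosen for this toy example; in the supervised variant, `W` would be updated by backpropagating through this map.

```python
import numpy as np

def ckn_layer(patches, W, alpha=1.0, eps=1e-6):
    """One convolutional-kernel-network-style layer (illustrative sketch only).

    patches : (n, d) array of image patches (one patch per row).
    W       : (k, d) array of learned filters spanning a subspace of the RKHS.
    alpha   : bandwidth of the Gaussian-like kernel on the unit sphere.
    """
    # Normalize patches and filters to the unit sphere.
    X = patches / (np.linalg.norm(patches, axis=1, keepdims=True) + eps)
    Z = W / (np.linalg.norm(W, axis=1, keepdims=True) + eps)

    # Kernel evaluations between patches and filters, and among filters.
    K_xz = np.exp(alpha * (X @ Z.T - 1.0))   # (n, k)
    K_zz = np.exp(alpha * (Z @ Z.T - 1.0))   # (k, k)

    # Project onto the k-dimensional subspace spanned by the filters
    # (Nystrom-style approximation): features = K_xz @ K_zz^{-1/2}.
    evals, evecs = np.linalg.eigh(K_zz)
    inv_sqrt = evecs @ np.diag(1.0 / np.sqrt(np.maximum(evals, eps))) @ evecs.T
    return K_xz @ inv_sqrt

# Example usage with random data: 100 flattened 5x5 patches, 32 filters.
patches = np.random.randn(100, 25)
W = np.random.randn(32, 25)
features = ckn_layer(patches, W)   # (100, 32) subspace coordinates
```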

    Sparse Modeling for Image and Vision Processing

    In recent years, a large amount of multi-disciplinary research has been conducted on sparse models and their applications. In statistics and machine learning, the sparsity principle is used to perform model selection---that is, automatically selecting a simple model among a large collection of them. In signal processing, sparse coding consists of representing data with linear combinations of a few dictionary elements. Subsequently, the corresponding tools have been widely adopted by several scientific communities such as neuroscience, bioinformatics, or computer vision. The goal of this monograph is to offer a self-contained view of sparse modeling for visual recognition and image processing. More specifically, we focus on applications where the dictionary is learned and adapted to data, yielding a compact representation that has been successful in various contexts.
    Comment: 205 pages, to appear in Foundations and Trends in Computer Graphics and Vision
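    As a concrete illustration of sparse coding, the sketch below solves a standard Lasso-type problem, min_a 0.5*||x - D a||^2 + lam*||a||_1, with the iterative soft-thresholding algorithm (ISTA). The dictionary `D`, signal `x`, and parameter choices are placeholders for this example, not code or settings from the monograph.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (element-wise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sparse_code(x, D, lam=0.1, n_iter=200):
    """Encode signal x as a sparse combination of dictionary columns via ISTA.

    Solves  min_a  0.5 * ||x - D a||^2 + lam * ||a||_1.
    """
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ a - x)           # gradient of the smooth data-fit term
        a = soft_threshold(a - grad / L, lam / L)
    return a

# Example: encode a signal built from 5 atoms of a random overcomplete dictionary.
rng = np.random.default_rng(0)
D = rng.standard_normal((64, 128))
D /= np.linalg.norm(D, axis=0)             # unit-norm dictionary atoms
x = D[:, :5] @ rng.standard_normal(5)
a = sparse_code(x, D, lam=0.1)             # sparse code; most entries are zero
```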

    Generalized Forward-Backward Splitting with Penalization for Monotone Inclusion Problems

    We introduce a generalized forward-backward splitting method with a penalty term for solving monotone inclusion problems involving the sum of a finite number of maximally monotone operators and the normal cone to the (nonempty) set of zeros of another maximally monotone operator. We show weak ergodic convergence of the generated sequence of iterates to a solution of the considered monotone inclusion problem, provided that a condition formulated in terms of the Fitzpatrick function of the operator describing the set underlying the normal cone is fulfilled. Under strong monotonicity of one of the operators, we show strong convergence of the iterates. Furthermore, we apply the proposed method to a large-scale hierarchical minimization problem: minimizing the sum of a differentiable and a nondifferentiable convex function over the set of minimizers of another differentiable convex function. We illustrate the functionality of the method through numerical experiments on constrained elastic net and generalized Heron location problems.
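    To make the hierarchical (bilevel) setting concrete, the sketch below shows one common form of a forward-backward iteration with an exterior penalty: a gradient step on the smooth part plus a slowly growing multiple of the gradient of the lower-level objective, followed by the proximal step of the nonsmooth part. The elastic-net-style choice of functions, the step size, and the penalty schedule beta_k are illustrative assumptions, not the exact scheme or parameter conditions analyzed in the paper.

```python
import numpy as np

def prox_l1(v, t):
    """Proximal map of t * ||.||_1 (soft-thresholding), used as the nonsmooth part g."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def penalized_forward_backward(grad_f, prox_g, grad_h, x0,
                               step=0.001, beta0=0.5, n_iter=1000):
    """Forward-backward splitting with an exterior penalty term (sketch).

    Iterates  x_{k+1} = prox_{step*g}( x_k - step * (grad_f(x_k) + beta_k * grad_h(x_k)) ),
    where h penalizes the lower-level constraint and beta_k grows slowly.
    """
    x = x0.copy()
    for k in range(1, n_iter + 1):
        beta_k = beta0 * np.log(k + 1)          # slowly increasing penalty parameter
        x = prox_g(x - step * (grad_f(x) + beta_k * grad_h(x)), step)
    return x

# Illustrative elastic-net-style instance (assumed data and shapes):
# minimize ||x||_1 + 0.5*||x||^2 over (approximate) minimizers of 0.5*||Ax - b||^2.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((30, 50)), rng.standard_normal(30)
grad_f = lambda x: x                            # gradient of the ridge term 0.5*||x||^2
grad_h = lambda x: A.T @ (A @ x - b)            # gradient of the lower-level objective
x_star = penalized_forward_backward(grad_f, prox_l1, grad_h, x0=np.zeros(50))
```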