
    Sparse Modeling for Image and Vision Processing

    In recent years, a large amount of multi-disciplinary research has been conducted on sparse models and their applications. In statistics and machine learning, the sparsity principle is used to perform model selection---that is, automatically selecting a simple model among a large collection of them. In signal processing, sparse coding consists of representing data with linear combinations of a few dictionary elements. Subsequently, the corresponding tools have been widely adopted by several scientific communities such as neuroscience, bioinformatics, or computer vision. The goal of this monograph is to offer a self-contained view of sparse modeling for visual recognition and image processing. More specifically, we focus on applications where the dictionary is learned and adapted to data, yielding a compact representation that has been successful in various contexts.
    Comment: 205 pages, to appear in Foundations and Trends in Computer Graphics and Vision
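    As a rough, self-contained illustration of the sparse-coding idea described in this abstract (not the monograph's own implementation), the sketch below learns a dictionary from toy signals and encodes each signal as a combination of a few atoms; the dictionary size, sparsity level, and data are arbitrary choices.

```python
# Minimal sketch of sparse coding with a learned dictionary (illustrative only;
# parameter values are arbitrary, not taken from the monograph).
import numpy as np
from sklearn.decomposition import DictionaryLearning

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 64))          # 200 toy signals (e.g., flattened 8x8 patches)

dico = DictionaryLearning(
    n_components=32,                         # dictionary size (assumed)
    transform_algorithm="omp",               # sparse coding via orthogonal matching pursuit
    transform_n_nonzero_coefs=5,             # each signal uses at most 5 atoms
    random_state=0,
)
codes = dico.fit(X).transform(X)             # sparse codes, shape (200, 32)
X_hat = codes @ dico.components_             # reconstruction from a few dictionary atoms
print(np.mean((X - X_hat) ** 2))             # average reconstruction error
```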

    Self-Paced Learning: an Implicit Regularization Perspective

    Self-paced learning (SPL) mimics the cognitive mechanism of humans and animals, which gradually learn from easy to hard samples. One key issue in SPL is obtaining a better weighting strategy, which is determined by the minimizer function. Existing methods usually pursue this by artificially designing the explicit form of the SPL regularizer. In this paper, we focus on the minimizer function and study a group of new regularizers, named self-paced implicit regularizers, which are deduced from robust loss functions. Based on convex conjugacy theory, the minimizer function of a self-paced implicit regularizer can be learned directly from the latent loss function, even when the analytic form of the regularizer itself is not known. A general SPL framework, named SPL-IR, is developed accordingly. We demonstrate that the learning procedure of SPL-IR is associated with latent robust loss functions, which can provide some theoretical insight into its working mechanism. We further analyze the relation between SPL-IR and half-quadratic optimization. Finally, we apply SPL-IR to both supervised and unsupervised tasks, and the experimental results corroborate our ideas and demonstrate the correctness and effectiveness of the implicit regularizers.
    Comment: 12 pages, 3 figures
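    For context, the classical SPL weighting scheme that SPL-IR generalizes fits in a few lines. The sketch below implements the standard hard-threshold minimizer function (easy samples get weight 1, hard samples 0), not the implicit regularizers introduced in the paper; the pace parameter and growth factor are made-up values.

```python
# Classic SPL weighting step with the hard regularizer (illustrative only).
import numpy as np

def spl_hard_weights(losses: np.ndarray, lam: float) -> np.ndarray:
    """Minimizer function of the hard SPL regularizer:
    v_i = 1 if loss_i < lam ("easy" sample), else 0 (skipped for now)."""
    return (losses < lam).astype(float)

def spl_round(losses: np.ndarray, lam: float, growth: float = 1.3):
    """One alternating step: pick weights from the current losses, then enlarge
    lam so that harder samples are admitted in the next round."""
    v = spl_hard_weights(losses, lam)
    return v, lam * growth

losses = np.array([0.1, 0.5, 2.0, 0.05, 1.2])
v, lam = spl_round(losses, lam=0.6)
print(v)        # [1. 1. 0. 1. 0.] -> only easy samples are weighted in this round
```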

    Parametric Regression on the Grassmannian

    We address the problem of fitting parametric curves on the Grassmann manifold for the purpose of intrinsic parametric regression. As customary in the literature, we start from the energy-minimization formulation of linear least-squares in Euclidean spaces and generalize this concept to general nonflat Riemannian manifolds, following an optimal-control point of view. We then specialize this idea to the Grassmann manifold and demonstrate that it yields a simple, extensible, and easy-to-implement solution to the parametric regression problem. In fact, it allows us to extend the basic geodesic model to (1) a time-warped variant and (2) cubic splines. We demonstrate the utility of the proposed solution on different vision problems, such as shape regression as a function of age, traffic-speed estimation, and crowd counting from surveillance video clips. Most notably, these problems can be conveniently solved within the same framework without any specifically tailored steps along the processing pipeline.
    Comment: 14 pages, 11 figures
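    The basic geometric ingredient behind geodesic models on the Grassmannian is the exponential map. The sketch below evaluates a Grassmann geodesic with the standard SVD-based formula (Edelman-Arias-Smith); it is only this building block, not the authors' regression pipeline, and the toy point and tangent are arbitrary.

```python
# Grassmann exponential map: the geodesic used by intrinsic regression models.
import numpy as np

def grassmann_exp(Y: np.ndarray, H: np.ndarray, t: float) -> np.ndarray:
    """Point reached at time t along the geodesic from span(Y) with tangent H.
    Y is n x p with orthonormal columns; H is an n x p horizontal tangent (Y.T @ H = 0)."""
    U, s, Vt = np.linalg.svd(t * H, full_matrices=False)
    Yt = Y @ Vt.T @ np.diag(np.cos(s)) @ Vt + U @ np.diag(np.sin(s)) @ Vt
    Q, _ = np.linalg.qr(Yt)                   # re-orthonormalize to limit numerical drift
    return Q

# Toy usage: a random point on Gr(2, 5) and a random horizontal tangent at it.
rng = np.random.default_rng(0)
Y, _ = np.linalg.qr(rng.standard_normal((5, 2)))
A = rng.standard_normal((5, 2))
H = A - Y @ (Y.T @ A)                         # project onto the horizontal space at Y
print(grassmann_exp(Y, H, 0.5).shape)         # (5, 2): an orthonormal basis of the new subspace
```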

    Augmented Sparse Reconstruction of Protein Signaling Networks

    The problem of reconstructing and identifying intracellular protein signaling and biochemical networks is of critical importance in biology today. We sought to develop a mathematical approach to this problem using, as a test case, one of the most well-studied and clinically important signaling networks: the epidermal growth factor receptor (EGFR) driven signaling cascade. More specifically, we suggest a method, augmented sparse reconstruction, for the identification of links among nodes of ordinary differential equation (ODE) networks from a small set of trajectories with different initial conditions. Our method builds a system of representation by using a collection of integrals of all given trajectories and by attenuating blocks of terms in the representation itself. The system of representation is then augmented with random vectors, and minimization of the 1-norm is used to find sparse representations for the dynamical interactions of each node. Augmentation by random vectors is crucial, since sparsity alone is not able to handle the large errors-in-variables in the representation. Augmented sparse reconstruction allows us to consider potentially very large spaces of models and is able to detect with high accuracy the few relevant links among nodes, even when moderate noise is added to the measured trajectories. After showing the performance of our method on a model of the EGFR protein network, we briefly sketch potential future therapeutic applications of this approach.
    Comment: 24 pages, 6 figures
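    The core idea of identifying links by 1-norm minimization over integrated trajectories can be mimicked with an off-the-shelf LASSO solver, as in the toy sketch below; the paper's augmentation with random vectors and block attenuation is not reproduced here, and the three-node system and regularization strength are invented for illustration.

```python
# Toy version of sparse link identification from trajectories: regress a node's
# increment on the running integrals of all trajectories and keep the few
# nonzero coefficients (plain LASSO, not the paper's augmented formulation).
import numpy as np
from sklearn.linear_model import Lasso

t = np.linspace(0.0, 1.0, 200)
X = np.stack([np.sin(3 * t), np.cos(2 * t), t ** 2], axis=1)   # toy trajectories of 3 nodes

# Suppose the target node obeys dx/dt = 2*x1 - 1*x3 (node 2 is irrelevant), so that
# x(t) - x(0) = 2*int(x1) - 1*int(x3); build the integral features accordingly.
dt = t[1] - t[0]
features = np.cumsum(X, axis=0) * dt                           # running integrals of each node
y = 2.0 * features[:, 0] - 1.0 * features[:, 2]

links = Lasso(alpha=1e-3, fit_intercept=False).fit(features, y).coef_
print(np.round(links, 2))       # roughly [ 2.  0. -1.]: the sparse links are recovered
```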

    Why do ultrasoft repulsive particles cluster and crystallize? Analytical results from density functional theory

    We demonstrate the accuracy of the hypernetted-chain closure and of the mean-field approximation for the calculation of the fluid-state properties of systems interacting by means of bounded and positive-definite pair potentials with oscillating Fourier transforms. Subsequently, we prove the validity of a bilinear, random-phase density functional for arbitrary inhomogeneous phases of the same systems. On the basis of this functional, we calculate analytically the freezing parameters of the latter. We demonstrate explicitly that the stable crystals feature a lattice constant that is independent of density and whose value is dictated by the position of the negative minimum of the Fourier transform of the pair potential. This property is equivalent to the existence of clusters, whose population scales proportionally to the density. We establish that, regardless of the form of the interaction potential and of the location on the freezing line, all cluster crystals have a universal Lindemann ratio L = 0.189 at freezing. We further make an explicit link between the aforementioned density functional and the harmonic theory of crystals. This allows us to establish an equivalence between the emergence of clusters and the existence of negative Fourier components of the interaction potential. Finally, we make a connection between the class of models at hand and the system of infinite-dimensional hard spheres, when the limits of interaction steepness and space dimension are both taken to infinity in a particular, prescribed fashion.
    Comment: 19 pages, 5 figures; submitted to J. Chem. Phys.; new version: minor changes in the structure of the paper
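    For reference, a common form of the mean-field (random-phase) free-energy functional used for bounded, ultrasoft pair potentials is sketched below in generic notation; the exact functional and conventions used in the paper may differ.

```latex
% Mean-field (random-phase) Helmholtz free-energy functional for a bounded pair
% potential v(r); \Lambda denotes the thermal de Broglie wavelength. Clustering and
% the density-independent lattice constant are governed by the negative minimum of
% the Fourier transform \tilde v(k). Notation is generic, not taken from the paper.
\begin{equation}
  F[\rho] \;=\; k_{\mathrm{B}} T \int \mathrm{d}\mathbf{r}\,
      \rho(\mathbf{r})\left[\ln\!\left(\rho(\mathbf{r})\Lambda^{3}\right) - 1\right]
  \;+\; \frac{1}{2} \iint \mathrm{d}\mathbf{r}\,\mathrm{d}\mathbf{r}'\,
      \rho(\mathbf{r})\,\rho(\mathbf{r}')\, v\!\left(\lvert \mathbf{r}-\mathbf{r}' \rvert\right).
\end{equation}
```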

    Socially Constrained Structural Learning for Groups Detection in Crowd

    Modern crowd theories agree that collective behavior is the result of the underlying interactions among small groups of individuals. In this work, we propose a novel algorithm for detecting social groups in crowds by means of a Correlation Clustering procedure on people's trajectories. The affinity between crowd members is learned through an online formulation of the Structural SVM framework and a set of specifically designed features characterizing both their physical and social identity, inspired by proxemic theory, Granger causality, dynamic time warping (DTW), and heat maps. To adhere to sociological observations, we introduce a loss function (G-MITRE) able to deal with the complexity of evaluating group detection performance. We show that our algorithm achieves state-of-the-art results when relying both on ground-truth trajectories and on tracklets previously extracted by available detector/tracker systems.
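    Dynamic time warping, one of the pairwise trajectory cues mentioned above, can be computed with the textbook recursion sketched below; this is not the authors' exact feature design, and the two toy trajectories are invented.

```python
# Minimal dynamic time warping (DTW) distance between two 2-D trajectories
# (textbook O(n*m) recursion, for illustration only).
import numpy as np

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """a, b: arrays of shape (len, 2) holding (x, y) positions over time."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])      # local matching cost
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])

# Two people walking side by side should score a small DTW distance.
t = np.linspace(0, 1, 50)
p1 = np.stack([t, 0.0 * t], axis=1)
p2 = np.stack([t, 0.3 + 0.0 * t], axis=1)
print(dtw_distance(p1, p2))   # ~15.0: 50 matched points, each 0.3 apart
```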