
    From primal sketches to the recovery of intensity and reflectance representations

    A local change in intensity (an edge) is a characteristic that is preserved when an image is passed through a bandpass filter. Primal sketch representations of images, built from the bandpass-filtered data, have become a common processing step since Marr proposed his model of early human vision. Here, researchers move beyond primal sketch extraction to the recovery of intensity and reflectance representations using only the bandpass-filtered data. Analyzing the response of an ideal step edge to the Laplacian-of-Gaussian (∇²G) filter, they found that the filtered data preserves the original change of intensity that created the edge, in addition to the edge location. Using the filtered data, they can construct the primal sketches and recover the original (relative) intensity levels between the boundaries. It was found that filtering an ideal step edge with the Intensity-Dependent Spatial Summation (IDS) filter preserves the actual intensity on both sides of the edge, in addition to the edge location. The IDS filter also preserves the reflectance ratio at the edge location. Therefore, one can recover the intensity levels between the edge boundaries as well as the (relative) reflectance representation. The recovery of the reflectance representation is of special interest, as it removes shadowing degradations and other dependencies on temporal illumination. This method offers a new approach to low-level vision processing as well as to high data-compression coding. High compression can be gained by transmitting only the information associated with the edge locations (edge primitives) that is necessary for the recovery.
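The first claim above can be illustrated with a minimal pure-Python sketch (not the authors' code): the ∇²G response of an ideal step edge changes sign at the edge, and its peak-to-peak amplitude scales linearly with the intensity change, so the relative intensity step is recoverable from the filtered data alone. The kernel width, signal, and step heights are illustrative assumptions.

```python
import math

def log_kernel(sigma, radius):
    # Discrete 1-D Laplacian of Gaussian: second derivative of a Gaussian.
    ks = [(x * x - sigma * sigma) / sigma ** 4 * math.exp(-x * x / (2 * sigma * sigma))
          for x in range(-radius, radius + 1)]
    mean = sum(ks) / len(ks)
    return [k - mean for k in ks]   # zero-sum kernel: flat regions give zero response

def convolve(signal, kernel):
    r = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, k in enumerate(kernel):
            idx = min(max(i + j - r, 0), len(signal) - 1)  # replicate borders
            acc += k * signal[idx]
        out.append(acc)
    return out

kern = log_kernel(sigma=2.0, radius=8)

# Ideal step edge: intensity jumps from 10 to 30 between samples 49 and 50.
step = [10.0] * 50 + [30.0] * 50
resp = convolve(step, kern)

# The response changes sign exactly at the edge (the zero crossing marks its location)...
edge_crossing = resp[49] * resp[50] < 0

# ...and its peak-to-peak amplitude is proportional to the intensity change,
# so the original (relative) step height can be read back from the response.
resp2 = convolve([10.0] * 50 + [50.0] * 50, kern)   # same edge, step twice as high
ratio = (max(resp2) - min(resp2)) / (max(resp) - min(resp))
```

Because the kernel sums to zero, constant offsets vanish and the response is linear in the step height, which is what makes the intensity recovery between edge boundaries possible.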

    Curse of dimensionality reduction in max-plus based approximation methods: theoretical estimates and improved pruning algorithms

    Max-plus based methods have recently been developed to approximate the value function of possibly high-dimensional optimal control problems. A critical step of these methods consists in approximating a function by a supremum of a small number of functions (max-plus "basis functions") taken from a prescribed dictionary. We study several variants of this approximation problem, which we show to be continuous versions of the facility location and k-center combinatorial optimization problems, in which the connection costs arise from a Bregman distance. We give theoretical error estimates, quantifying the number of basis functions needed to reach a prescribed accuracy. We derive from our approach a refinement of the curse-of-dimensionality-free method introduced previously by McEneaney, with a higher accuracy for a comparable computational cost. Comment: 8 pages, 5 figures

    Structured Sparsity: Discrete and Convex approaches

    Compressive sensing (CS) exploits sparsity to recover sparse or compressible signals from dimensionality-reducing, non-adaptive sensing mechanisms. Sparsity is also used to enhance interpretability in machine learning and statistics applications: while the ambient dimension is vast in modern data analysis problems, the relevant information typically resides in a much lower-dimensional space. However, many solutions proposed to date do not leverage the true underlying structure. Recent results in CS extend the simple sparsity idea to more sophisticated {\em structured} sparsity models, which describe the interdependency between the nonzero components of a signal, increasing the interpretability of the results and leading to better recovery performance. To better understand the impact of structured sparsity, in this chapter we analyze the connections between the discrete models and their convex relaxations, highlighting their relative advantages. We start with the general group-sparse model and then elaborate on two important special cases: the dispersive and the hierarchical models. For each, we present the models in their discrete nature, discuss how to solve the ensuing discrete problems, and then describe convex relaxations. We also consider more general structures defined by set functions and present their convex proxies. Further, we discuss efficient optimization solutions for structured sparsity problems and illustrate structured sparsity in action via three applications. Comment: 30 pages, 18 figures