
    On Nonrigid Shape Similarity and Correspondence

    An important operation in geometry processing is finding the correspondences between pairs of shapes. The Gromov-Hausdorff distance, a measure of dissimilarity between metric spaces, has been found to be highly useful for nonrigid shape comparison. Here, we explore the applicability of related shape similarity measures to the problem of shape correspondence, adopting spectral-type distances. We propose to evaluate the spectral kernel distance, the spectral embedding distance, and the novel spectral quasi-conformal distance, comparing the manifolds from different viewpoints. By matching the shapes in the spectral domain, important attributes of surface structure are aligned. To test our ideas, we introduce a fully automatic framework for finding intrinsic correspondence between two shapes. The proposed method achieves state-of-the-art results on the Princeton isometric shape matching protocol applied, as usual, to the TOSCA and SCAPE benchmarks.
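The Gromov-Hausdorff distance mentioned above can be made concrete on tiny examples. The sketch below (not the paper's spectral method; the function names are my own) measures the metric distortion of a candidate correspondence and minimizes it by brute force over all bijections, which is exactly the quantity the GH distance infimizes over, up to a factor of one half:

```python
from itertools import permutations

def metric(points, d):
    """Build a distance dict for a space where all distinct points are d apart."""
    return {(a, b): (0.0 if a == b else d) for a in points for b in points}

def distortion(dX, dY, mapping):
    """max |d_X(a,b) - d_Y(f(a),f(b))| over all pairs, for f = mapping."""
    pts = list(mapping)
    return max(abs(dX[(a, b)] - dY[(mapping[a], mapping[b])])
               for a in pts for b in pts)

def gromov_hausdorff_bijective(X, Y, dX, dY):
    """1/2 * minimum distortion over all bijections X -> Y (tiny spaces only)."""
    best = float("inf")
    for perm in permutations(Y):
        best = min(best, distortion(dX, dY, dict(zip(X, perm))))
    return best / 2

# Two equilateral "triangles" with side lengths 1 and 2: every bijection has
# distortion 1, so the bijective GH distance is 0.5.
X, Y = (0, 1, 2), ("a", "b", "c")
d = gromov_hausdorff_bijective(X, Y, metric(X, 1.0), metric(Y, 2.0))
```

Real shapes have far too many points for this brute force; that intractability is precisely why spectral relaxations like those in the paper are attractive.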

    Weighted Mean Curvature

    In image processing tasks, spatial priors are essential for robust computations, regularization, algorithmic design, and Bayesian inference. In this paper, we introduce weighted mean curvature (WMC) as a novel image prior and present an efficient computation scheme for its discretization in practical image processing applications. We first demonstrate the favorable properties of WMC, such as sampling invariance, scale invariance, and contrast invariance under a Gaussian noise model, and we show the relation of WMC to area regularization. We further propose an efficient computation scheme for discretized WMC, which is demonstrated herein to process over 33.2 giga-pixels/second on a GPU. This scheme lends itself to a convolutional neural network representation. Finally, WMC is evaluated on synthetic and real images, showing its quantitative superiority over total variation and mean curvature.
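The claim that the discretization lends itself to a convolutional representation can be illustrated generically. The sketch below is not the paper's WMC scheme; it applies a standard 3x3 Laplacian stencil, which approximates mean curvature of the image graph in the small-gradient regime, via a naive "same"-padded convolution of the kind a CNN layer would compute:

```python
def conv2d_same(img, kernel):
    """Naive 2D convolution with zero padding, output same size as input."""
    H, W = len(img), len(img[0])
    k = len(kernel) // 2
    out = [[0.0] * W for _ in range(H)]
    for i in range(H):
        for j in range(W):
            s = 0.0
            for di in range(-k, k + 1):
                for dj in range(-k, k + 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < H and 0 <= jj < W:
                        s += kernel[di + k][dj + k] * img[ii][jj]
            out[i][j] = s
    return out

# 5-point Laplacian stencil: zero response on locally flat regions.
LAPLACIAN = [[0.0, 1.0, 0.0],
             [1.0, -4.0, 1.0],
             [0.0, 1.0, 0.0]]

flat = [[1.0] * 5 for _ in range(5)]   # constant image
resp = conv2d_same(flat, LAPLACIAN)    # interior response is exactly zero
```

Because the whole operation is a fixed-kernel convolution, it maps directly onto one convolutional layer, which is presumably what enables the GPU throughput figure quoted above.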

    The Schröder functional equation and its relation to the invariant measures of chaotic maps

    The aim of this paper is to show that the invariant measure for a class of one-dimensional chaotic maps, T(x), is an extended solution of the Schröder functional equation, q(T(x)) = λq(x), induced by them. Hence, we give a unified treatment of a collection of exactly solved examples worked out in the current literature. In particular, we show that these examples belong to a class of functions introduced by Mira (see text). Moreover, as a new example, we compute the invariant densities for a class of rational maps having the Weierstrass ℘ function as an invariant one. Also, we study the relation between that equation and the well-known Frobenius-Perron and Koopman operators.
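The Frobenius-Perron connection mentioned above can be checked on the classic textbook example (not one of the paper's new rational-map examples): the logistic map T(x) = 4x(1-x) has invariant density ρ(x) = 1/(π√(x(1-x))), and invariance means ρ is a fixed point of the Frobenius-Perron operator, (Pρ)(x) = Σ_{y: T(y)=x} ρ(y)/|T'(y)|:

```python
import math

def T(x):
    """Logistic map at the chaotic parameter value."""
    return 4 * x * (1 - x)

def rho(x):
    """Candidate invariant density for T."""
    return 1.0 / (math.pi * math.sqrt(x * (1 - x)))

def perron_frobenius(x):
    """(P rho)(x): sum rho(y)/|T'(y)| over the two preimages y of x,
    y = (1 ± sqrt(1 - x)) / 2, with T'(y) = 4 - 8y."""
    r = math.sqrt(1 - x)
    return sum(rho(y) / abs(4 - 8 * y)
               for y in ((1 - r) / 2, (1 + r) / 2))

# Fixed-point check at an interior point: (P rho)(x) == rho(x).
x0 = 0.3
lhs, rhs = perron_frobenius(x0), rho(x0)
```

Each preimage contributes exactly ρ(x)/2, so the identity holds pointwise, not merely numerically; this is the simplest instance of the exactly solvable family the abstract refers to.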

    Optimal Kullback-Leibler Aggregation via Information Bottleneck

    In this paper, we present a method for reducing a regular, discrete-time Markov chain (DTMC) to another DTMC with a given, typically much smaller, number of states. The cost of reduction is defined as the Kullback-Leibler divergence rate between a projection of the original process through a partition function and a DTMC on the correspondingly partitioned state space. Finding the reduced model with minimal cost is computationally expensive, as it requires an exhaustive search among all state-space partitions and an exact evaluation of the reduction cost for each candidate partition. Our approach deals with the latter problem by minimizing an upper bound on the reduction cost instead of the exact cost; the proposed upper bound is easy to compute, and it is tight if the original chain is lumpable with respect to the partition. Then, we express the problem in the form of information bottleneck optimization and propose using the agglomerative information bottleneck algorithm to search for a sub-optimal partition greedily rather than exhaustively. The theory is illustrated with examples and an application scenario in the context of modeling bio-molecular interactions.
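The basic aggregation step can be sketched as follows (a minimal illustration, not the paper's algorithm: the KL-rate cost and the agglomerative search are omitted). Given a transition matrix and a partition, the reduced chain is obtained by aggregating transition mass between blocks, weighting each row by the stationary distribution; when the chain is lumpable with respect to the partition, this reduced chain reproduces the projected process exactly, which is the regime where the abstract's upper bound is tight:

```python
def stationary(P, iters=1000):
    """Stationary distribution of transition matrix P by power iteration."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def lump(P, partition):
    """Aggregate P onto the blocks of `partition` (a list of index lists),
    weighting each original state's row by its stationary probability."""
    pi = stationary(P)
    m = len(partition)
    Q = [[0.0] * m for _ in range(m)]
    for a, A in enumerate(partition):
        wA = sum(pi[i] for i in A)
        for b, B in enumerate(partition):
            Q[a][b] = sum(pi[i] * P[i][j] for i in A for j in B) / wA
    return Q

# A 3-state chain that is lumpable w.r.t. {0} vs {1,2}: every state in a block
# has the same total transition probability into each block.
P = [[0.50, 0.25, 0.25],
     [0.25, 0.50, 0.25],
     [0.25, 0.25, 0.50]]
Q = lump(P, [[0], [1, 2]])
```

Here the reduced chain is [[0.5, 0.5], [0.25, 0.75]]; for non-lumpable chains the same construction still yields a valid DTMC, but a KL-rate cost relative to the projected process remains.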

    Spectral Generalized Multi-Dimensional Scaling

    Multidimensional scaling (MDS) is a family of methods that embed a given set of points into a simple, usually flat, domain. The points are assumed to be sampled from some metric space, and the mapping attempts to preserve the distances between each pair of points in the set. Distances in the target space can be computed analytically in this setting. Generalized MDS is an extension that allows mapping one metric space into another, that is, multidimensional scaling into target spaces in which distances are evaluated numerically rather than analytically. Here, we propose an efficient approach for computing such mappings between surfaces based on their natural spectral decomposition, where the surfaces are treated as sampled metric spaces. The resulting spectral-GMDS procedure enables efficient embedding by implicitly incorporating smoothness of the mapping into the problem, thereby substantially reducing the complexity involved in its solution while practically overcoming its non-convex nature. The method is compared to existing techniques that compute dense correspondence between shapes. Numerical experiments demonstrate the efficiency and accuracy of the proposed method compared to state-of-the-art approaches.
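The distance-preservation objective at the heart of all MDS variants can be shown in miniature. The sketch below is a plain stress-minimizing 1-D MDS by gradient descent, far simpler than the spectral-GMDS procedure described above (no spectral basis, no surface metric), but it optimizes the same kind of criterion: squared mismatch between embedded and target distances:

```python
def stress(x, D):
    """Sum of squared differences between 1-D embedding distances
    |x[i]-x[j]| and target distances D[i][j]."""
    n = len(x)
    return sum((abs(x[i] - x[j]) - D[i][j]) ** 2
               for i in range(n) for j in range(i + 1, n))

def mds_1d(D, steps=2000, lr=0.01):
    """Minimize the stress of a 1-D embedding by gradient descent."""
    n = len(D)
    x = [i * 0.1 for i in range(n)]          # deterministic initial guess
    for _ in range(steps):
        g = [0.0] * n
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                diff = x[i] - x[j]
                dist = abs(diff) or 1e-9     # avoid dividing by zero
                g[i] += 2 * (dist - D[i][j]) * (diff / dist)
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

# Three points on a line at positions 0, 1, 3: the metric is exactly
# realizable in 1-D, so the stress can be driven to (near) zero.
D = [[0.0, 1.0, 3.0],
     [1.0, 0.0, 2.0],
     [3.0, 2.0, 0.0]]
x = mds_1d(D)
```

The non-convexity the abstract mentions is visible even here: the objective is invariant to translation and reflection, and for non-realizable metrics gradient descent can stall in poor local minima, which motivates the smoothness-constrained spectral formulation.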