Monotone properties of random geometric graphs have sharp thresholds
Random geometric graphs result from taking n uniformly distributed points
in the unit cube, [0,1]^d, and connecting two points if their Euclidean
distance is at most r, for some prescribed r. We show that monotone
properties for this class of graphs have sharp thresholds by reducing the
problem to bounding the bottleneck matching on two sets of n points
distributed uniformly in [0,1]^d. We present upper bounds on the threshold
width, and show that our bound is sharp for d = 1 and at most a sublogarithmic
factor away for d >= 2. Interestingly, the threshold width is much sharper for
random geometric graphs than for Bernoulli random graphs. Further, a random
geometric graph is shown to be a subgraph, with high probability, of another
independently drawn random geometric graph with a slightly larger radius; this
property is shown to have no analogue for Bernoulli random graphs.
Comment: Published at http://dx.doi.org/10.1214/105051605000000575 in the
Annals of Applied Probability (http://www.imstat.org/aap/) by the Institute
of Mathematical Statistics (http://www.imstat.org)
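To make the construction concrete, here is a minimal sketch of a random geometric graph in the unit cube; the helper name `random_geometric_graph` and all parameter values are illustrative, not taken from the paper. Reusing the same seed makes the point set identical, so growing the radius can only add edges, which mirrors (in a simplified shared-point-set form) the subgraph property the abstract describes for independently drawn graphs.

```python
import math
import random
from itertools import combinations

def random_geometric_graph(n, r, d=2, seed=0):
    """Sample n uniform points in the unit cube [0,1]^d and connect
    every pair whose Euclidean distance is at most r."""
    rng = random.Random(seed)
    points = [tuple(rng.random() for _ in range(d)) for _ in range(n)]
    edges = {
        (i, j)
        for i, j in combinations(range(n), 2)
        if math.dist(points[i], points[j]) <= r
    }
    return points, edges

pts, e_small = random_geometric_graph(50, 0.20)
_, e_large = random_geometric_graph(50, 0.25)  # same seed => same points
assert e_small <= e_large  # every edge survives at the larger radius
```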
Maximum Scatter TSP in Doubling Metrics
We study the problem of finding a tour of n points in which every edge is
long. More precisely, we wish to find a tour that visits every point exactly
once, maximizing the length of the shortest edge in the tour. The problem is
known as Maximum Scatter TSP, and was introduced by Arkin et al. (SODA 1997),
motivated by applications in manufacturing and medical imaging. Arkin et al.
gave a 0.5-approximation for the metric version of the problem and showed
that this is the best possible ratio achievable in polynomial time (assuming
P != NP). Arkin et al. raised the question of whether a better approximation
ratio can be obtained in the Euclidean plane.
We answer this question in the affirmative in a more general setting, by
giving a (1 - ε)-approximation algorithm for d-dimensional doubling
metrics, with running time Õ(n^3) + 2^{O(K log K)}, where K <= (13/ε)^d. As a corollary we obtain (i) an
efficient polynomial-time approximation scheme (EPTAS) for all constant
dimensions d, (ii) a polynomial-time approximation scheme (PTAS) for
dimension d = log log n / c, for a sufficiently large constant c, and (iii)
a PTAS for constant d and ε = Ω(1/log log n). Furthermore, we
show the dependence on d in our approximation scheme to be essentially
optimal, unless Satisfiability can be solved in subexponential time.
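The objective being maximized is easy to state in code. The sketch below, with hypothetical helper names, computes the Maximum Scatter TSP objective (the length of the shortest edge of a closed tour) and solves tiny instances exactly by enumeration; this brute force is only for illustration and is unrelated to the paper's approximation algorithm.

```python
import math
from itertools import permutations

def scatter(tour, points):
    """Length of the shortest edge in a closed tour -- the quantity
    Maximum Scatter TSP asks to maximize."""
    return min(
        math.dist(points[tour[i]], points[tour[(i + 1) % len(tour)]])
        for i in range(len(tour))
    )

def max_scatter_tour_bruteforce(points):
    """Exact solution by enumerating all tours; feasible only for a
    handful of points.  Point 0 is fixed first, since a closed tour is
    invariant under rotation of its vertex sequence."""
    n = len(points)
    return max(
        ((0,) + perm for perm in permutations(range(1, n))),
        key=lambda t: scatter(t, points),
    )

pts = [(0.0,), (1.0,), (2.0,), (3.0,), (4.0,)]  # five collinear points
best = max_scatter_tour_bruteforce(pts)
# the optimal scatter here is 2, achieved e.g. by the tour 0-2-4-1-3-0
```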
Mismatch and resolution in compressive imaging
Highly coherent sensing matrices arise in discretization of continuum
problems such as radar and medical imaging when the grid spacing is below the
Rayleigh threshold as well as in using highly coherent, redundant dictionaries
as sparsifying operators. Algorithms (BOMP, BLOOMP) based on techniques of band
exclusion and local optimization are proposed to enhance Orthogonal Matching
Pursuit (OMP) and deal with such coherent sensing matrices. BOMP and BLOOMP
have provable performance guarantees for reconstructing sparse, widely
separated objects independent of the redundancy, and have a sparsity
constraint and computational cost similar to OMP's. A numerical study
demonstrates the effectiveness of BLOOMP for compressed sensing with highly
coherent, redundant sensing matrices.
Comment: Figure 5 revised
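The band-exclusion idea can be illustrated with a simplified greedy pursuit; this is a matching-pursuit sketch of the principle, not the published BOMP/BLOOMP algorithms, and the helper names are hypothetical. Once an atom is selected, atoms whose indices fall within a band around it are barred, so two nearly identical (highly coherent) neighbouring atoms cannot both enter the support.

```python
def dot(u, v):
    """Inner product of two equal-length vectors."""
    return sum(a * b for a, b in zip(u, v))

def band_excluded_mp(y, atoms, sparsity, band=1):
    """Greedy pursuit with band exclusion.  Atoms are assumed to be
    unit-norm lists; `band` is the exclusion radius around each
    selected index."""
    residual = list(y)
    support = []
    for _ in range(sparsity):
        candidates = [k for k in range(len(atoms))
                      if all(abs(k - s) > band for s in support)]
        if not candidates:
            break
        # pick the admissible atom best correlated with the residual
        k = max(candidates, key=lambda j: abs(dot(residual, atoms[j])))
        support.append(k)
        c = dot(residual, atoms[k])
        residual = [r - c * a for r, a in zip(residual, atoms[k])]
    return sorted(support)
```

On a toy dictionary of overlapping two-entry bumps (adjacent atoms have coherence 0.5), a signal built from two well-separated atoms is recovered exactly, while plain matching pursuit could waste picks on the coherent neighbours.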
Sketching Persistence Diagrams
Given a persistence diagram with n points, we give an algorithm that produces a sequence of n persistence diagrams converging in bottleneck distance to the input diagram, the ith of which has i distinct (weighted) points and is a 2-approximation to the closest persistence diagram with that many distinct points. For each approximation, we precompute the optimal matching between the ith and the (i+1)st. Perhaps surprisingly, the entire sequence of diagrams as well as the sequence of matchings can be represented in O(n) space. The main approach is to use a variation of the greedy permutation of the persistence diagram to give good Hausdorff approximations and assign weights to these subsets. We give a new algorithm to efficiently compute this permutation, despite the high implicit dimension of points in a persistence diagram due to the effect of the diagonal. The sketches are also structured to permit fast (linear time) approximations to the Hausdorff distance between diagrams - a lower bound on the bottleneck distance. For approximating the bottleneck distance, sketches can also be used to compute a linear-size neighborhood graph directly, obviating the need for geometric data structures used in state-of-the-art methods for bottleneck computation.
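The greedy permutation with the diagonal's effect can be sketched as farthest-point sampling in which every point starts out "covered" by the diagonal at cost (death - birth)/2, its l-infinity distance to the diagonal. This is an illustrative sketch of the idea under that assumption, not the paper's efficient algorithm, and the function names are hypothetical.

```python
def linf(p, q):
    """l-infinity distance between two diagram points (birth, death)."""
    return max(abs(p[0] - q[0]), abs(p[1] - q[1]))

def greedy_permutation(diagram):
    """Farthest-point (greedy) ordering of a persistence diagram.
    A point's distance to the current sketch is the minimum of its
    l-infinity distance to the already-chosen points and its distance
    (death - birth)/2 to the diagonal, so low-persistence points are
    covered early and picked last."""
    n = len(diagram)
    dist = [(d - b) / 2 for b, d in diagram]  # initial cover: diagonal
    order = []
    for _ in range(n):
        i = max(range(n), key=lambda j: dist[j])  # farthest point
        order.append(i)
        dist = [min(dist[j], linf(diagram[j], diagram[i])) for j in range(n)]
    return order
```

Prefixes of the returned order give the Hausdorff-style approximations from which the weighted sketches are built.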
Discrete geometric shapes: matching, interpolation, and approximation: a survey
In this survey we consider geometric techniques which have been used to
measure the similarity or distance between shapes, as well as to approximate
shapes, or interpolate between shapes. Shape is a modality which plays a key
role in many disciplines, ranging from computer vision to molecular biology.
We focus on algorithmic techniques based on computational geometry that have
been developed for shape matching, simplification, and morphing.
Compression for Smooth Shape Analysis
Most 3D shape analysis methods use triangular meshes to discretize both the
shape and functions on it as piecewise linear functions. With this
representation, shape analysis requires fine meshes to represent smooth shapes
accurately and to compute geometric operators like normals, curvatures, or
Laplace-Beltrami eigenfunctions, at large computational and memory cost.
We avoid this bottleneck with a compression technique that represents a
smooth shape as subdivision surfaces and exploits the subdivision scheme to
parametrize smooth functions on that shape with a few control parameters. This
compression does not affect the accuracy of the Laplace-Beltrami operator and
its eigenfunctions, and allows us to compute shape descriptors and shape
matchings at an accuracy comparable to triangular meshes but at a fraction of
the computational cost.
Our framework can also compress surfaces represented by point clouds to do
shape analysis of 3D scanning data.
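The core idea that a few control parameters determine a smooth limit shape can be shown with a one-dimensional analogue of a subdivision surface; Chaikin's corner-cutting scheme below refines a closed control polygon toward a smooth quadratic B-spline curve. This is a simplified curve sketch, not the surface subdivision scheme the paper uses.

```python
def chaikin(points, rounds=3):
    """Chaikin corner cutting on a closed control polygon: each edge
    (p, q) is replaced by the two points 3/4 p + 1/4 q and
    1/4 p + 3/4 q, doubling the point count per round and converging
    to a smooth curve controlled entirely by the original polygon."""
    for _ in range(rounds):
        refined = []
        n = len(points)
        for i in range(n):
            p, q = points[i], points[(i + 1) % n]
            refined.append(tuple(0.75 * a + 0.25 * b for a, b in zip(p, q)))
            refined.append(tuple(0.25 * a + 0.75 * b for a, b in zip(p, q)))
        points = refined
    return points

square = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
curve = chaikin(square)  # 4 control parameters per coordinate, 32 samples
```

Analysis pipelines can then work with the four control points while evaluating the refined, smooth curve only where needed, the same trade-off the abstract exploits for subdivision surfaces.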