
    Progressive Simplification of Polygonal Curves

    Simplifying polygonal curves at different levels of detail is an important problem with many applications. Existing geometric optimization algorithms are only capable of minimizing the complexity of a simplified curve for a single level of detail. We present an $O(n^3 m)$-time algorithm that takes a polygonal curve of $n$ vertices and produces a set of consistent simplifications for $m$ scales while minimizing the cumulative simplification complexity. This algorithm is compatible with distance measures such as the Hausdorff, the Fréchet and area-based distances, and enables simplification for continuous scaling in $O(n^5)$ time. To speed up this algorithm in practice, we present new techniques for constructing and representing so-called shortcut graphs. Experimental evaluation of these techniques on trajectory data reveals a significant improvement from using shortcut graphs for progressive and non-progressive curve simplification, both in terms of running time and memory usage. Comment: 20 pages, 20 figures.
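    As background, the shortcut graphs mentioned above are easiest to picture in the classic single-scale setting: a shortcut (i, j) is admissible if the skipped vertices stay within a tolerance of the segment from vertex i to vertex j, and a shortest path in the graph of admissible shortcuts yields a minimum-vertex simplification. The sketch below is only that Imai-Iri-style baseline, using a vertex-to-segment distance criterion and hypothetical function names; it is not the paper's progressive, multi-scale algorithm.

```python
import math
from collections import deque

def point_segment_distance(p, a, b):
    """Euclidean distance from 2D point p to segment ab."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def simplify_single_scale(curve, eps):
    """Single-scale shortcut-graph simplification: shortcut (i, j) is kept
    when every skipped vertex lies within eps of segment curve[i]curve[j];
    a BFS shortest path then gives a simplification with fewest vertices."""
    n = len(curve)
    valid = [[all(point_segment_distance(curve[k], curve[i], curve[j]) <= eps
                  for k in range(i + 1, j)) if j > i else False
              for j in range(n)] for i in range(n)]
    prev, seen = [None] * n, [False] * n
    seen[0] = True
    queue = deque([0])
    while queue:
        i = queue.popleft()
        if i == n - 1:
            break
        for j in range(i + 1, n):
            if valid[i][j] and not seen[j]:
                seen[j], prev[j] = True, i
                queue.append(j)
    # Walk predecessors back from the last vertex to recover the result.
    indices, v = [], n - 1
    while v is not None:
        indices.append(v)
        v = prev[v]
    return [curve[i] for i in reversed(indices)]
```

    At a fixed tolerance this is the standard building block; the paper's contribution lies in choosing shortcuts consistently across m scales at once and in engineering how the shortcut graph is constructed and stored.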

    Computing the Similarity Between Moving Curves

    In this paper we study similarity measures for moving curves which can, for example, model changing coastlines or retreating glacier termini. Points on a moving curve have two parameters, namely the position along the curve as well as time. We therefore focus on similarity measures for surfaces, specifically the Fréchet distance between surfaces. While the Fréchet distance between surfaces is not even known to be computable, we show for variants arising in the context of moving curves that they are polynomial-time solvable or NP-complete, depending on the restrictions imposed on how the moving curves are matched. We achieve the polynomial-time solutions by a novel approach for computing a surface in the so-called free-space diagram based on max-flow min-cut duality.
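    As background, the free-space diagram referenced above is easiest to picture for two curves rather than two surfaces: a pair of parameter values is "free" when the corresponding points lie within distance eps of each other. The sketch below only builds a sampled, discretized version of that curve-versus-curve picture; the paper's max-flow min-cut construction on surfaces is not reproduced, and the sampling resolution and function name are illustrative assumptions.

```python
import math

def free_space_matrix(curve_p, curve_q, eps, samples=50):
    """Discretized free-space matrix of two 2D polygonal curves: entry [i][j]
    is True when the i-th arc-length sample on curve_p and the j-th sample on
    curve_q are within eps. Assumes each curve has at least two vertices and
    samples >= 2."""
    def resample(curve, k):
        # k points spread evenly by arc length along the polygonal curve.
        lengths = [math.dist(curve[i], curve[i + 1]) for i in range(len(curve) - 1)]
        total, points = sum(lengths), []
        for s in range(k):
            target = total * s / (k - 1)
            for i, seg_len in enumerate(lengths):
                if target <= seg_len or i == len(lengths) - 1:
                    t = 0.0 if seg_len == 0 else min(1.0, target / seg_len)
                    (ax, ay), (bx, by) = curve[i], curve[i + 1]
                    points.append((ax + t * (bx - ax), ay + t * (by - ay)))
                    break
                target -= seg_len
        return points
    pts_p, pts_q = resample(curve_p, samples), resample(curve_q, samples)
    return [[math.dist(p, q) <= eps for q in pts_q] for p in pts_p]
```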

    Fast Fréchet Distance Between Curves With Long Edges

    Computing the Fréchet distance between two polygonal curves takes roughly quadratic time. In this paper, we show that for a special class of curves the Fréchet distance computations become easier. Let $P$ and $Q$ be two polygonal curves in $\mathbb{R}^d$ with $n$ and $m$ vertices, respectively. We prove four results for the case when all edges of both curves are long compared to the Fréchet distance between them: (1) a linear-time algorithm for deciding the Fréchet distance between two curves, (2) an algorithm that computes the Fréchet distance in $O((n+m)\log(n+m))$ time, (3) a linear-time $\sqrt{d}$-approximation algorithm, and (4) a data structure that supports $O(m\log^2 n)$-time decision queries, where $m$ is the number of vertices of the query curve and $n$ the number of vertices of the preprocessed curve.
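    For comparison with the bounds above, the general-purpose baseline is the classic quadratic-time dynamic program. The sketch below computes the discrete Fréchet distance that way; it is not the paper's near-linear-time algorithm for long-edged curves, and it uses the discrete rather than the continuous distance.

```python
import math

def discrete_frechet(P, Q):
    """Classic O(n*m) dynamic program for the discrete Fréchet distance
    between two point sequences (tuples of coordinates)."""
    n, m = len(P), len(Q)
    dp = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            d = math.dist(P[i], Q[j])
            if i == 0 and j == 0:
                dp[i][j] = d
            elif i == 0:
                dp[i][j] = max(dp[i][j - 1], d)
            elif j == 0:
                dp[i][j] = max(dp[i - 1][j], d)
            else:
                dp[i][j] = max(min(dp[i - 1][j], dp[i - 1][j - 1], dp[i][j - 1]), d)
    return dp[n - 1][m - 1]
```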

    Approximating the Packedness of Polygonal Curves

    In 2012 Driemel et al. [DBLP:journals/dcg/DriemelHW12] introduced the concept of $c$-packed curves as a realistic input model. In the case when $c$ is a constant they gave a near-linear-time $(1+\varepsilon)$-approximation algorithm for computing the Fréchet distance between two $c$-packed polygonal curves. Since then a number of papers have used the model. In this paper we consider the problem of computing the smallest $c$ for which a given polygonal curve in $\mathbb{R}^d$ is $c$-packed. We present two approximation algorithms. The first algorithm is a 2-approximation algorithm and runs in $O(dn^2 \log n)$ time. In the case $d=2$ we develop a faster algorithm that returns a $(6+\varepsilon)$-approximation and runs in $O((n/\varepsilon^3)^{4/3} \mathrm{polylog}(n/\varepsilon))$ time. We also implemented the first algorithm and computed the approximate packedness value for 16 sets of real-world trajectories. The experiments indicate that the notion of $c$-packedness is a useful realistic input model for many curves and trajectories. Comment: A preliminary version to appear in ISAAC 202
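    To make the input model concrete: a curve is c-packed if, for every ball of radius r, the total length of the curve inside the ball is at most c·r. The sketch below brute-forces a lower bound on c by testing only balls centered at the vertices with a small, arbitrary set of radii (an illustrative assumption); it is neither the paper's 2-approximation nor its (6+ε)-approximation.

```python
import math

def clipped_length(a, b, center, r):
    """Length of 2D segment ab that lies inside the disk of radius r at center."""
    (ax, ay), (bx, by), (cx, cy) = a, b, center
    dx, dy = bx - ax, by - ay
    fx, fy = ax - cx, ay - cy
    A = dx * dx + dy * dy
    if A == 0:
        return 0.0
    B = 2 * (fx * dx + fy * dy)
    C = fx * fx + fy * fy - r * r
    disc = B * B - 4 * A * C
    if disc <= 0:
        return 0.0  # segment misses (or only touches) the disk
    root = math.sqrt(disc)
    t1 = max(0.0, (-B - root) / (2 * A))
    t2 = min(1.0, (-B + root) / (2 * A))
    return max(0.0, t2 - t1) * math.sqrt(A)

def packedness_lower_bound(curve, radii=(0.5, 1.0, 2.0, 4.0)):
    """Lower bound on the packedness c: only balls centered at curve vertices
    with the given (arbitrary) radii are tried, so the true c may be larger."""
    best = 0.0
    for center in curve:
        for r in radii:
            inside = sum(clipped_length(curve[i], curve[i + 1], center, r)
                         for i in range(len(curve) - 1))
            best = max(best, inside / r)
    return best
```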

    Trajectory Similarity Measurement: An Efficiency Perspective

    Trajectories that capture object movement have numerous applications, in which similarity computation between trajectories often plays a key role. Traditionally, the similarity between two trajectories is quantified by means of heuristic measures, e.g., Hausdorff or ERP, that operate directly on the trajectories. In contrast, recent studies exploit deep learning to map trajectories to d-dimensional vectors, called embeddings. Then, some distance measure, e.g., the Manhattan or Euclidean distance, is applied to the embeddings to quantify trajectory similarity. The resulting similarities are inaccurate: they only approximate the similarities obtained using the heuristic measures. As distance computation on embeddings is efficient, the focus has been on achieving embeddings that yield high accuracy. Adopting an efficiency perspective, we analyze the time complexities of both the heuristic and the learning-based approaches, finding that the time complexities of the former are not necessarily higher. Through extensive experiments on open datasets, we find that, on both CPUs and GPUs, only a few learning-based approaches can deliver the promised higher efficiency, and only when the embeddings can be pre-computed, while heuristic approaches are more efficient for one-off computations. Among the learning-based approaches, the self-attention-based ones are the fastest to learn embeddings, and those embeddings also yield the highest accuracy for similarity queries. These results have implications for the use of trajectory similarity approaches given different application requirements.
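    The trade-off discussed above can be made concrete: a heuristic measure such as the Hausdorff distance costs on the order of n·m point-distance evaluations per trajectory pair, whereas a learned embedding, once precomputed, reduces each comparison to an O(d) vector distance. The sketch below contrasts the two per-pair costs; the embedding model itself is not shown, and the function names are illustrative.

```python
import math

def hausdorff_distance(traj_a, traj_b):
    """Symmetric Hausdorff distance computed directly on two point sequences:
    O(n * m) distance evaluations per trajectory pair."""
    def directed(ps, qs):
        return max(min(math.dist(p, q) for q in qs) for p in ps)
    return max(directed(traj_a, traj_b), directed(traj_b, traj_a))

def embedding_distance(vec_a, vec_b):
    """Euclidean distance between two precomputed d-dimensional embeddings:
    O(d) per comparison, but only after the embeddings have been learned and cached."""
    return math.dist(vec_a, vec_b)
```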

    The VC Dimension of Metric Balls under Fréchet and Hausdorff Distances

    Fine-grained complexity and algorithm engineering of geometric similarity measures

    Point sets and sequences are fundamental geometric objects that arise in any application dealing with movement data, geometric shapes, and similar data. A crucial task on these objects is to measure their similarity. This thesis therefore presents results on algorithms, complexity lower bounds, and algorithm engineering for the most important point-set and sequence similarity measures, such as the Fréchet distance, the Fréchet distance under translation, and the Hausdorff distance under translation. Going beyond the mere computation of similarity, the approximate near neighbor problem for the continuous Fréchet distance on time series is also considered, and matching upper and lower bounds are shown.