
    On the complexity of range searching among curves

    Modern tracking technology has made the collection of large numbers of densely sampled trajectories of moving objects widely available. We consider a fundamental problem encountered when analysing such data: given $n$ polygonal curves $S$ in $\mathbb{R}^d$, preprocess $S$ into a data structure that answers queries with a query curve $q$ and radius $\rho$ for the curves of $S$ that have Fréchet distance at most $\rho$ to $q$. We initiate a comprehensive analysis of the space/query-time trade-off for this data structuring problem. Our lower bounds imply that any data structure in the pointer model that achieves $Q(n) + O(k)$ query time, where $k$ is the output size, has to use roughly $\Omega\left((n/Q(n))^2\right)$ space in the worst case, even if queries are mere points (for the discrete Fréchet distance) or line segments (for the continuous Fréchet distance). More importantly, we show that more complex queries and input curves lead to additional logarithmic factors in the lower bound. Roughly speaking, the number of logarithmic factors added is linear in the number of edges added to the query and input curve complexity. This means that the space/query-time trade-off worsens by a factor exponential in the input and query complexity. This behaviour addresses an open question in the range searching literature: whether it is possible to avoid the additional logarithmic factors in the space and query time of a multilevel partition tree. We answer this question negatively. On the positive side, we show that we can build data structures for the Fréchet distance using semialgebraic range searching. Our solution for the discrete Fréchet distance is in line with the lower bound, as the number of levels in the data structure is $O(t)$, where $t$ denotes the maximal number of vertices of a curve. For the continuous Fréchet distance, the number of levels increases to $O(t^2)$.
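    A minimal sketch of the query predicate being searched for, assuming curves are given as point sequences: the classic quadratic dynamic program for the discrete Fréchet distance, wrapped in a naive linear scan over $S$. This only illustrates the problem statement, not the multilevel or semialgebraic data structures the paper constructs; all function names here are ours.

```python
import math

def discrete_frechet(P, Q):
    """Classic O(|P||Q|) dynamic program for the discrete Frechet distance."""
    n, m = len(P), len(Q)
    D = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            cost = math.dist(P[i], Q[j])
            if i == 0 and j == 0:
                D[i][j] = cost
            elif i == 0:
                D[i][j] = max(D[i][j - 1], cost)
            elif j == 0:
                D[i][j] = max(D[i - 1][j], cost)
            else:
                D[i][j] = max(min(D[i - 1][j], D[i - 1][j - 1], D[i][j - 1]), cost)
    return D[n - 1][m - 1]

def range_query(S, q, rho):
    """Naive baseline: report every curve of S within distance rho of q."""
    return [P for P in S if discrete_frechet(P, q) <= rho]
```

    The paper's point is precisely to beat this linear scan; its lower bound says any structure answering queries in $Q(n) + O(k)$ time must pay roughly $(n/Q(n))^2$ space.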

    Vehicle path verification using wireless sensor networks

    Path Verification is a problem where a verifier would like to determine how closely a vehicle actually traversed a path that it claims to have traversed. This problem has critical significance for vehicle mobility. Mobile nodes can be patrol officers or cab drivers, while the respective verifiers can be police dispatchers or cab operators. In this paper, we design a sensor-network-assisted technique for vehicle path verification. In our design, a number of static wireless sensors placed in road segments serve as witnesses and certify vehicles as they move. Post movement, these witness certificates are utilized by the verifier to derive the actual path of a suspect vehicle. The challenge then is how to compare a Claimed Path as reported by the vehicle with the Actual Path derived from witness certificates. In this paper, we design a simple yet effective technique for comparing the similarity between two vehicle paths. Our technique extends Continuous Dynamic Time Warping, which involves constructing a universal manifold from the two paths and then finding the geodesic on the resulting polygonal surface (the shortest path along the surface), which is a diagonal from the origin of the surface to the terminal point. This distance is analogous to the Fréchet distance and yields a good measure of the similarity between two paths. Using simulations and real experiments, we demonstrate the performance of our technique in distinguishing false path claims from correct ones. We also design lightweight cryptographic techniques to prevent vehicle masquerading and certificate-forging attacks. A proof-of-concept experiment was conducted on the streets of Rolla, Missouri: a sensor grid was established on a small section of Rolla, and a vehicle with a transmitter was driven through the grid many times. The analysis of the data yielded results consistent with the expected ones. --Abstract, page iii
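    As a rough illustration of the path-comparison step, the sketch below uses plain discrete dynamic time warping as a simplified stand-in for the continuous variant described above (the geodesic across the manifold built from both paths); the length normalisation and threshold are our assumptions, not the paper's.

```python
import math

def dtw(path_a, path_b):
    """Standard O(nm) dynamic time warping over two point sequences."""
    n, m = len(path_a), len(path_b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            step = math.dist(path_a[i - 1], path_b[j - 1])
            D[i][j] = step + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

def claim_is_plausible(claimed, derived, threshold):
    # Normalise by the longer path so long routes are not penalised unfairly.
    return dtw(claimed, derived) / max(len(claimed), len(derived)) <= threshold
```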

    Four Soviets Walk the Dog: Improved Bounds for Computing the Fréchet Distance

    Given two polygonal curves in the plane, there are many ways to define a notion of similarity between them. One popular measure is the Fréchet distance. Since it was proposed by Alt and Godau in 1992, many variants and extensions have been studied. Nonetheless, even more than 20 years later, the original $O(n^2 \log n)$ algorithm by Alt and Godau for computing the Fréchet distance remains the state of the art (here, $n$ denotes the number of edges on each curve). This has led Helmut Alt to conjecture that the associated decision problem is 3SUM-hard. In recent work, Agarwal et al. show how to break the quadratic barrier for the discrete version of the Fréchet distance, where one considers sequences of points instead of polygonal curves. Building on their work, we give a randomized algorithm to compute the Fréchet distance between two polygonal curves in time $O(n^2 \sqrt{\log n}\,(\log\log n)^{3/2})$ on a pointer machine and in time $O(n^2(\log\log n)^2)$ on a word RAM. Furthermore, we show that there exists an algebraic decision tree for the decision problem of depth $O(n^{2-\varepsilon})$, for some $\varepsilon > 0$. We believe that this reveals an intriguing new aspect of this well-studied problem. Finally, we show how to obtain the first subquadratic algorithm for computing the weak Fréchet distance on a word RAM. Comment: 34 pages, 15 figures. A preliminary version appeared in SODA 201
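    For intuition about the decision problem mentioned here ("is the distance at most $\rho$?"), the discrete variant reduces to monotone reachability in an $n \times m$ grid whose cell $(i, j)$ is free when $|P_i - Q_j| \le \rho$. The quadratic sketch below shows that reduction; it is not the subquadratic algorithm of the paper.

```python
import math

def decide_discrete_frechet(P, Q, rho):
    """True iff the discrete Frechet distance of P and Q is at most rho."""
    n, m = len(P), len(Q)
    free = [[math.dist(P[i], Q[j]) <= rho for j in range(m)] for i in range(n)]
    reach = [[False] * m for _ in range(n)]
    reach[0][0] = free[0][0]
    for i in range(n):
        for j in range(m):
            if not free[i][j] or reach[i][j]:
                continue  # blocked cell, or the start cell already set
            reach[i][j] = ((i > 0 and reach[i - 1][j]) or
                           (j > 0 and reach[i][j - 1]) or
                           (i > 0 and j > 0 and reach[i - 1][j - 1]))
    return reach[n - 1][m - 1]
```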

    Pedestrian Dead-Reckoning Algorithms For Dual Foot-Mounted Inertial Sensors

    This work proposes algorithms for the reconstruction of closed-loop pedestrian trajectories based on two foot-mounted inertial measurement units (IMUs). The first proposed algorithm calculates a trajectory using measurements from only one IMU. The second algorithm uses data from both foot-mounted IMUs simultaneously. Both algorithms are based on the Kalman filter and the assumption that while a foot is on the ground its velocity is zero. Two methods for comparing the obtained trajectories are proposed; the advantages and disadvantages of each method are indicated, and a way to optimize the computation time is presented. In addition, a method is proposed for constructing one generalized trajectory of human motion based on the trajectories of each leg. Comment: The data used in the article are available for downloading at http://gartseev.ru/projects/mkins201
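    A minimal sketch of the zero-velocity-update (ZUPT) idea both algorithms build on: integrate accelerometer readings into a velocity estimate and, whenever the stance detector says the foot is on the ground, feed the Kalman filter a "velocity = 0" pseudo-measurement. The state layout, noise levels, and stance flags here are illustrative assumptions, not the paper's tuned filter.

```python
import numpy as np

def zupt_kalman(accel, stance, dt, q=1e-2, r=1e-4):
    """accel: (T, 3) accelerations, gravity removed; stance: (T,) bool flags."""
    v = np.zeros(3)               # velocity estimate
    P = np.eye(3) * 1e-3          # estimate covariance
    Q = np.eye(3) * q             # process noise
    R = np.eye(3) * r             # ZUPT measurement noise
    history = []
    for a, on_ground in zip(accel, stance):
        v = v + a * dt            # predict: integrate acceleration
        P = P + Q
        if on_ground:             # update: zero-velocity pseudo-measurement
            K = P @ np.linalg.inv(P + R)   # Kalman gain with H = I, z = 0
            v = v + K @ (-v)
            P = (np.eye(3) - K) @ P
        history.append(v.copy())
    return np.array(history)
```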

    Four Soviets walk the dog, with an application to Alt's conjecture

    Given two polygonal curves in the plane, there are many ways to define a notion of similarity between them. One measure that is extremely popular is the Fréchet distance. Since it was proposed by Alt and Godau in 1992, many variants and extensions have been studied. Nonetheless, even more than 20 years later, the original $O(n^2 \log n)$ algorithm by Alt and Godau for computing the Fréchet distance remains the state of the art (here $n$ denotes the number of vertices on each curve). This has led Helmut Alt to conjecture that the associated decision problem is 3SUM-hard. In recent work, Agarwal et al. show how to break the quadratic barrier for the discrete version of the Fréchet distance, where one considers sequences of points instead of polygonal curves. Building on their work, we give a randomized algorithm to compute the Fréchet distance between two polygonal curves in time $O(n^2 \sqrt{\log n}\,(\log\log n)^{3/2})$ on a pointer machine and in time $O(n^2(\log\log n)^2)$ on a word RAM. Furthermore, we show that there exists an algebraic decision tree for the decision problem of depth $O(n^{2-\varepsilon})$, for some $\varepsilon > 0$. This provides evidence that the decision problem may not be 3SUM-hard after all and reveals an intriguing new aspect of this well-studied problem.

    Partial shape matching using CCP map and weighted graph transformation matching

    Matching and detecting similarity or dissimilarity between images is a fundamental problem in image processing. Different matching algorithms are used in the literature to solve this fundamental problem. Despite their novelty, these algorithms are mostly inefficient and cannot perform properly in noisy situations. In this thesis, we solve most of the problems of previous methods by using a reliable algorithm for segmenting the image contour map, called the CCP Map, and a new matching method. In our algorithm, we use a local shape descriptor that is very fast to compute, invariant to affine transforms, and robust for dealing with non-rigid objects and occlusion. After finding the best match for the contours, we need to verify whether they are correctly matched. For this, we use the Weighted Graph Transformation Matching (WGTM) approach, which is capable of removing outliers based on their adjacency and geometrical relationships. WGTM works properly for both rigid and non-rigid objects and is robust to high-order distortions. To evaluate our method, the ETHZ dataset, comprising five diverse classes of objects (bottles, swans, mugs, giraffes, Apple logos), is used. Finally, our method is compared to several well-known methods proposed by other researchers in the literature. While our method shows a result comparable to the other benchmarks in terms of recall and the precision of boundary localization, it significantly improves the average precision for all of the categories in the ETHZ dataset.
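    The adjacency-based outlier removal can be illustrated with a much-simplified consistency filter: a correct correspondence should map the neighbourhood of a point in one image onto the neighbourhood of its match in the other. The sketch below keeps only matches whose k-nearest-neighbour sets largely agree; it is in the spirit of, but not identical to, the weighted graph transformation matching the abstract describes, and all parameters are our assumptions.

```python
import numpy as np

def knn_indices(points, k):
    """Indices of each point's k nearest neighbours (points: (N, 2) array)."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)
    return np.argsort(d, axis=1)[:, :k]

def filter_matches(pts_a, pts_b, k=4, keep_ratio=0.5):
    """pts_a[i] is matched to pts_b[i]; return indices of consistent matches."""
    na, nb = knn_indices(pts_a, k), knn_indices(pts_b, k)
    keep = []
    for i in range(len(pts_a)):
        # Fraction of i's neighbours preserved by the matching.
        overlap = len(set(na[i]) & set(nb[i])) / k
        if overlap >= keep_ratio:
            keep.append(i)
    return keep
```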
    • 
