3 research outputs found

    Speeding up Simplification of Polygonal Curves using Nested Approximations

    We develop a multiresolution approach to the problem of polygonal curve approximation. We show theoretically and experimentally that, if the simplification algorithm A used between any two successive levels of resolution satisfies some conditions, the multiresolution algorithm MR has a lower complexity than A. In particular, we show that if A has O(N²/K) complexity (the complexity of a reduced-search dynamic programming approach), where N and K are respectively the initial and final numbers of segments, the complexity of MR is in O(N). We experimentally compare the outcomes of MR with those of the optimal "full search" dynamic programming solution and of classical merge and split approaches. The experimental evaluations confirm the theoretical derivations and show that the proposed approach, evaluated on 2D coastal maps, either has a lower complexity or provides polygonal approximations closer to the initial curves.
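
    For illustration, here is a minimal Python sketch of the successive-halving idea suggested by the abstract: the curve is repeatedly simplified to half its current size, each level's output feeding the next, until the target K is reached, so every call to the base algorithm A works on an already-reduced curve. The greedy merge used as A, the function names, and the halving schedule are assumptions made for this example, not the paper's exact procedure.

```python
import math

def point_segment_distance(p, a, b):
    """Perpendicular distance from point p to the segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def greedy_merge_simplify(points, k):
    """Stand-in base algorithm A: repeatedly drop the interior vertex whose
    removal causes the smallest deviation, until only k vertices remain (k >= 2)."""
    pts = list(points)
    while len(pts) > k:
        errors = [point_segment_distance(pts[i], pts[i - 1], pts[i + 1])
                  for i in range(1, len(pts) - 1)]
        pts.pop(1 + errors.index(min(errors)))
    return pts

def multiresolution_simplify(points, k, base_simplify=greedy_merge_simplify):
    """Nested-approximation driver: instead of one call on the full curve,
    simplify to N/2 vertices, then N/4, ..., feeding each level's output
    into the next, until the target k is reached."""
    current = list(points)
    target = len(current) // 2
    while target > k:
        current = base_simplify(current, target)
        target //= 2
    return base_simplify(current, k)

# Example: a sine curve reduced from 200 vertices to 10.
curve = [(x, math.sin(x / 10.0)) for x in range(200)]
print(multiresolution_simplify(curve, 10))
```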

    Tabu split and merge for the simplification of polygonal curves

    Time Warp Edit Distance with Stiffness Adjustment for Time Series Matching

    In a way similar to the string-to-string correction problem, we address time series similarity in the light of a time-series-to-time-series correction problem, for which the similarity between two time series is measured as the minimum cost sequence of "edit operations" needed to transform one time series into another. To define the "edit operations" we use the paradigm of a graphical editing process and end up with a dynamic programming algorithm that we call Time Warp Edit Distance (TWED). TWED is slightly different in form from the Dynamic Time Warping, Longest Common Subsequence, or Edit Distance with Real Penalty algorithms. In particular, it highlights a parameter which drives a kind of stiffness of the elastic measure along the time axis. We show that the similarity provided by TWED is a metric, potentially useful in time series retrieval applications since it could benefit from the triangular inequality property to speed up the retrieval process while tuning the parameters of the elastic measure. In that context, a lower bound is derived to relate the matching of time series in down-sampled representation spaces to the matching in the original space. The empirical quality of the TWED distance is evaluated on a simple classification task. Compared to Edit Distance, Dynamic Time Warping, Longest Common Subsequence, and Edit Distance with Real Penalty, TWED proves to be quite effective on the considered experimental task.
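
    As an illustration, here is a minimal Python sketch of a TWED-style dynamic program for 1-D series, following the description above. The parameter names nu (stiffness along the time axis) and lam (delete penalty), the absolute-difference local cost, and the zero-padding convention at index 0 are assumptions made for the example rather than details taken from the paper.

```python
import numpy as np

def twed(a, ts_a, b, ts_b, nu=0.001, lam=1.0):
    """Time Warp Edit Distance between two 1-D time series.

    a, b       : sequences of values
    ts_a, ts_b : corresponding (increasing) timestamps
    nu         : stiffness parameter (>= 0) penalising warps along the time axis
    lam        : constant penalty applied to delete operations
    """
    # Pad with a dummy sample at time 0 so indices start at 1 (assumed convention).
    a = np.concatenate(([0.0], np.asarray(a, dtype=float)))
    ts_a = np.concatenate(([0.0], np.asarray(ts_a, dtype=float)))
    b = np.concatenate(([0.0], np.asarray(b, dtype=float)))
    ts_b = np.concatenate(([0.0], np.asarray(ts_b, dtype=float)))

    n, m = len(a), len(b)
    d = np.full((n, m), np.inf)
    d[0, 0] = 0.0

    for i in range(1, n):
        for j in range(1, m):
            # Delete a sample in series A.
            del_a = d[i - 1, j] + abs(a[i] - a[i - 1]) + nu * (ts_a[i] - ts_a[i - 1]) + lam
            # Delete a sample in series B.
            del_b = d[i, j - 1] + abs(b[j] - b[j - 1]) + nu * (ts_b[j] - ts_b[j - 1]) + lam
            # Match a[i] with b[j] (and their predecessors), with a time-stiffness term.
            match = (d[i - 1, j - 1]
                     + abs(a[i] - b[j]) + abs(a[i - 1] - b[j - 1])
                     + nu * (abs(ts_a[i] - ts_b[j]) + abs(ts_a[i - 1] - ts_b[j - 1])))
            d[i, j] = min(del_a, del_b, match)

    return d[n - 1, m - 1]

# Example: distance between two short series sampled at unit time steps.
x = [1.0, 2.0, 3.0, 2.0]
y = [1.0, 2.5, 2.0]
print(twed(x, [1, 2, 3, 4], y, [1, 2, 3]))
```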