
    Adaptive Douglas-Peucker Algorithm With Automatic Thresholding for AIS-Based Vessel Trajectory Compression

    The automatic identification system (AIS) is an important complement to terrestrial networks, radar systems, and satellite constellations, and has been widely used in vessel traffic service systems to improve navigational safety. Following the explosion in vessel AIS data, the storage, processing, and analysis of these data have emerged as active research topics in recent years. Vessel trajectory compression eliminates redundant information, preserves key features, and simplifies the data for further mining, thereby improving data quality and guaranteeing accurate measurements for navigational safety. It is well known that compression quality depends significantly on the choice of threshold. We propose an Adaptive Douglas-Peucker (ADP) algorithm with automatic thresholding for AIS-based vessel trajectory compression. In particular, the optimal threshold is adaptively calculated for each trajectory using a novel automatic threshold selection method, as an improvement and complement to the original Douglas-Peucker (DP) algorithm. The selection method is developed from the channel and trajectory characteristics, a segmentation framework, and the mean distance. The proposed method is able to simplify vessel trajectory data and extract useful information effectively. Time-series trajectory classification and clustering based on the ADP algorithm are also discussed and analysed in this paper. To verify the reasonability and effectiveness of the proposed method, experiments are conducted on two different trajectory data sets from the inland waterway of the Yangtze River: trajectory classification with the nearest-neighbour classifier and trajectory clustering with spectral clustering. Comprehensive results demonstrate that the proposed algorithm reduces the computational cost while preserving clustering and classification accuracy.
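
    A minimal sketch of the general idea, not the paper's exact method: classic Douglas-Peucker simplification with a per-trajectory threshold chosen automatically as the mean distance of the interior points to the start-end chord. The function names and the mean-distance rule are illustrative assumptions; the paper's threshold selection additionally uses channel characteristics and a segmentation framework.

```python
import math

def point_line_distance(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    return abs(dy * px - dx * py + bx * ay - by * ax) / math.hypot(dx, dy)

def douglas_peucker(points, epsilon):
    """Classic recursive Douglas-Peucker simplification."""
    if len(points) < 3:
        return list(points)
    # Find the point farthest from the chord joining the endpoints.
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = point_line_distance(points[i], points[0], points[-1])
        if d > dmax:
            dmax, index = d, i
    if dmax <= epsilon:
        return [points[0], points[-1]]
    left = douglas_peucker(points[:index + 1], epsilon)
    right = douglas_peucker(points[index:], epsilon)
    return left[:-1] + right

def adaptive_douglas_peucker(points):
    """Pick the threshold per trajectory instead of using a fixed value.

    Here the threshold is the mean distance of the interior points to the
    start-end chord -- an assumed stand-in for the paper's automatic
    selection method.
    """
    if len(points) < 3:
        return list(points)
    distances = [point_line_distance(points[i], points[0], points[-1])
                 for i in range(1, len(points) - 1)]
    epsilon = sum(distances) / len(distances)
    return douglas_peucker(points, epsilon)

# Example: a short AIS-like trajectory given as (longitude, latitude) pairs.
track = [(0.0, 0.0), (1.0, 0.1), (2.0, -0.1), (3.0, 5.0), (4.0, 6.0), (5.0, 7.0)]
print(adaptive_douglas_peucker(track))
```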

    A Comparison Between Alignment and Integral Based Kernels for Vessel Trajectories

    In this paper we present a comparison between two important types of similarity measures for moving object trajectories, for machine learning from vessel movement data. These similarities are compared on the tasks of clustering, classification, and outlier detection. The first type are alignment measures, such as dynamic time warping and edit distance. The second type are based on the integral over time of the distance between two trajectories. Following earlier work we define these measures in the context of kernel methods, which provide state-of-the-art, robust algorithms for the tasks studied. Furthermore, we include the influence of applying piecewise linear segmentation as pre-processing to the vessel trajectories when computing alignment measures, since this has been shown to have a positive effect on computation time and performance. In our experiments the alignment-based measures show the best performance. Regular versions of edit distance give the best performance in clustering and classification, whereas the softmax variant of dynamic time warping works best in outlier detection. Moreover, piecewise linear segmentation has a positive effect on alignments, which seems to be because salient points in a trajectory, especially important in clustering and outlier detection, are highlighted by the segmentation and have a large influence on the alignments.
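
    For concreteness, a minimal sketch of plain dynamic time warping between two 2-D point sequences with Euclidean ground cost; the edit-distance, softmax, and kernelised variants compared in the paper are not reproduced here.

```python
import math

def dtw_distance(traj_a, traj_b):
    """Plain dynamic time warping between two 2-D point sequences.

    Uses the Euclidean distance between points as the local cost and returns
    the accumulated cost of the optimal alignment (no kernelisation, no softmax).
    """
    n, m = len(traj_a), len(traj_b)
    inf = float("inf")
    # dp[i][j] = cost of the best alignment of traj_a[:i] with traj_b[:j].
    dp = [[inf] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = math.dist(traj_a[i - 1], traj_b[j - 1])
            dp[i][j] = cost + min(dp[i - 1][j],      # stretch: reuse traj_b[j-1]
                                  dp[i][j - 1],      # stretch: reuse traj_a[i-1]
                                  dp[i - 1][j - 1])  # advance both trajectories
    return dp[n][m]

# Two toy vessel tracks sampled at different rates.
a = [(0.0, 0.0), (1.0, 0.5), (2.0, 1.0), (3.0, 1.5)]
b = [(0.0, 0.1), (1.5, 0.8), (3.0, 1.4)]
print(dtw_distance(a, b))
```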

    Finding long and similar parts of trajectories

    A natural time-dependent similarity measure for two trajectories is their average distance at corresponding times. We give algorithms for computing the most similar subtrajectories under this measure, assuming the two trajectories are given as two polygonal, possibly self-intersecting lines. When a minimum duration is specified for the subtrajectories, and they must start at exactly corresponding times in the input trajectories, we give a linear-time algorithm for computing the starting time and duration of the most similar subtrajectories. The algorithm is based on a result of independent interest: we present a linear-time algorithm to find, for a piecewise monotone function, an interval of at least a given length that has minimum average value. When the two subtrajectories can start at different times in the two input trajectories, it appears difficult to give an exact algorithm for the most similar subtrajectories problem, even if the duration of the desired two subtrajectories is fixed to some length. We show that the problem can be solved approximately, and with a performance guarantee. More precisely, we present (1 + ε)-approximation algorithms for computing the most similar subtrajectories of two input trajectories for the case where the duration is specified, and also for the case where only a minimum on the duration is specified.
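
    The similarity measure itself is easy to state in code. The brute-force sketch below assumes both trajectories are sampled at the same, equally spaced timestamps and that the subtrajectory duration is fixed; it does not reproduce the paper's linear-time or (1 + ε)-approximation algorithms over continuous polygonal curves.

```python
import math

def avg_distance(traj_a, traj_b):
    """Average distance between two trajectories at corresponding samples."""
    assert len(traj_a) == len(traj_b)
    return sum(math.dist(p, q) for p, q in zip(traj_a, traj_b)) / len(traj_a)

def most_similar_window(traj_a, traj_b, window):
    """Brute-force search for the most similar aligned subtrajectories.

    Both subtrajectories start at the same sample index and span `window`
    samples (a fixed duration). Returns (start_index, average_distance).
    Runs in O(n * window); the paper solves this case in linear time.
    """
    n = min(len(traj_a), len(traj_b))
    best = (None, float("inf"))
    for start in range(n - window + 1):
        d = avg_distance(traj_a[start:start + window],
                         traj_b[start:start + window])
        if d < best[1]:
            best = (start, d)
    return best

# Two toy trajectories; the second one comes closest between samples 3 and 6.
a = [(t, 0.0) for t in range(10)]
b = [(t, 1.5 if 3 <= t <= 6 else 2.0) for t in range(10)]
print(most_similar_window(a, b, 4))
```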

    An Overview of Moving Object Trajectory Compression Algorithms

    Compression technology is an efficient way to preserve useful and valuable data while removing redundant and inessential data from datasets. With the development of RFID and GPS devices, more and more moving objects can be traced and their trajectories recorded. However, the exponential increase in the amount of such trajectory data has caused a series of problems in the storage, processing, and analysis of data. Moving object trajectory compression has therefore become one of the hotspots in moving object data mining. To provide an overview, we survey and summarize the development and trends of moving object compression and analyze typical compression algorithms presented in recent years. In this paper, we first summarize the strategies and implementation processes of classical moving object compression algorithms. Second, the related definitions concerning moving objects and their trajectories are discussed. Third, validation criteria are introduced for evaluating the performance and efficiency of compression algorithms. Finally, some application scenarios are summarized to point out potential applications in the future. It is hoped that this research will serve as a stepping stone for those interested in advancing moving object mining.
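
    The abstract does not enumerate the validation criteria; two metrics commonly used in this literature, assumed here purely for illustration, are the compression ratio and the mean synchronized Euclidean distance (SED) error of a simplified trajectory.

```python
import math

def compression_ratio(original, compressed):
    """Fraction of points retained after compression (lower = more compression)."""
    return len(compressed) / len(original)

def sed_error(original, compressed):
    """Mean synchronized Euclidean distance between the original points and the
    compressed trajectory interpolated at each original timestamp.

    Each point is (t, x, y); the compressed list keeps original timestamps.
    """
    errors = []
    for t, x, y in original:
        # Find the compressed segment whose time span contains t.
        for (t0, x0, y0), (t1, x1, y1) in zip(compressed, compressed[1:]):
            if t0 <= t <= t1:
                r = 0.0 if t1 == t0 else (t - t0) / (t1 - t0)
                xi, yi = x0 + r * (x1 - x0), y0 + r * (y1 - y0)
                errors.append(math.hypot(x - xi, y - yi))
                break
    return sum(errors) / len(errors)

# A four-point trajectory compressed down to its two endpoints.
orig = [(0, 0.0, 0.0), (1, 1.0, 0.2), (2, 2.0, 0.1), (3, 3.0, 0.0)]
comp = [(0, 0.0, 0.0), (3, 3.0, 0.0)]
print(compression_ratio(orig, comp), sed_error(orig, comp))
```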

    A heuristics based global navigation satellite system data reduction algorithm integrated with map-matching

    The transmission and storage of global navigation satellite system (GNSS) data place very high demands on mobile networks and centralised data processing systems. GNSS applications, including community-based navigation and fleet management, require GNSS data to be transmitted from a vehicle to a centralised system and then processed by a map-matching algorithm to determine the location of the vehicle within a road segment. Various data compression techniques have been developed to reduce the volume of data transmitted, and there is an independent literature on map-matching algorithms. However, no previous research has integrated data compression with a map-matching algorithm that accepts compressed data as input without the need for decompression. This paper develops a novel GNSS data reduction algorithm with deterministic error bounds, seamlessly integrated with a specifically designed map-matching algorithm. The approach significantly reduces the volume of GNSS data communicated and improves the performance of the map-matching algorithm. The data compression extracts critical points from the trajectory and the velocity–time curve of a vehicle. During the selection of critical points, the errors in restoring vehicle trajectories and velocity–time curves are used as parameters to control the number of critical points selected. By setting different error bounds prior to the execution of the algorithm, the accuracy and volume of the reduced data are controlled precisely. The compressed GNSS data, in particular the critical points selected from the vehicle's trajectory, are input directly to the map-matching algorithm without decompression. An experiment indicated that the data reduction algorithm is very effective in reducing data volume. This research will be useful in many fields, including community-driven navigation and fleet management.
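
    The abstract describes the reduction step only in outline. Below is a simple greedy sketch of critical-point selection on a velocity-time curve under a deterministic error bound; it is an assumed stand-in for the paper's algorithm and omits the map-matching integration.

```python
def select_critical_points(samples, max_error):
    """Greedy critical-point selection on a (time, value) curve.

    A point is kept whenever dropping it would make the linear interpolation
    between retained points deviate from some skipped sample by more than
    max_error, so the restoration error stays within the bound.
    """
    if len(samples) <= 2:
        return list(samples)
    kept = [samples[0]]
    anchor = 0
    for i in range(2, len(samples)):
        t0, v0 = samples[anchor]
        t1, v1 = samples[i]
        # Check every skipped sample against the chord anchor -> i.
        for t, v in samples[anchor + 1:i]:
            r = (t - t0) / (t1 - t0)
            if abs(v - (v0 + r * (v1 - v0))) > max_error:
                kept.append(samples[i - 1])   # previous sample becomes critical
                anchor = i - 1
                break
    kept.append(samples[-1])
    return kept

# Velocity-time samples (seconds, m/s) with an acceleration phase.
curve = [(0, 0.0), (1, 0.5), (2, 1.0), (3, 5.0), (4, 9.0), (5, 9.1), (6, 9.0)]
print(select_critical_points(curve, max_error=0.5))
```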

    Web-based Geographical Visualization of Container Itineraries

    Around 90% of the world's cargo is transported in maritime containers, but only around 2% of containers are physically inspected. This opens the possibility for illicit activities. A viable solution is to control containerized cargo through information-based risk analysis. Container route-based analysis has been considered a key factor in identifying potentially suspicious consignments. An essential part of itinerary analysis is the geographical visualization of the itinerary. In the present paper, we present initial work on the realization of a web-based system for interactive geographical visualization of container itineraries.

    Algorithms for Imprecise Trajectories
