13,815 research outputs found

    The path inference filter: model-based low-latency map matching of probe vehicle data

    Full text link
    We consider the problem of reconstructing vehicle trajectories from sparse sequences of GPS points, for which the sampling interval is between 10 seconds and 2 minutes. We introduce a new class of algorithms, collectively called the path inference filter (PIF), that maps GPS data onto the road network in real time, for a variety of trade-offs and scenarios, and with high throughput. Numerous prior approaches to map matching can be shown to be special cases of the path inference filter presented in this article. We present an efficient procedure for automatically training the filter on new data, with or without ground truth observations. The framework is evaluated on a large San Francisco taxi dataset and is shown to improve upon the current state of the art. The filter also provides insights into the driving patterns of drivers. The path inference filter has been deployed at an industrial scale inside the Mobile Millennium traffic information system, and is used to map data from vehicle fleets in San Francisco, Sacramento, Stockholm and Porto. (Preprint, 23 pages and 23 figures.)
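
    To make the idea concrete, here is a minimal sketch of HMM-style map matching, the family of methods the path inference filter generalises. It illustrates the general principle only, not the authors' PIF: the observation score is Gaussian in the point-to-segment distance, the transition score uses straight-line distance between segment midpoints as a crude stand-in for a road-network path cost, and all names and parameters are assumptions.

        # Minimal sketch of HMM-style map matching (illustrative only, not the PIF).
        import math

        def point_segment_distance(p, seg):
            """Euclidean distance from point p = (x, y) to segment seg = ((x1, y1), (x2, y2))."""
            (x, y), ((x1, y1), (x2, y2)) = p, seg
            dx, dy = x2 - x1, y2 - y1
            if dx == 0 and dy == 0:
                return math.hypot(x - x1, y - y1)
            t = max(0.0, min(1.0, ((x - x1) * dx + (y - y1) * dy) / (dx * dx + dy * dy)))
            return math.hypot(x - (x1 + t * dx), y - (y1 + t * dy))

        def obs_score(p, seg, sigma=10.0):
            # Likelihood of observing GPS fix p if the vehicle is on seg.
            return math.exp(-0.5 * (point_segment_distance(p, seg) / sigma) ** 2)

        def trans_score(a, b, beta=0.05):
            # Plausibility of moving from segment a to segment b between fixes
            # (midpoint distance as a crude proxy for the on-network path length).
            mid = lambda s: ((s[0][0] + s[1][0]) / 2, (s[0][1] + s[1][1]) / 2)
            return math.exp(-beta * math.dist(mid(a), mid(b)))

        def viterbi_match(points, candidates):
            """candidates[t] lists the road segments near points[t]; returns the
            most likely segment sequence."""
            score = [{s: obs_score(points[0], s) for s in candidates[0]}]
            back = [{}]
            for t in range(1, len(points)):
                score.append({})
                back.append({})
                for s in candidates[t]:
                    prev, best = max(((p, score[t - 1][p] * trans_score(p, s))
                                      for p in candidates[t - 1]), key=lambda kv: kv[1])
                    score[t][s] = best * obs_score(points[t], s)
                    back[t][s] = prev
            path = [max(score[-1], key=score[-1].get)]
            for t in range(len(points) - 1, 0, -1):
                path.append(back[t][path[-1]])
            return path[::-1]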

    Modelling network travel time reliability under stochastic demand

    Get PDF
    A technique is proposed for estimating the probability distribution of total network travel time, in the light of normal day-to-day variations in the travel demand matrix over a road traffic network. A solution method is proposed, based on a single run of a standard traffic assignment model, which operates in two stages. In stage one, moments of the total travel time distribution are computed by an analytic method, based on the multivariate moments of the link flow vector. In stage two, a flexible family of density functions is fitted to these moments. It is discussed how the resulting distribution may be used in practice to characterise unreliability. Illustrative numerical tests are reported on a simple network, where the method is seen to provide a means for identifying sensitive or vulnerable links, and for examining the impact on network reliability of changes to link capacities. Computational considerations for large networks, and directions for further research, are discussed.
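
    A small numerical sketch of the two-stage idea follows: stage one obtains the first two moments of total network travel time from the moments of the link flow vector, and stage two fits a density to those moments. For simplicity the sketch assumes fixed link travel times, so total time is linear in the flow vector, and it fits a lognormal rather than the more flexible family used in the paper; all numbers are invented.

        # Two-stage sketch: analytic moments of total travel time, then a
        # moment-matched density.  Inputs and numbers are invented.
        import numpy as np
        from scipy import stats

        mu = np.array([400.0, 250.0, 300.0])            # mean link flows
        Sigma = np.array([[900.0, 120.0,  80.0],        # link flow covariance
                          [120.0, 400.0,  60.0],
                          [ 80.0,  60.0, 625.0]])
        t = np.array([0.05, 0.08, 0.04])                # fixed link travel times

        # Stage one: moments of total travel time T = t.x (exact when T is linear in x).
        mean_T = t @ mu
        var_T = t @ Sigma @ t

        # Stage two: moment-match a lognormal and read off reliability measures.
        sigma2 = np.log(1.0 + var_T / mean_T**2)
        T_dist = stats.lognorm(s=np.sqrt(sigma2), scale=np.exp(np.log(mean_T) - 0.5 * sigma2))

        print(f"mean total travel time: {mean_T:.1f}")
        print(f"standard deviation:     {np.sqrt(var_T):.1f}")
        print(f"95th percentile:        {T_dist.ppf(0.95):.1f}")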

    Fusing Loop and GPS Probe Measurements to Estimate Freeway Density

    Full text link
    In an age of ever-increasing penetration of GPS-enabled mobile devices, the potential of real-time "probe" location information for estimating the state of transportation networks is receiving increasing attention. Much work has been done on using probe data to estimate the current speed of vehicle traffic (or equivalently, trip travel time). While travel times are useful to individual drivers, the state variable for a large class of traffic models and control algorithms is vehicle density. Our goal is to use probe data to supplement traditional, fixed-location loop detector data for density estimation. To this end, we derive a method based on Rao-Blackwellized particle filters, a sequential Monte Carlo scheme. We present a simulation where we obtain a 30% reduction in density mean absolute percentage error from fusing loop and probe data, vs. using loop data alone. We also present results using real data from a 19-mile freeway section in Los Angeles, California, where we obtain a 31% reduction. In addition, our method's estimate when using only the real-world probe data, and no loop data, outperformed the estimate produced when only loop data were used (an 18% reduction). These results demonstrate that probe data can be used for traffic density estimation.
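
    The sketch below is a heavily simplified stand-in for the Rao-Blackwellized particle filter described above: a scalar Kalman filter tracking the density of a single freeway cell, fusing two noisy measurement streams (a loop detector and a probe-derived estimate). The dynamics, noise levels and data are illustrative assumptions, not the paper's model, but the update step shows why a second measurement source tightens the estimate.

        # Toy single-cell fusion example (a simple Kalman filter, not the paper's RBPF).
        import numpy as np

        rng = np.random.default_rng(0)
        true_density = 40.0            # veh/km, held constant in this toy example
        q_process = 4.0                # process noise variance (random-walk dynamics)
        r_loop, r_probe = 25.0, 64.0   # assumed measurement noise variances

        x, p = 30.0, 100.0             # initial density estimate and its variance
        for _ in range(50):
            p += q_process             # predict: density follows a random walk
            # Fuse the loop measurement and the probe-derived measurement in turn.
            for z, r in ((rng.normal(true_density, np.sqrt(r_loop)), r_loop),
                         (rng.normal(true_density, np.sqrt(r_probe)), r_probe)):
                k = p / (p + r)        # Kalman gain
                x += k * (z - x)
                p *= 1.0 - k

        print(f"fused density estimate: {x:.1f} veh/km (truth {true_density:.1f})")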

    An Efficient Monte Carlo-based Probabilistic Time-Dependent Routing Calculation Targeting a Server-Side Car Navigation System

    Full text link
    Incorporating speed probability distributions into the route planning computation of car navigation systems yields more accurate and precise responses. In this paper, we propose a novel approach for dynamically selecting the number of samples used in the Monte Carlo simulation that solves the Probabilistic Time-Dependent Routing (PTDR) problem, thus improving computational efficiency. The proposed method proactively determines the number of simulations to run to extract the travel-time estimate for each specific request while respecting an error threshold on output quality. The methodology requires little effort on the application development side. We adopted an aspect-oriented programming language (LARA) to instrument the code and a flexible dynamic autotuning library (mARGOt) to take tuning decisions on the number of samples, improving execution efficiency. Experimental results demonstrate that the proposed adaptive approach saves a large fraction of simulations (between 36% and 81%) with respect to a static approach across different traffic situations, paths and error requirements. Given the negligible runtime overhead of the proposed approach, it results in an execution-time speedup between 1.5x and 5.1x. This speedup is reflected at the infrastructure level as a reduction of around 36% in the computing resources needed to support the whole navigation pipeline.
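
    The sketch below illustrates the quantity being tuned: how many Monte Carlo samples of a probabilistic travel time are needed to meet a target error. Unlike the proposed approach, which predicts the sample count proactively via mARGOt, this is a simple reactive stopping rule, and the per-segment speed distributions are invented for illustration.

        # Reactive stopping rule for a PTDR-style Monte Carlo sample count
        # (illustrative; the paper predicts this count proactively).
        import numpy as np

        rng = np.random.default_rng(42)
        lengths_km = np.array([1.2, 0.8, 2.5, 1.0])      # invented route segments
        mean_kmh = np.array([50.0, 30.0, 80.0, 45.0])    # invented speed means
        std_kmh = np.array([8.0, 10.0, 15.0, 9.0])       # invented speed std devs

        def sample_travel_times(n):
            # Draw n route travel-time samples (hours) from per-segment speed draws.
            speeds = rng.normal(mean_kmh, std_kmh, size=(n, lengths_km.size)).clip(min=5.0)
            return (lengths_km / speeds).sum(axis=1)

        target_rel_err, batch = 0.01, 200
        samples = sample_travel_times(batch)
        while True:
            # Relative standard error of the estimated mean travel time.
            rel_err = samples.std(ddof=1) / np.sqrt(samples.size) / samples.mean()
            if rel_err <= target_rel_err:
                break
            samples = np.concatenate([samples, sample_travel_times(batch)])

        print(f"{samples.size} samples, mean travel time {samples.mean() * 60:.1f} min, "
              f"relative error ~{rel_err:.3%}")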