
    Parallelized Particle and Gaussian Sum Particle Filters for Large Scale Freeway Traffic Systems

    Large-scale traffic systems require techniques able to: 1) deal with high volumes of heterogeneous data coming from different types of sensors, 2) provide robustness in the presence of sparse sensor data, 3) incorporate different models that can deal with various traffic regimes, and 4) cope with multimodal conditional probability density functions for the states. Centralized architectures often face challenges due to high communication demands. This paper develops new estimation techniques able to cope with these problems in large traffic network systems: Parallelized Particle Filters (PPFs) and a Parallelized Gaussian Sum Particle Filter (PGSPF) that are suitable for on-line traffic management. We show how complex probability density functions of the high-dimensional traffic state can be decomposed into functions with simpler forms, and the whole estimation problem solved in an efficient way. The proposed approach is general, with limited interactions between components, which reduces computational time and provides high estimation accuracy. The efficiency of the PPFs and PGSPF is evaluated in terms of accuracy, complexity and communication demands and compared with the case where all processing is centralized.
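
    The decomposition idea in this abstract can be illustrated with a minimal sketch: the freeway is split into segments, each segment runs its own bootstrap particle filter, and only segment-level estimates would need to be exchanged. The random-walk dynamics, noise levels and sensor readings below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_pf_step(particles, weights, observation, obs_noise_std=5.0):
    """One bootstrap particle-filter step for a single freeway segment.

    particles: (N,) candidate traffic densities for this segment (veh/km).
    observation: scalar sensor reading of the segment density.
    """
    # Propagate: a toy random walk stands in for a real traffic model.
    particles = particles + rng.normal(0.0, 2.0, size=particles.shape)
    # Weight each particle by the Gaussian likelihood of the sensor reading.
    weights = weights * np.exp(-0.5 * ((observation - particles) / obs_noise_std) ** 2)
    weights /= weights.sum()
    # Resample within the segment only, keeping the local filters independent.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

# Three segments processed in parallel; only their point estimates would be
# communicated, which keeps the exchange between processing units small.
segments = [rng.normal(40.0, 10.0, 500) for _ in range(3)]
weights = [np.full(500, 1.0 / 500) for _ in range(3)]
observations = [38.0, 55.0, 47.0]   # one (made-up) sensor reading per segment

for s in range(3):
    segments[s], weights[s] = local_pf_step(segments[s], weights[s], observations[s])
    print(f"segment {s}: estimated density ~ {segments[s].mean():.1f} veh/km")
```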

    Particle filter state estimator for large urban networks

    This paper applies a particle filter (PF) state estimator to urban traffic networks. The traffic network consists of signalized intersections, the roads that link these intersections, and sensors that detect the passage time of vehicles. The traffic state X(t) specifies at each time t the state of the traffic lights, the queue sizes at the intersections, and the location and size of all the platoons of vehicles inside the system. The basic entity of our model is a platoon of vehicles that travel close together at approximately the same speed. This leads to a discrete event simulation model that is much faster than microscopic models representing individual vehicles. Hence it is possible to execute many random simulation runs in parallel. The PF assigns weights to each of these simulation runs according to how well they explain the observed sensor signals. The PF thus generates estimates at each time t of the location of the platoons and, more importantly, the queue size at each intersection. These estimates can be used for controlling the optimal switching times of the traffic lights.
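
    The weighting step described above can be sketched as follows: each particle is one simulation run's hypothesis of a queue size, and its weight is the likelihood of the observed detector count under that hypothesis. The Gaussian detection model and all the numbers are assumptions for illustration, not the paper's observation model.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1000                                   # number of parallel simulation runs

# Each particle is one hypothesis of the queue size at a given intersection.
queues = rng.integers(0, 30, size=N).astype(float)
weights = np.full(N, 1.0 / N)

def pf_update(queues, weights, detected_count, detection_std=2.0):
    """Re-weight the simulation runs by how well they explain the sensor signal."""
    # Assumed observation model: the detector count is a noisy view of the queue.
    likelihood = np.exp(-0.5 * ((detected_count - queues) / detection_std) ** 2)
    weights = weights * likelihood
    weights /= weights.sum()
    estimate = float(np.sum(weights * queues))     # posterior mean queue size
    # Resample so that well-matching runs are continued more often.
    idx = rng.choice(len(queues), size=len(queues), p=weights)
    return queues[idx], np.full(len(queues), 1.0 / len(queues)), estimate

queues, weights, estimate = pf_update(queues, weights, detected_count=12)
print(f"estimated queue size: {estimate:.1f} vehicles")
```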

    Particle filter for platoon based models of urban traffic

    This paper proposes a particle filter (PF) state estimator using a platoon-based model for urban traffic networks. The urban traffic network model consists of signalized intersections (representing queues of vehicles competing for service) connected to each other through links with predefined receiving capacities and stochastic delays. Sensors detect the passage of vehicles at the sensor locations. The algorithm is flexible and robust and can be used in real-time applications such as on-line control of the switching times of traffic lights.
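
    A platoon-based discrete-event model of the kind described here can be sketched in a few lines; the platoon sizes, link length, speeds and the 5-second event step below are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Platoon:
    size: int        # number of vehicles travelling close together
    position: float  # metres from the start of the link
    speed: float     # metres per second

@dataclass
class Link:
    length: float
    platoons: list = field(default_factory=list)

def advance(link: Link, dt: float) -> int:
    """Move each platoon along the link for dt seconds and return the number
    of vehicles that reach the downstream intersection and join its queue."""
    arrived, remaining = 0, []
    for p in link.platoons:
        p.position += p.speed * dt
        if p.position >= link.length:
            arrived += p.size            # the whole platoon joins the queue
        else:
            remaining.append(p)
    link.platoons = remaining
    return arrived

link = Link(length=300.0, platoons=[Platoon(5, 250.0, 12.0), Platoon(8, 40.0, 10.0)])
queue = 0
for _ in range(10):                      # ten 5-second simulation events
    queue += advance(link, dt=5.0)
print(f"vehicles queued at the intersection: {queue}")
```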

    Sensor Data Fusion for Improving Traffic Mobility in Smart Cities

    The ever-increasing urban population and vehicular traffic, without a corresponding expansion of infrastructure, have been a challenge to transportation facility managers and commuters. While some parts of the transportation infrastructure have big data available, many other locations have only sparse data. This poses a challenge for traffic state estimation and prediction in support of efficient and effective infrastructure management and route guidance. This research focuses on traffic prediction problems and aims to develop novel, robust spatio-temporal algorithms that can provide high accuracy in the presence of both big data and sparse data in a large urban road network. Intelligent transportation systems require knowledge of the current traffic state and its forecast for effective implementation. The actual traffic state has to be estimated, as the existing sensors do not capture the needed state directly. Sensor measurements often contain missing or incomplete data as a result of communication issues, faulty sensors or cost, leading to incomplete monitoring of the entire road network. These missing data pose challenges to traffic estimation approaches. In this work, a robust spatio-temporal traffic imputation approach capable of withstanding high missing data rates is presented: a particle-based approach with Kriging interpolation. Its performance for different missing data ratios was investigated on a large road network. A particle-based framework for dealing with missing data is also proposed, and an expression for the likelihood function is derived for the case where the missing value is computed by Kriging interpolation. With Kriging interpolation, the missing measurement values are predicted and subsequently used in the computation of the likelihood terms in the particle filter algorithm. In commonly used Kriging approaches, the covariance function depends only on the separation distance, irrespective of the traffic at the considered locations. A key limitation of such approaches is their inability to capture well the traffic dynamics and the transitions between different states. This thesis therefore proposes a Bayesian Kriging approach for the prediction of urban traffic, which captures these dynamics and models changes via the covariance matrix. The main novelty consists in representing both stationary and non-stationary changes in traffic flows by a discriminative covariance function conditioned on the observation at each location. By considering the surrounding traffic information distinctively, the proposed method is well suited to representing congested regions and interactions in both upstream and downstream areas.
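
    As a concrete baseline, the "separation distance only" Kriging that the thesis improves on might look like the sketch below, which imputes a missing detector reading from its neighbours using an exponential covariance. Detector positions, flow values and covariance parameters are invented for illustration; in the thesis the imputed value would then enter the particle-filter likelihood in place of the missing measurement.

```python
import numpy as np

def exp_cov(d, sill=1.0, length=2.0):
    """Exponential covariance that depends only on the separation distance d."""
    return sill * np.exp(-np.abs(d) / length)

def ordinary_kriging(x_obs, y_obs, x_missing):
    """Impute the value at x_missing from readings at the observed locations."""
    n = len(x_obs)
    # Covariance matrix of the observed locations plus the unbiasedness constraint.
    K = np.ones((n + 1, n + 1))
    K[:n, :n] = exp_cov(x_obs[:, None] - x_obs[None, :])
    K[n, n] = 0.0
    # Covariances between the missing location and the observed ones.
    k = np.ones(n + 1)
    k[:n] = exp_cov(x_missing - x_obs)
    lam = np.linalg.solve(K, k)[:n]      # kriging weights
    return float(lam @ y_obs)

# Detector positions (km) and flows (veh/h); the detector at 4.0 km is missing.
x_obs = np.array([1.0, 2.0, 3.0, 5.0, 6.0])
y_obs = np.array([1800.0, 1750.0, 1650.0, 1500.0, 1480.0])
print(f"imputed flow at 4.0 km: {ordinary_kriging(x_obs, y_obs, 4.0):.0f} veh/h")
```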

    Eco-friendly Naturalistic Vehicular Sensing and Driving Behaviour Profiling

    PhD Thesis. Internet of Things (IoT) technologies are spurring serious games that support training directly in the field. This PhD implements field user performance evaluators usable in reality-enhanced serious games (RESGs) for promoting fuel-efficient driving. The work proposes two modules, implemented by processing information related to fuel-efficient driving, to be employed as real-time virtual sensors in RESGs. The first module estimates and assesses fuel consumption instantly; for it, I compared the performance of three configured machine learning algorithms: support vector regression, random forest and artificial neural networks. The experiments show that the algorithms have similar performance, with random forest slightly outperforming the others. The second module provides instant recommendations, using fuzzy logic, when inefficient driving patterns are detected. For the game design, I resorted to the On-Board Diagnostics II (OBD-II) standard interface to read the diagnostic information circulating on vehicular buses, which allows a wide diffusion of the game and avoids manufacturer-proprietary solutions. The approach has been implemented and tested with data from the enviroCar server site. The data is not calibrated for a specific car model and is recorded in different driving environments, which made the work challenging and robust for real-world conditions. The proposed approach to virtual sensor design is general and thus applicable to various application domains other than fuel-efficient driving. An important word of caution concerns users’ privacy, as the modules rely on sensitive data and provide information that by no means should be misused.
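
    The fuel-consumption estimator of the first module can be approximated with a short scikit-learn sketch; the OBD-II-style features, the synthetic fuel-rate target and the hyperparameters below are assumptions, not the thesis configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n = 2000

# Synthetic stand-ins for OBD-II signals: speed (km/h), engine RPM, throttle (%).
speed = rng.uniform(0, 120, n)
rpm = rng.uniform(700, 4000, n)
throttle = rng.uniform(0, 100, n)
X = np.column_stack([speed, rpm, throttle])
# Toy fuel-rate target (L/h) driven mostly by RPM and throttle, plus noise.
y = 0.5 + 0.0015 * rpm + 0.04 * throttle + rng.normal(0, 0.3, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
mae = mean_absolute_error(y_test, model.predict(X_test))
print(f"MAE on held-out data: {mae:.3f} L/h")
```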

    Parallelised Gaussian mixture filtering for vehicular traffic flow estimation.

    Large traffic network systems require handling huge amounts of data, often distributed over a large geographical region in space and time, and centralised processing is then not the right choice. In this paper we develop a parallelised Gaussian Mixture Model filter (GMMF) for traffic networks, aimed to: 1) work with high volumes of heterogeneous data (from different sensor modalities), 2) provide robustness in the presence of sparse and missing sensor data, 3) incorporate different models in different traffic segments and represent various traffic regimes, and 4) cope with multimodalities (e.g., due to a multimodal measurement likelihood or multimodal state probability density functions). The efficiency of the parallelised GMMF is investigated over traffic flows based on macroscopic modelling and compared with a centralised GMMF. The proposed GMM approach is general: it is applicable to systems where the overall state vector can be partitioned into state components (subsets) corresponding to certain geographical regions, such that most of the interactions take place within the subsets. The performance of the parallelised and centralised GMMFs is investigated and evaluated in terms of accuracy and complexity.
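
    A Gaussian mixture measurement update can be sketched as a bank of Kalman updates whose component weights are rescaled by how well each component predicts the measurement. The two-component "free flow vs congested" prior and the sensor noise below are illustrative assumptions, not the paper's traffic model.

```python
import numpy as np

def gmm_measurement_update(means, covs, weights, y, H, R):
    """Kalman-update every Gaussian component and re-weight it by how well it
    predicts the measurement y; the posterior stays a sum of Gaussians."""
    new_means, new_covs, new_w = [], [], []
    for m, P, w in zip(means, covs, weights):
        S = H @ P @ H.T + R                      # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
        innov = y - H @ m
        new_means.append(m + K @ innov)
        new_covs.append(P - K @ H @ P)
        # Marginal likelihood of the measurement under this component.
        lik = np.exp(-0.5 * innov @ np.linalg.inv(S) @ innov) / np.sqrt(
            (2 * np.pi) ** len(y) * np.linalg.det(S))
        new_w.append(w * lik)
    new_w = np.array(new_w)
    return new_means, new_covs, new_w / new_w.sum()

# Two-component prior over the density of one road segment: free flow vs congested.
means = [np.array([30.0]), np.array([75.0])]
covs = [np.eye(1) * 25.0, np.eye(1) * 25.0]
weights = np.array([0.5, 0.5])
H, R = np.eye(1), np.eye(1) * 16.0               # direct, noisy density sensor
means, covs, weights = gmm_measurement_update(means, covs, weights,
                                              y=np.array([70.0]), H=H, R=R)
print("posterior component weights:", np.round(weights, 3))
```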

    Pertanika Journal of Science & Technology


    High-Performance Modelling and Simulation for Big Data Applications

    This open access book was prepared as the final publication of the COST Action IC1406 “High-Performance Modelling and Simulation for Big Data Applications (cHiPSet)” project. Long considered important pillars of the scientific method, Modelling and Simulation have evolved from traditional discrete numerical methods to complex data-intensive continuous analytical optimisations. Resolution, scale, and accuracy have become essential to predict and analyse natural and complex systems in science and engineering. As their level of abstraction is raised to give a better discernment of the domain at hand, their representation becomes increasingly demanding in computational and data resources. On the other hand, High Performance Computing typically entails the effective use of parallel and distributed processing units coupled with efficient storage, communication and visualisation systems to underpin complex data-intensive applications in distinct scientific and technical domains. A seamless interaction of High Performance Computing with Modelling and Simulation is therefore arguably required in order to store, compute, analyse, and visualise large data sets in science and engineering. Funded by the European Commission, cHiPSet has provided a dynamic trans-European forum for its members and distinguished guests to openly discuss novel perspectives and topics of interest for these two communities. This cHiPSet compendium presents a set of selected case studies related to healthcare, biological data, computational advertising, multimedia, finance, bioinformatics, and telecommunications.