Vehicular Fog Computing Enabled Real-time Collision Warning via Trajectory Calibration
Vehicular fog computing (VFC) has been envisioned as a promising paradigm for
enabling a variety of emerging intelligent transportation systems (ITS).
However, due to inevitable and non-negligible impairments in wireless
communication, including transmission latency and packet loss, it remains
challenging to implement safety-critical applications, such as real-time
collision warning, in vehicular networks. In this paper, we present a vehicular
fog computing architecture, aiming at supporting effective and real-time
collision warning by offloading computation and communication overheads to
distributed fog nodes. On top of this architecture, we further propose a
trajectory-calibration-based collision warning (TCCW) algorithm along with
tailored communication protocols. Specifically, the application-layer
vehicle-to-infrastructure (V2I) communication delay is fitted with a Stable
distribution using real-world field-testing data. Then, a packet loss detection
mechanism is designed. Finally, TCCW calibrates real-time vehicle trajectories
based on received vehicle status, including GPS coordinates, velocity,
acceleration, and heading direction, as well as the estimated communication
delay and the detection of packet loss. For performance evaluation, we build
a simulation model and implement conventional solutions, including cloud-based
warning and fog-based warning without calibration, for comparison. Real-vehicle
trajectories are extracted as the input, and the simulation results demonstrate
the effectiveness of TCCW, which achieves the highest precision and recall
across a wide range of scenarios.
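The core idea of the calibration step, extrapolating a vehicle's reported state forward by the estimated communication delay, can be sketched as simple kinematic dead reckoning. This is an illustrative assumption about the mechanism, not the authors' TCCW implementation; the function name and signature are hypothetical.

```python
import math

def calibrate_position(x, y, speed, accel, heading_rad, delay_s):
    """Extrapolate a vehicle's planar position forward by the estimated
    V2I communication delay, assuming constant acceleration along a
    fixed heading (kinematic dead reckoning)."""
    # Distance covered during the delay: v*t + 0.5*a*t^2
    dist = speed * delay_s + 0.5 * accel * delay_s ** 2
    return (x + dist * math.cos(heading_rad),
            y + dist * math.sin(heading_rad))
```

For example, a vehicle moving at 10 m/s with zero acceleration and a 0.5 s estimated delay would be shifted 5 m along its heading before collision checking.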
Robust visual odometry using uncertainty models
In dense, urban environments, GPS by itself cannot be relied on to provide accurate positioning information. Signal reception issues (e.g. occlusion, multi-path effects) often prevent the GPS receiver from getting a positional lock, causing holes in the absolute positioning data. In order to keep assisting the driver, other sensors are required to track the vehicle motion during these periods of GPS disturbance. In this paper, we propose a novel method that uses a single on-board consumer-grade camera to estimate the relative vehicle motion. The method is based on the tracking of ground-plane features, taking into account the uncertainty on their backprojection as well as the uncertainty on the vehicle motion. A Hough-like parameter-space vote is employed to extract motion parameters from the uncertainty models. The method is easy to calibrate and designed to be robust to outliers and bad feature quality. Preliminary testing shows good accuracy and reliability, with a positional estimate within 2 metres over a 400-metre elapsed distance. The effects of inaccurate calibration are examined using artificial datasets, suggesting that a self-calibrating system may be possible in future work.
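The Hough-like vote over motion parameters can be illustrated with a minimal sketch: each tracked ground-plane feature pair casts a vote for a planar translation, and the most-voted bin wins, so bad tracks are simply outvoted. This simplified version omits rotation and the per-feature uncertainty weighting the abstract mentions; the function name and parameters are hypothetical.

```python
import numpy as np

def vote_translation(prev_pts, curr_pts, bin_size=0.05, max_t=2.0):
    """Hough-style vote: each ground-plane feature correspondence
    votes for a planar translation (dx, dy) bin; the bin with the
    most votes yields the motion estimate (bin centre)."""
    n_bins = int(2 * max_t / bin_size)
    acc = np.zeros((n_bins, n_bins), dtype=int)
    for (x0, y0), (x1, y1) in zip(prev_pts, curr_pts):
        dx, dy = x1 - x0, y1 - y0
        i = int((dx + max_t) / bin_size)
        j = int((dy + max_t) / bin_size)
        if 0 <= i < n_bins and 0 <= j < n_bins:  # ignore out-of-range votes
            acc[i, j] += 1
    i, j = np.unravel_index(np.argmax(acc), acc.shape)
    return ((i + 0.5) * bin_size - max_t, (j + 0.5) * bin_size - max_t)
```

In the full method, each vote would additionally be spread according to the backprojection and motion uncertainty models rather than counted as a single hit.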
Simultaneous Parameter Calibration, Localization, and Mapping
The calibration parameters of a mobile robot play a substantial role in navigation tasks. Often these parameters are subject to variations that depend either on changes in the environment or on the load of the robot. In this paper, we propose an approach to simultaneously estimate a map of the environment, the position of the on-board sensors of the robot, and its kinematic parameters. Our method requires no prior knowledge about the environment and relies only on a rough initial guess of the parameters of the platform. The proposed approach estimates the parameters online and is able to adapt to non-stationary changes of the configuration. We tested our approach in simulated environments and on a wide range of real-world data using different types of robotic platforms. (C) 2012 Taylor & Francis and The Robotics Society of Japan.
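To make concrete which kinematic parameters are at stake, a differential-drive forward model parameterised by the wheel radii and baseline can serve as an illustration. These are exactly the quantities such an approach would estimate online; the code below is a generic textbook odometry step, not the authors' implementation, and all names are hypothetical.

```python
import math

def odom_step(x, y, theta, ticks_l, ticks_r, r_l, r_r, baseline):
    """One differential-drive odometry update. The wheel radii
    (r_l, r_r, metres per encoder tick) and the wheel baseline are
    the kinematic calibration parameters; errors in them bias every
    pose increment, which is why online estimation matters."""
    d_l = ticks_l * r_l            # left wheel arc length
    d_r = ticks_r * r_r            # right wheel arc length
    d = (d_l + d_r) / 2.0          # displacement of the robot centre
    dth = (d_r - d_l) / baseline   # heading change
    # Midpoint integration of the unicycle model
    return (x + d * math.cos(theta + dth / 2.0),
            y + d * math.sin(theta + dth / 2.0),
            theta + dth)
```

A load change that compresses one tyre effectively changes its radius, so repeated application of this step with stale parameters drifts, motivating the simultaneous, online estimation described above.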