524 research outputs found

    Multisensor-based human detection and tracking for mobile service robots

    One of the fundamental issues for service robots is human-robot interaction. In order to perform such a task and provide the desired services, these robots need to detect and track people in their surroundings. In the present paper, we propose a solution for human tracking with a mobile robot that implements multisensor data fusion techniques. The system utilizes a new algorithm for laser-based leg detection using the on-board LRF. The approach is based on the recognition of typical leg patterns extracted from laser scans, which are shown to be very discriminative even in cluttered environments. These patterns can be used to localize both static and walking persons, even when the robot moves. Furthermore, faces are detected using the robot's camera, and this information is fused with the legs' position using a sequential implementation of the Unscented Kalman Filter. The proposed solution is feasible for service robots with a similar device configuration and has been successfully implemented on two different mobile platforms. Several experiments illustrate the effectiveness of our approach, showing that robust human tracking can be performed within complex indoor environments.
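    As a concrete illustration of the kind of sequential multisensor update described above, the following Python sketch fuses leg and face detections into a single Unscented Kalman Filter. It is not the authors' implementation: it assumes the filterpy library, a planar constant-velocity motion model, and that both detectors can be reduced to 2-D position measurements; all noise values are made up for the example.

```python
# Minimal sketch (assumptions, not the paper's code): sequential UKF update
# with two position-like measurement sources (legs from the LRF, face from
# the camera), using the filterpy library.
import numpy as np
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

dt = 0.1  # hypothetical sensor period [s]

def fx(x, dt):
    # Constant-velocity motion model: state = [px, py, vx, vy].
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)
    return F @ x

def hx(x):
    # Both sensors are reduced to planar position measurements here.
    return x[:2]

points = MerweScaledSigmaPoints(n=4, alpha=0.1, beta=2.0, kappa=0.0)
ukf = UnscentedKalmanFilter(dim_x=4, dim_z=2, dt=dt, fx=fx, hx=hx, points=points)
ukf.x = np.zeros(4)
ukf.P = np.eye(4)
ukf.Q = np.eye(4) * 0.01

R_legs = np.eye(2) * 0.05   # laser legs: relatively precise (assumed value)
R_face = np.eye(2) * 0.20   # camera face projected to the floor plane: noisier

def step(z_legs=None, z_face=None):
    """One tracking cycle: predict, then apply whichever detections arrived,
    one after the other (sequential update)."""
    ukf.predict()
    if z_legs is not None:
        ukf.update(np.asarray(z_legs), R=R_legs)
    if z_face is not None:
        ukf.update(np.asarray(z_face), R=R_face)
    return ukf.x.copy()

# Example: both detectors report a person near (1.0, 0.5) m.
print(step(z_legs=[1.0, 0.5], z_face=[1.1, 0.45]))
```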

    A bank of unscented Kalman filters for multimodal human perception with mobile service robots

    A new generation of mobile service robots could be ready soon to operate in human environments if they can robustly estimate the position and identity of surrounding people. Researchers in this field face a number of challenging problems, among which are sensor uncertainties and real-time constraints. In this paper, we propose a novel and efficient solution for simultaneous tracking and recognition of people within the observation range of a mobile robot. Multisensor techniques for leg and face detection are fused in a robust probabilistic framework with height, clothes and face recognition algorithms. The system is based on an efficient bank of Unscented Kalman Filters that keeps a multi-hypothesis estimate of the person being tracked, including the case where the latter is unknown to the robot. Several experiments with real mobile robots are presented to validate the proposed approach. They show that our solutions can improve the robot's perception and recognition of humans, providing a useful contribution to the future application of service robotics.
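    The multi-hypothesis identity estimate can be pictured as a Bayesian update over a small set of hypotheses, including an explicit "unknown person" one. The sketch below shows only that identity bookkeeping in Python; in the paper each hypothesis additionally carries its own Unscented Kalman Filter, and the names and likelihood values used here are purely illustrative.

```python
# Minimal sketch (illustrative assumptions): identity posterior over a bank
# of hypotheses, updated from per-frame recognition likelihoods
# (height, clothes, face). Not the paper's implementation.
import numpy as np

hypotheses = ["alice", "bob", "unknown"]     # hypothetical identities
prior = np.array([1.0, 1.0, 1.0]) / 3.0      # uniform initial belief

def update_identity(posterior, likelihoods):
    """Bayesian update of the identity posterior.
    likelihoods[i] = p(current observations | hypothesis i); in the paper
    each hypothesis would also carry its own UKF for the person's position."""
    posterior = posterior * np.asarray(likelihoods)
    return posterior / posterior.sum()

belief = prior
# Frame 1: the face recogniser weakly favours "alice".
belief = update_identity(belief, [0.6, 0.3, 0.1])
# Frame 2: height and clothes also match "alice".
belief = update_identity(belief, [0.7, 0.2, 0.1])
print(dict(zip(hypotheses, belief.round(3))))
```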

    Computationally efficient solutions for tracking people with a mobile robot: an experimental evaluation of Bayesian filters

    Service robots will soon become an essential part of modern society. As they have to move and act in human environments, it is essential for them to be provided with a fast and reliable tracking system that localizes people in the neighbourhood. It is therefore important to select the most appropriate filter to estimate the position of these persons. This paper presents three efficient implementations of multisensor human tracking based on different Bayesian estimators: Extended Kalman Filter (EKF), Unscented Kalman Filter (UKF) and Sampling Importance Resampling (SIR) particle filter. The system implemented on a mobile robot is explained, introducing the methods used to detect and estimate the position of multiple people. Then, the solutions based on the three filters are discussed in detail. Several real experiments are conducted to evaluate their performance, which is compared in terms of accuracy, robustness and execution time of the estimation. The results show that a solution based on the UKF can perform as well as particle filters and is often a better choice when computational efficiency is a key issue.
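    Of the three estimators compared, the SIR particle filter is the least standard to set up, so a minimal Python sketch of it is given below. The constant-velocity motion model, Gaussian position likelihood and all parameter values are assumptions for illustration, not the paper's configuration.

```python
# Minimal SIR (Sampling Importance Resampling) particle filter sketch for
# planar person tracking. Model and parameters are assumed for illustration.
import numpy as np

rng = np.random.default_rng(0)
N = 500                                          # number of particles
dt = 0.1
particles = rng.normal(0.0, 1.0, size=(N, 4))    # state: [px, py, vx, vy]
weights = np.full(N, 1.0 / N)

def predict(particles):
    # Propagate with the motion model plus process noise.
    particles[:, 0] += particles[:, 2] * dt
    particles[:, 1] += particles[:, 3] * dt
    particles += rng.normal(0.0, 0.05, size=particles.shape)
    return particles

def update(particles, weights, z, sigma=0.2):
    # Weight by a Gaussian likelihood of the 2-D position measurement z.
    d2 = np.sum((particles[:, :2] - z) ** 2, axis=1)
    weights *= np.exp(-0.5 * d2 / sigma**2) + 1e-300
    return weights / weights.sum()

def resample(particles, weights):
    # Systematic resampling back to uniform weights.
    cum = np.cumsum(weights)
    cum[-1] = 1.0                                # guard against round-off
    positions = (rng.random() + np.arange(N)) / N
    idx = np.searchsorted(cum, positions)
    return particles[idx].copy(), np.full(N, 1.0 / N)

for z in ([1.0, 0.5], [1.05, 0.55], [1.1, 0.6]):  # fake leg detections
    particles = predict(particles)
    weights = update(particles, weights, np.asarray(z))
    particles, weights = resample(particles, weights)

print(np.average(particles[:, :2], axis=0, weights=weights))
```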

    A Modified Bayesian Framework for Multi-Sensor Target Tracking with Out-of-Sequence-Measurements

    Target detection and tracking are important in military as well as civilian applications. In order to detect and track high-speed incoming threats, modern surveillance systems are equipped with multiple sensors to overcome the limitations of single-sensor tracking systems. This research proposes the use of information from radar and infrared (IR) sensors for tracking and estimating target state dynamics. A new technique is developed for information fusion of the two sensors in a way that enhances the performance of the data association algorithm. The measurement acquisition and processing times of these sensors are not the same; consequently, the fusion center measurements arrive out of sequence. To ensure the practicality of the system, the proposed algorithm compensates for out-of-sequence measurements (OOSMs) in a cluttered environment. This is achieved by a novel algorithm which incorporates a retrodiction-based approach to compensate for the effects of OOSMs in a modified Bayesian technique. The proposed modification includes a new gating strategy to fuse and select measurements from the two sensors which originate from the same target. The state estimation performance is evaluated in terms of Root Mean Squared Error (RMSE) for both position and velocity, whereas track retention statistics are evaluated to gauge the performance of the proposed tracking algorithm. The results clearly show that the proposed technique improves track retention and false track discrimination (FTD).
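    The paper's modified gating strategy is not spelled out in the abstract; the sketch below shows the standard ellipsoidal (chi-square) gating test on which such strategies are usually built, with illustrative numbers. It assumes NumPy and SciPy.

```python
# Minimal sketch of ellipsoidal (chi-square) gating: accept a measurement
# only if its Mahalanobis distance to the predicted measurement is inside
# the gate. All numeric values below are illustrative assumptions.
import numpy as np
from scipy.stats import chi2

def in_gate(z, z_pred, S, prob=0.99):
    """Return True if measurement z falls inside the chi-square gate around
    the predicted measurement z_pred with innovation covariance S."""
    nu = np.asarray(z) - np.asarray(z_pred)
    d2 = nu @ np.linalg.solve(S, nu)
    return d2 <= chi2.ppf(prob, df=len(nu))

# Example: predicted target position and innovation covariance (assumed).
z_pred = np.array([100.0, 200.0])          # metres
S = np.diag([25.0, 25.0])

print(in_gate([103.0, 198.0], z_pred, S))  # inside the gate  -> True
print(in_gate([140.0, 260.0], z_pred, S))  # far outside      -> False
```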

    Asynchronous sensor fusion of GPS, IMU and CAN-based odometry for heavy-duty vehicles

    In heavy-duty vehicles, multiple signals are available to estimate the vehicle's kinematics, such as Inertial Measurement Unit (IMU) readings, Global Positioning System (GPS) fixes, and linear and angular speed readings from wheel tachometers on the internal Controller Area Network (CAN). These signals have different noise variances, bandwidths and sampling rates (the latter possibly irregular). In this paper we present a non-linear sensor fusion algorithm allowing asynchronous sampling and non-causal smoothing. It is applied to achieve accuracy improvements when incorporating odometry measurements from the CAN bus into standard GPS+IMU kinematic estimation, as well as robustness against missing data. Our results show that this asynchronous multi-sensor (GPS+IMU+CAN-based odometry) fusion is advantageous in low-speed manoeuvres, improving accuracy and robustness to missing data thanks to non-causal filtering. The proposed algorithm is based on the Extended Kalman Filter and Smoother, with exponential discretization of the continuous-time stochastic differential equations in order to process measurements at arbitrary time instants; it can provide data to subsequent processing steps at arbitrary time instants, not necessarily coincident with the original measurement ones. Given the extra information available in the smoothing case, its estimation performance is less sensitive to the noise-variance parameter setting, compared to causal filtering. Working Matlab code is provided at the end of this work.
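    The exponential-discretization step mentioned in the abstract can be illustrated with Van Loan's matrix-exponential construction, which yields a transition matrix and process-noise covariance for an arbitrary time step and therefore supports asynchronous measurements. The sketch below uses a simple 1-D constant-velocity model as a stand-in; the actual vehicle model and noise levels are assumptions, and the example is in Python rather than the paper's Matlab.

```python
# Minimal sketch: exponential discretization of a continuous-time linear SDE
# dx = A x dt + G dw  at an arbitrary step dt, via Van Loan's method, so a
# filter/smoother can process asynchronous measurements.
import numpy as np
from scipy.linalg import expm

def discretize(A, G, Qc, dt):
    """Return (F, Q): discrete transition matrix and process-noise covariance
    for the continuous-time model with drift A and diffusion G Qc G^T."""
    n = A.shape[0]
    M = np.zeros((2 * n, 2 * n))
    M[:n, :n] = -A
    M[:n, n:] = G @ Qc @ G.T
    M[n:, n:] = A.T
    E = expm(M * dt)
    F = E[n:, n:].T          # transition matrix exp(A*dt)
    Q = F @ E[:n, n:]        # discrete process-noise covariance
    return F, Q

# 1-D constant-velocity model driven by white acceleration noise (assumed).
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
G = np.array([[0.0],
              [1.0]])
Qc = np.array([[0.5]])       # acceleration noise spectral density (assumed)

for dt in (0.01, 0.13, 1.0):  # irregular sampling intervals
    F, Q = discretize(A, G, Qc, dt)
    print(dt, F.round(3).tolist(), Q.round(5).tolist())
```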

    Radar networks: A review of features and challenges

    Networks of multiple radars are typically used to improve coverage and tracking accuracy. Recently, such networks have facilitated the deployment of commercial radars for civilian applications such as healthcare, gesture recognition, home security, and autonomous automobiles. They exploit advanced signal processing techniques together with efficient data fusion methods in order to yield high performance in event detection and tracking. This paper reviews outstanding features of radar networks, their challenges, and their state-of-the-art solutions from the perspective of signal processing. Each of the discussed subjects can evolve into an active research topic.
    Comment: To appear soon in Information Fusion
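    As one representative data-fusion building block for radar networks (not a method prescribed by this review), the sketch below applies covariance intersection to fuse two radar tracks of the same target whose error correlation is unknown; all numeric values are assumed.

```python
# Minimal covariance-intersection sketch for track-level fusion of two radar
# estimates with unknown cross-correlation. Illustrative values only.
import numpy as np

def covariance_intersection(x1, P1, x2, P2, steps=101):
    """Fuse (x1, P1) and (x2, P2), choosing the mixing weight omega that
    minimises the trace of the fused covariance."""
    best = None
    for w in np.linspace(0.0, 1.0, steps):
        info = w * np.linalg.inv(P1) + (1.0 - w) * np.linalg.inv(P2)
        P = np.linalg.inv(info)
        if best is None or np.trace(P) < np.trace(best[1]):
            x = P @ (w * np.linalg.solve(P1, x1) + (1.0 - w) * np.linalg.solve(P2, x2))
            best = (x, P)
    return best

# Two radar tracks of the same target (positions in metres, assumed values).
x1, P1 = np.array([1000.0, 500.0]), np.diag([100.0, 400.0])
x2, P2 = np.array([1010.0, 495.0]), np.diag([400.0, 100.0])
x, P = covariance_intersection(x1, P1, x2, P2)
print(x.round(1), np.trace(P).round(1))
```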