
    Sensor Data Fusion using Unscented Kalman Filter for VOR-based Vision Tracking System for Mobile Robots

    This paper presents sensor data fusion using the Unscented Kalman Filter (UKF) to implement a high-performance vestibulo-ocular reflex (VOR) based vision tracking system for mobile robots. Information from various sensors must be integrated by an efficient sensor fusion algorithm to achieve continuous and robust vision tracking. We use data from a low-cost accelerometer, a gyroscope, and wheel encoders to compute the robot's motion. The UKF serves as the sensor fusion algorithm: it is an advanced filtering technique that outperforms the widely used Extended Kalman Filter (EKF) in many applications. The system compensates for slip errors by switching between two UKF models, one built for the slip case and one for the no-slip case. Since accelerometer error accumulates over time because of double integration, the system uses accelerometer data only in the slip-case UKF model. Using UKF-based sensor fusion, the position and orientation of the robot are estimated and used to rotate a camera mounted on top of the robot towards a fixed target, a concept derived from the vestibulo-ocular reflex of the human eye. The experimental results show that the system tracks the fixed target in various robot motion scenarios, including one in which an intentional slip is generated during navigation.
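The UKF predict/update cycle the abstract relies on can be sketched as follows. This is a minimal NumPy illustration with an assumed linear constant-velocity model and illustrative noise values, not the authors' slip-aware implementation; sigma-point parameters follow the common Merwe scaled formulation.

```python
import numpy as np

def sigma_points(x, P, alpha=1e-3, beta=2.0, kappa=0.0):
    """Merwe scaled sigma points and weights for mean x, covariance P."""
    n = len(x)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)
    pts = np.vstack([x, x + S.T, x - S.T])          # 2n+1 sigma points
    wm = np.full(2 * n + 1, 0.5 / (n + lam))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    return pts, wm, wc

def unscented_transform(pts, wm, wc, noise):
    """Weighted mean and covariance of transformed sigma points."""
    mean = wm @ pts
    d = pts - mean
    return mean, d.T @ (wc[:, None] * d) + noise

def ukf_step(x, P, z, f, h, Q, R):
    """One UKF cycle: process model f, measurement model h."""
    pts, wm, wc = sigma_points(x, P)
    pts_f = np.array([f(p) for p in pts])           # propagate through dynamics
    x_pred, P_pred = unscented_transform(pts_f, wm, wc, Q)
    pts2, wm2, wc2 = sigma_points(x_pred, P_pred)
    pts_h = np.array([h(p) for p in pts2])          # predicted measurements
    z_pred, S = unscented_transform(pts_h, wm2, wc2, R)
    Pxz = (pts2 - x_pred).T @ (wc2[:, None] * (pts_h - z_pred))
    K = Pxz @ np.linalg.inv(S)                      # Kalman gain
    return x_pred + K @ (z - z_pred), P_pred - K @ S @ K.T

# demo: constant-velocity robot, position-only measurements (values illustrative)
dt = 1.0
F = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1]])
Q, R = 0.01 * np.eye(4), 0.1 * np.eye(2)
x, P = np.zeros(4), np.eye(4)
for t in range(1, 11):                              # robot moves 1 m/s in x, 0.5 m/s in y
    z = np.array([1.0 * t, 0.5 * t])
    x, P = ukf_step(x, P, z, lambda s: F @ s, lambda s: s[:2], Q, R)
```

With a linear model the UKF reduces to the ordinary Kalman filter; its advantage appears when `f` or `h` is nonlinear, e.g. the slip and no-slip motion models mentioned above.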

    A bank of unscented Kalman filters for multimodal human perception with mobile service robots

    A new generation of mobile service robots could soon be ready to operate in human environments if they can robustly estimate the position and identity of surrounding people. Researchers in this field face a number of challenging problems, among which are sensor uncertainties and real-time constraints. In this paper, we propose a novel and efficient solution for simultaneous tracking and recognition of people within the observation range of a mobile robot. Multisensor techniques for leg and face detection are fused with height, clothing and face recognition algorithms in a robust probabilistic framework. The system is based on an efficient bank of Unscented Kalman Filters that maintains a multi-hypothesis estimate of the person being tracked, including the case where that person is unknown to the robot. Several experiments with real mobile robots validate the proposed approach, showing that our solution improves the robot's perception and recognition of humans and provides a useful contribution to future service robotics applications.
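The weighting logic of such a filter bank can be sketched as follows: one hypothesis per candidate identity (plus an "unknown" catch-all with a broad model), each reweighted by its measurement likelihood. The scalar height feature, the Gaussian models, and all numeric values are illustrative assumptions; the paper's bank runs a full UKF per hypothesis.

```python
import numpy as np

class HypothesisBank:
    """One hypothesis per candidate identity (plus 'unknown').
    Weights are updated with each hypothesis's measurement likelihood."""

    def __init__(self, expected, variance):
        self.mu = dict(expected)    # expected feature value (e.g. height) per identity
        self.var = dict(variance)   # per-identity measurement variance
        n = len(self.mu)
        self.w = {name: 1.0 / n for name in self.mu}

    def update(self, z):
        # Bayes rule: w_i <- w_i * N(z; mu_i, var_i), then normalize
        for name in self.w:
            lik = np.exp(-0.5 * (z - self.mu[name]) ** 2 / self.var[name]) \
                  / np.sqrt(2.0 * np.pi * self.var[name])
            self.w[name] *= lik
        total = sum(self.w.values())
        for name in self.w:
            self.w[name] /= total

    def best(self):
        return max(self.w, key=self.w.get)

# demo: height measurements near 1.75 m should favour 'alice' (values illustrative);
# 'unknown' gets a wide variance so it wins only when no model fits well
bank = HypothesisBank({"alice": 1.75, "bob": 1.60, "unknown": 1.70},
                      {"alice": 0.01, "bob": 0.01, "unknown": 1.0})
for z in (1.74, 1.76, 1.75):
    bank.update(z)
```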

    Computationally efficient solutions for tracking people with a mobile robot: an experimental evaluation of Bayesian filters

    Service robots will soon become an essential part of modern society. As they have to move and act in human environments, they must be provided with a fast and reliable tracking system that localizes people in their neighbourhood, so it is important to select the most appropriate filter to estimate the position of these persons. This paper presents three efficient implementations of multisensor human tracking based on different Bayesian estimators: the Extended Kalman Filter (EKF), the Unscented Kalman Filter (UKF) and the Sampling Importance Resampling (SIR) particle filter. The system implemented on a mobile robot is explained, introducing the methods used to detect and estimate the position of multiple people. The solutions based on the three filters are then discussed in detail. Several real experiments evaluate their performance, which is compared in terms of accuracy, robustness and execution time. The results show that a solution based on the UKF can perform as well as particle filters and is often a better choice when computational efficiency is a key issue.
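The third estimator compared above, the SIR particle filter, can be sketched in a few lines for a 1-D target; the random-walk motion model, noise levels, and resampling threshold are illustrative assumptions, but the propagate/reweight/resample structure is the standard SIR cycle whose per-step cost grows with the particle count.

```python
import numpy as np

rng = np.random.default_rng(0)

def sir_step(particles, weights, z, q_std, r_std):
    """One SIR cycle for a 1-D random-walk target: propagate, reweight, resample."""
    n = len(particles)
    particles = particles + rng.normal(0.0, q_std, n)                   # predict
    weights = weights * np.exp(-0.5 * ((z - particles) / r_std) ** 2)   # likelihood
    weights = weights / weights.sum()
    if 1.0 / np.sum(weights ** 2) < n / 2:          # low effective sample size
        u = (rng.random() + np.arange(n)) / n       # systematic resampling
        idx = np.minimum(np.searchsorted(np.cumsum(weights), u), n - 1)
        particles, weights = particles[idx], np.full(n, 1.0 / n)
    return particles, weights

# demo: person standing at 2.0 m; the filter converges from a uniform prior
particles = rng.uniform(0.0, 5.0, 500)
weights = np.full(500, 1.0 / 500)
for _ in range(5):
    particles, weights = sir_step(particles, weights, z=2.0, q_std=0.1, r_std=0.2)
estimate = float(weights @ particles)
```

The 500 likelihood evaluations per step hint at why the paper finds the UKF, which propagates only a handful of sigma points, attractive when execution time matters.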

    Multisensor-based human detection and tracking for mobile service robots

    One of the fundamental issues for service robots is human-robot interaction. In order to perform such tasks and provide the desired services, these robots need to detect and track people in their surroundings. In this paper, we propose a solution for human tracking with a mobile robot that implements multisensor data fusion techniques. The system utilizes a new algorithm for laser-based leg detection using the on-board laser range finder (LRF). The approach is based on the recognition of typical leg patterns extracted from laser scans, which prove very discriminative even in cluttered environments. These patterns can be used to localize both static and walking persons, even while the robot moves. Furthermore, faces are detected with the robot's camera, and this information is fused with the leg positions using a sequential implementation of the Unscented Kalman Filter. The proposed solution is feasible for service robots with a similar device configuration and has been successfully implemented on two different mobile platforms. Several experiments illustrate the effectiveness of our approach, showing that robust human tracking can be performed within complex indoor environments.
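A basic form of laser-based leg detection can be sketched as follows: split the scan into segments at range discontinuities, then keep segments whose chord width is plausible for a human leg. The jump threshold and width bounds are illustrative assumptions; the paper's pattern recognition is richer than this geometric filter.

```python
import numpy as np

def detect_legs(ranges, angles, jump=0.1, min_w=0.05, max_w=0.25):
    """Split a scan into segments at range discontinuities and keep segments
    whose chord width matches a typical human leg (bounds are assumptions)."""
    xs, ys = ranges * np.cos(angles), ranges * np.sin(angles)
    breaks = np.where(np.abs(np.diff(ranges)) > jump)[0] + 1
    legs = []
    for seg in np.split(np.arange(len(ranges)), breaks):
        if len(seg) < 2:
            continue
        width = np.hypot(xs[seg[-1]] - xs[seg[0]], ys[seg[-1]] - ys[seg[0]])
        if min_w <= width <= max_w:
            legs.append((xs[seg].mean(), ys[seg].mean()))  # leg centre, robot frame
    return legs

# demo: synthetic scan of a wall at 3 m with one leg-like object at 1 m ahead
angles = np.linspace(-0.5, 0.5, 101)
ranges = np.full(101, 3.0)
ranges[np.abs(angles) < 0.075] = 1.0      # ~0.14 m wide object at 1 m range
legs = detect_legs(ranges, angles)
```

The wall segments are rejected because their chord width exceeds `max_w`; only the narrow foreground object survives as a leg candidate.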

    Multisensor data fusion for joint people tracking and identification with a service robot

    Tracking and recognizing people are essential skills that modern service robots must be provided with. The two tasks are generally performed independently, using ad-hoc solutions that first estimate the location of humans and then proceed with their identification. The solution presented in this paper, instead, is a general framework for tracking and recognizing people simultaneously with a mobile robot, where the estimates of human location and identity are fused using probabilistic techniques. Our approach takes inspiration from recent implementations of joint tracking and classification, where the considered targets are mainly vehicles and aircraft in military and civilian applications. We illustrate how people can be robustly tracked and recognized with a service robot using improved histogram-based detection and multisensor data fusion. Experiments in challenging real-world scenarios show the good performance of our solution.
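The histogram-matching side of such identification can be sketched with a Bhattacharyya similarity between a person's observed appearance histogram and stored models. The Bhattacharyya coefficient is a common choice for histogram comparison, not necessarily the paper's exact measure, and the 4-bin histograms below are illustrative.

```python
import numpy as np

def bhattacharyya(p, q):
    """Similarity between two normalized histograms (1.0 = identical)."""
    return float(np.sum(np.sqrt(p * q)))

def identify(observed, models):
    """Return the identity whose stored appearance histogram best matches
    the observed one, together with all similarity scores."""
    scores = {name: bhattacharyya(observed, h) for name, h in models.items()}
    return max(scores, key=scores.get), scores

# demo: 4-bin clothing-colour histograms (normalized; values illustrative)
models = {"alice": np.array([0.7, 0.2, 0.1, 0.0]),
          "bob":   np.array([0.0, 0.1, 0.2, 0.7])}
observed = np.array([0.6, 0.3, 0.1, 0.0])
who, scores = identify(observed, models)
```

In a joint tracking-and-classification framework these similarity scores would enter the identity posterior alongside the motion likelihoods, rather than being thresholded in isolation.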

    Navigation and Control of Mobile Robot Using Sensor Fusion


    A decentralized motion coordination strategy for dynamic target tracking

    This paper presents a decentralized motion planning algorithm for the distributed sensing of a noisy dynamic process by multiple cooperating mobile sensor agents. The problem is motivated by the localization and tracking of dynamic targets. Our gradient-descent method is based on a cost function that measures the overall quality of sensing. We also investigate the role of imperfect communication between the sensor agents in this framework and examine the trade-offs between sensing and communication performance. Simulations illustrate the basic characteristics of the algorithms.
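The gradient-descent coordination idea can be sketched with a toy sensing cost. The quadratic cost below is an illustrative stand-in for the paper's sensing-quality measure; because it decouples across agents, each agent can descend its own gradient term without a central planner, which is what makes the scheme decentralized.

```python
import numpy as np

def sensing_cost(agents, target):
    """Toy sensing-quality cost: sum of squared agent-target distances,
    standing in for the paper's measure of overall sensing quality."""
    return float(np.sum((agents - target) ** 2))

def gradient_step(agents, target, eta=0.1):
    """Each agent moves along the negative gradient of its own cost term;
    communication would only be needed for coupled (inter-agent) costs."""
    grad = 2.0 * (agents - target)      # per-agent gradient, decoupled here
    return agents - eta * grad

# demo: three agents converge on a target at (2, 1)
agents = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 4.0]])
target = np.array([2.0, 1.0])
c0 = sensing_cost(agents, target)
for _ in range(50):
    agents = gradient_step(agents, target)
c1 = sensing_cost(agents, target)
```

With a coupled cost (e.g. penalizing agents that cluster too closely), each gradient term would depend on neighbours' positions, and the quality of inter-agent communication would directly affect the descent, which is the trade-off the paper examines.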