34 research outputs found

    Multisensor-based human detection and tracking for mobile service robots

    One of the fundamental issues for service robots is human-robot interaction. In order to perform such tasks and provide the desired services, these robots need to detect and track people in their surroundings. In the present paper, we propose a solution for human tracking with a mobile robot that implements multisensor data fusion techniques. The system utilizes a new algorithm for laser-based leg detection using the on-board LRF. The approach is based on the recognition of typical leg patterns extracted from laser scans, which are shown to be highly discriminative even in cluttered environments. These patterns can be used to localize both static and walking persons, even when the robot moves. Furthermore, faces are detected using the robot's camera, and this information is fused with the leg positions using a sequential implementation of the Unscented Kalman Filter. The proposed solution is feasible for service robots with a similar device configuration and has been successfully implemented on two different mobile platforms. Several experiments illustrate the effectiveness of our approach, showing that robust human tracking can be performed within complex indoor environments.
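    The abstract describes sequentially fusing leg detections (LRF) and face detections (camera) into one person-position estimate. Below is a minimal sketch of that idea; the paper uses a sequential Unscented Kalman Filter, whereas this sketch substitutes a plain linear Kalman filter with a constant-velocity model, and all matrices, noise levels, and measurement values are illustrative assumptions rather than the authors' parameters.

```python
# Sequential fusion of leg and face position measurements with a linear Kalman
# filter (stand-in for the paper's sequential UKF). All values are assumptions.
import numpy as np

dt = 0.1                                   # assumed sensor period [s]
F = np.array([[1, 0, dt, 0],               # state: [x, y, vx, vy]
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]])
H = np.array([[1, 0, 0, 0],                # both sensors observe (x, y)
              [0, 1, 0, 0]])
Q = 0.05 * np.eye(4)                       # process noise (assumed)
R_legs = 0.02 * np.eye(2)                  # LRF legs: low positional noise (assumed)
R_face = 0.10 * np.eye(2)                  # camera face: higher noise (assumed)

x = np.zeros(4)                            # initial state
P = np.eye(4)

def predict(x, P):
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, R):
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    return x + K @ y, (np.eye(4) - K @ H) @ P

# One fusion cycle: predict, then apply each available measurement in turn.
x, P = predict(x, P)
x, P = update(x, P, np.array([1.2, 0.4]), R_legs)   # legs detection (hypothetical)
x, P = update(x, P, np.array([1.3, 0.5]), R_face)   # face detection (hypothetical)
print("fused person position estimate:", x[:2])
```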

    Robust human detection with occlusion handling by fusion of thermal and depth images from mobile robot

    In this paper, a robust surveillance system that enables robots to detect humans in indoor environments is proposed. The proposed method is based on fusing information from thermal and depth images, which allows the detection of humans even under occlusion. The method consists of three stages: pre-processing, ROI generation, and object classification. A new dataset was developed to evaluate the performance of the proposed method. The experimental results show that the proposed method is able to detect multiple humans under occlusions and illumination variations.
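    As a rough illustration of the three-stage pipeline named above (pre-processing, ROI generation, object classification), the sketch below thresholds registered thermal and depth images to propose candidate regions. The paper's actual classifier is not reproduced; a crude size-versus-distance heuristic, synthetic images, and all threshold values are assumptions introduced here.

```python
# Sketch of a thermal + depth ROI pipeline under assumed thresholds and synthetic data.
import numpy as np
from scipy import ndimage

# Pre-processing: synthetic, already-registered thermal (deg C) and depth (m) images.
thermal = np.full((120, 160), 22.0)
depth = np.full((120, 160), 4.0)
thermal[40:100, 60:90] = 34.0          # warm blob (hypothetical person)
depth[40:100, 60:90] = 2.0

# ROI generation: warm pixels within a plausible distance band.
mask = (thermal > 30.0) & (depth > 0.5) & (depth < 3.5)   # thresholds are assumptions
labels, n = ndimage.label(mask)

# Classification: keep blobs whose pixel area suggests a person at that depth.
for i in range(1, n + 1):
    blob = labels == i
    area = blob.sum()
    mean_d = depth[blob].mean()
    if area > 500 / mean_d:            # crude size-vs-distance rule (assumption)
        ys, xs = np.nonzero(blob)
        print(f"candidate human ROI: x={xs.min()}..{xs.max()}, y={ys.min()}..{ys.max()}")
```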

    A long-term Human-Robot Proxemic study

    A long-term Human-Robot Proxemic (HRP) study was performed using a newly developed Autonomous Proxemic System (APS) for a robot to measure and control the approach distances to the human participants. The main findings were that most HRP adaptation occurred in the first two interaction sessions; for the remaining four weeks, approach distance preferences remained relatively steady, apart from some short periods of increased distances for some participants. There were indications that these were associated with episodes where the robot malfunctioned, which raises the possibility of users' trust in the robot affecting HRP distance. The study also found that approach distances for humans approaching the robot and for the robot approaching the human were comparable, though there were indications that humans preferred to approach the robot more closely than they allowed the robot to approach them in a physically restricted area. Two participants left the study prematurely, stating they were bored with the repetitive experimental procedures. This highlights issues related to the often incompatible demands of keeping experimentally controlled conditions vs. having realistic, engaging and varied HRI trial scenarios.

    Human detection using laser rangefinder

    This thesis deals with the detection of dynamic objects, primarily people, and the tracking of their motion. Detection is based on data from a laser rangefinder. Motion tracking is based on comparing data from multiple scans. The localization algorithm is implemented in Matlab, where a simulation of scanned data is also created; the algorithm's filter parameters are tested and optimized on these data.
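    A minimal sketch of the scan-comparison idea described above: range readings that change significantly between two consecutive LRF scans are marked as belonging to dynamic objects. The thesis implements this in Matlab on simulated data; the Python sketch below, its synthetic scans, and the change threshold are only illustrative assumptions.

```python
# Detect dynamic points by comparing two laser scans taken over the same angles.
import numpy as np

angles = np.linspace(-np.pi / 2, np.pi / 2, 181)   # assumed 1-degree angular resolution
scan_prev = np.full_like(angles, 5.0)              # static background at 5 m
scan_curr = scan_prev.copy()
scan_curr[80:95] = 2.5                             # a person has stepped into view

range_change = np.abs(scan_curr - scan_prev)
dynamic = range_change > 0.3                       # threshold is an assumed filter parameter

# Report dynamic points in Cartesian coordinates (robot frame).
xs = scan_curr[dynamic] * np.cos(angles[dynamic])
ys = scan_curr[dynamic] * np.sin(angles[dynamic])
print(f"{dynamic.sum()} dynamic points detected, centroid at "
      f"({xs.mean():.2f} m, {ys.mean():.2f} m)")
```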

    People Detection and Tracking Using LIDAR Sensors

    Special Issue Robotics in Spain 2019. [EN] The tracking of people is an indispensable capacity in almost any robotic application. A relevant case is the @home robotic competitions, where service robots have to demonstrate that they possess certain skills that allow them to interact with the environment and the people who occupy it; for example, receiving the people who knock at the door and attending to them as appropriate. Many of these skills are based on the ability to detect and track a person. It is a challenging problem, particularly when implemented using low-definition sensors, such as Laser Imaging Detection and Ranging (LIDAR) sensors, in environments where several people are interacting. This work describes a solution based on a single LIDAR sensor to maintain a continuous identification of a person in time and space. The system described is based on the People Tracker package, aka PeTra, which uses a convolutional neural network to identify person legs in complex environments. A new feature has been included within the system to correlate the people location estimates over time by using a Kalman filter. To validate the solution, a set of experiments has been carried out in a test environment certified by the European Robotic League. Funding: Junta de Castilla y León (LE028P17); Comunidad de Madrid (RoboCity2030-Fase 3).
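    The new feature described above correlates per-frame detections over time with a Kalman filter. The sketch below shows one way such correlation can work: a constant-velocity prediction gives an expected position, and the nearest detection within a gate updates the track. PeTra's CNN leg detector is not reproduced; the detections, gate radius, and noise levels are hypothetical assumptions.

```python
# Correlate per-frame leg detections into one track with a constant-velocity Kalman filter.
import numpy as np

dt, gate = 0.2, 0.8                        # assumed frame period [s] and gating radius [m]
F = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1]])
H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]])
Q, R = 0.05 * np.eye(4), 0.04 * np.eye(2)  # assumed noise levels

x, P = np.array([1.0, 0.0, 0.0, 0.0]), np.eye(4)   # track initialized on a first detection

frames = [                                  # hypothetical CNN leg-centroid detections (x, y)
    [np.array([1.1, 0.1]), np.array([3.0, 2.0])],
    [np.array([1.2, 0.2]), np.array([3.1, 2.0])],
]
for detections in frames:
    x, P = F @ x, F @ P @ F.T + Q                          # predict
    dists = [np.linalg.norm(d - H @ x) for d in detections]
    if min(dists) < gate:                                  # associate nearest detection
        z = detections[int(np.argmin(dists))]
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x, P = x + K @ (z - H @ x), (np.eye(4) - K @ H) @ P
    print("tracked person at", np.round(H @ x, 2))
```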

    KALMAN FILTER AND NARX NEURAL NETWORK FOR ROBOT VISION BASED HUMAN TRACKING

    Tracking humans is an important and challenging problem in video-based intelligent robot systems. In this paper, a vision-based human tracking system is intended to provide sensor input for vision-based control of a mobile robot that works in a team helping a human co-worker. A comparison between a NARX neural network and a Kalman filter in solving the prediction problem of human tracking in robot vision is presented. After collecting video data from a robot, simulation results obtained from the Kalman filter model are compared with the simulation results obtained from the NARX neural network. Key words: robot vision, Kalman filter, neural networks, human tracking.
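    The comparison in this abstract concerns one-step-ahead prediction of a person's position from past observations. The sketch below is only a loose stand-in for it: a NARX-style predictor is approximated by an MLP regressor fed with lagged positions, and a constant-velocity extrapolation stands in for the Kalman filter's prediction step. The trajectory, lag count, and network size are assumptions, not the paper's data or models.

```python
# Compare a NARX-style lagged-input MLP against constant-velocity extrapolation
# for one-step-ahead prediction of a synthetic person trajectory.
import numpy as np
from sklearn.neural_network import MLPRegressor

t = np.arange(0, 20, 0.1)
pos = np.column_stack([0.5 * t, np.sin(0.3 * t)])        # synthetic person trajectory

lags = 3
X = np.hstack([pos[i:len(pos) - lags + i] for i in range(lags)])  # lagged positions as inputs
y = pos[lags:]                                                    # next position as target
split = int(0.8 * len(X))

narx = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
narx.fit(X[:split], y[:split])
pred_narx = narx.predict(X[split:])

# Constant-velocity extrapolation: next = current + (current - previous).
cur = pos[lags - 1 + split:len(pos) - 1]
prev = pos[lags - 2 + split:len(pos) - 2]
pred_cv = cur + (cur - prev)

def err(pred):
    return np.mean(np.linalg.norm(pred - y[split:], axis=1))

print(f"NARX-style MLP error: {err(pred_narx):.3f} m, "
      f"constant-velocity error: {err(pred_cv):.3f} m")
```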