
    Marker-Less Augmented Reality for Human Robot Interaction

    This paper presents a marker-less augmented reality system for in-situ visualization of a robot’s plans to the human operator. The system finds natural features in the environment and builds a 3D map of the working space during a mapping phase. A stereo-from-motion method computes the 3D positions of the natural features, while the camera position is computed from artificial markers placed in the working space, so the map is built in the fixed reference frame provided by those markers. Once the whole working space is mapped, the artificial markers are no longer required for the augmented reality system to function. The natural features currently seen are compared to those stored in the map, and the camera pose is estimated from the found correspondences. The main advantages are that no artificial markers are necessary during regular use of the system and that the method does not rely on tracking: even a single frame is sufficient to compute the camera pose and visualize the robot’s plan. Since the environment contains a large number of natural features, the precision of the camera pose estimation is sufficient whenever the camera looks into the mapped working space.
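    The comparison of currently seen features against the stored map can be sketched as brute-force descriptor matching with a ratio test. This is an illustrative assumption about how the correspondence step could be implemented, not the paper's actual method; the function name and the descriptor representation are hypothetical.

    ```python
    import numpy as np

    def match_features(map_desc, frame_desc, ratio=0.8):
        """Match frame descriptors against map descriptors (hypothetical sketch).

        map_desc:   (M, D) array, descriptors of natural features in the 3D map
        frame_desc: (N, D) array, descriptors detected in the current frame
        Returns a list of (frame_idx, map_idx) correspondences that pass a
        ratio test (best match clearly closer than the second best).
        """
        matches = []
        for i, d in enumerate(frame_desc):
            dists = np.linalg.norm(map_desc - d, axis=1)   # distance to every map feature
            order = np.argsort(dists)
            best, second = order[0], order[1]
            if dists[best] < ratio * dists[second]:        # ratio test rejects ambiguous matches
                matches.append((i, best))
        return matches
    ```

    The resulting 2D–3D correspondences would then feed a pose solver (e.g. a PnP method) to recover the camera pose from a single frame, as the abstract describes.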

    A closed-loop approach for tracking a humanoid robot using particle filtering and depth data

    Humanoid robots introduce instabilities during bipedal walking that complicate estimating their position and orientation over time. Tracking humanoid robots may be useful not only in typical applications such as navigation, but also in tasks that require benchmarking the multiple processes involved in registering measurements of the humanoid’s performance while walking. Small robots pose an additional challenge: their size and mechanical limitations may produce unstable swinging while walking. This paper presents a strategy for the active localization of a humanoid robot in environments monitored by external devices. The problem is addressed with a particle filter method over depth images captured by an RGB-D sensor in order to effectively track the position and orientation of the robot during its march. The tracking stage is coupled with a locomotion system that controls the robot’s stepping toward a given oriented target. We present an integral communication framework between the tracking and the locomotion control of the robot, based on the Robot Operating System, which is capable of achieving real-time locomotion tasks using a NAO humanoid robot. The final publication is available at Springer via http://dx.doi.org/10.1007/s11370-017-0230-0. Peer reviewed.
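    The tracking stage can be sketched as a minimal particle filter over a planar pose (x, y, θ). The motion model, noise parameters, and the assumption that each depth frame yields a noisy pose measurement are illustrative choices for the sketch, not the paper's implementation.

    ```python
    import numpy as np

    def particle_filter_track(observations, n_particles=500,
                              motion_std=0.05, obs_std=0.1, seed=0):
        """Track a planar pose (x, y, theta) with a particle filter (sketch).

        observations: (T, 3) noisy pose measurements, e.g. derived from
                      registering the robot body in each depth image.
        Returns a (T, 3) array of weighted-mean pose estimates.
        """
        rng = np.random.default_rng(seed)
        # Initialize particles around the first measurement
        particles = observations[0] + rng.normal(0, obs_std, (n_particles, 3))
        estimates = []
        for z in observations:
            # Predict: random-walk motion model absorbs the swinging of the gait
            particles += rng.normal(0, motion_std, particles.shape)
            # Update: Gaussian likelihood of the depth-based measurement
            w = np.exp(-0.5 * np.sum(((particles - z) / obs_std) ** 2, axis=1))
            w /= w.sum()
            estimates.append(w @ particles)   # weighted-mean pose estimate
            # Resample to avoid particle depletion
            idx = rng.choice(n_particles, n_particles, p=w)
            particles = particles[idx]
        return np.array(estimates)
    ```

    In the closed-loop setup described above, each pose estimate would be published over ROS to the locomotion controller, which adjusts the stepping toward the oriented target.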