
    Visual Tracking Modalities for a Companion Robot

    No full text
    Abstract — This article presents the development of a vision-based human-robot interaction mechanism. The functionalities required for such a mechanism range from user detection and recognition to gesture tracking. Particle filters, which are extensively described in the literature, are well suited to this context as they enable a straightforward combination of several visual cues such as colour, shape, or motion. Additionally, different algorithms can be considered for better handling of the particles depending on the context. This article presents the visual functionalities developed, namely user recognition and following, and 3D gesture tracking. The challenge is to determine which algorithms and visual cues best fulfil the requirements of the considered functionalities for our companion robot. The methods employed to attain these functionalities and their results are presented.
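
    The abstract describes particle filters that fuse several visual cues (colour, shape, motion) into a single tracker. As a rough illustration of that idea, and not the authors' actual implementation, the sketch below shows one predict/update/resample cycle of a generic multi-cue particle filter. The cue-likelihood functions passed in (e.g. a colour or motion likelihood) are hypothetical placeholders the caller would supply.

```python
import numpy as np

def resample(particles, weights, rng):
    """Systematic resampling: redraw particles in proportion to their weights."""
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n
    cumulative = np.cumsum(weights)
    indices = np.searchsorted(cumulative, positions)
    return particles[indices]

def particle_filter_step(particles, weights, observation, cue_likelihoods, rng,
                         motion_noise=5.0):
    """One predict/update/resample cycle of a multi-cue particle filter.

    particles       : (N, 2) array of candidate target positions in the image.
    cue_likelihoods : list of functions p(observation | particle), one per
                      visual cue (e.g. colour, shape, motion); their product
                      fuses the cues into a single particle weight.
    """
    # Predict: diffuse particles with a simple random-walk motion model.
    particles = particles + rng.normal(scale=motion_noise, size=particles.shape)

    # Update: weight each particle by the product of the cue likelihoods.
    weights = np.ones(len(particles))
    for likelihood in cue_likelihoods:
        weights *= np.array([likelihood(observation, p) for p in particles])
    weights /= weights.sum() + 1e-12

    # Resample to concentrate particles on high-likelihood regions.
    particles = resample(particles, weights, rng)
    weights = np.full(len(particles), 1.0 / len(particles))

    # Estimate the target state as the resampled particle mean.
    estimate = particles.mean(axis=0)
    return particles, weights, estimate
```

    In use, one would initialise particles around an initial detection and call this step once per frame, swapping in whichever cue likelihoods suit the current context, which mirrors the paper's point that different cue combinations and particle-handling strategies can be selected per functionality.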
