
    Towards a cognitive framework for multimodal person recognition in multiparty HRI

    The ability to recognize human partners is an important social skill for building personalized, long-term Human-Robot Interactions (HRI). However, in HRI contexts, which unfold in ever-changing, realistic environments, the identification problem still presents significant challenges. Possible solutions rely on a multimodal approach and on making robots learn from their first-hand sensory data. To this aim, we propose a framework that allows robots to autonomously organize their sensory experience into a structured dataset suitable for person recognition during a multiparty interaction. Our results demonstrate the effectiveness of our approach and show that it is a promising solution in the quest to make robots more autonomous in their learning process.
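    As a purely illustrative sketch (not taken from the paper), the "structured dataset" mentioned above could be organized as multimodal observations grouped under tentative identity labels. All names and fields here (Observation, PersonDataset, face_embedding, voice_embedding) are hypothetical assumptions for illustration only.

    ```python
    from dataclasses import dataclass, field
    from typing import Dict, List

    @dataclass
    class Observation:
        """One multimodal snapshot captured during an interaction (hypothetical fields)."""
        timestamp: float
        face_embedding: List[float]   # e.g. output of a face-recognition model (assumed)
        voice_embedding: List[float]  # e.g. output of a speaker-verification model (assumed)

    @dataclass
    class PersonDataset:
        """Sensory experience organized per tentative identity label."""
        records: Dict[str, List[Observation]] = field(default_factory=dict)

        def add(self, person_id: str, obs: Observation) -> None:
            # Append a new multimodal observation under the given identity label.
            self.records.setdefault(person_id, []).append(obs)

    # Usage: the robot accumulates labelled observations as the interaction unfolds.
    dataset = PersonDataset()
    dataset.add("partner_01", Observation(timestamp=12.4,
                                          face_embedding=[0.1, 0.3],
                                          voice_embedding=[0.7, 0.2]))
    ```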