We present a method for bidirectional interaction between a human and a humanoid robot in terms of emotional expressions. The robot detects continuous transitions of human emotions, ranging from very sad to very happy, using Active Appearance Models (AAMs) and a Neural Evolution Algorithm to determine the face shape and gestures. In response to the human's emotions, the robot performs postural reactions that adapt dynamically to the human expressions, producing a body language whose intensity changes as the human emotions vary. Our method is implemented on the HOAP-3 humanoid robot.

The research leading to these results has received funding from the COMANDER project CCG10-UC3M/DPI-5350, funded by the Comunidad de Madrid and UC3M (University Carlos III of Madrid), and the ARCADIA project DPI2010-21047-C02-01, funded by a CICYT project grant on behalf of the Spanish Ministry of Economy and Competitiveness.
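The idea of postural reactions whose intensity tracks a continuous emotion estimate can be sketched as a blend between two extreme body poses. This is a minimal illustrative sketch, not the paper's actual controller: the joint names, pose values, and the linear blending are assumptions, and a real HOAP-3 implementation would drive the robot's joints through its own API.

```python
# Hypothetical sketch: map a continuous emotion estimate to a robot posture.
# The emotion value e in [-1.0, 1.0] spans very sad (-1) to very happy (+1).
# Joint names and pose values are illustrative, not the HOAP-3 API.

SAD_POSE   = {"head_pitch": -0.5, "shoulder_pitch": 0.75, "elbow": 0.25}
HAPPY_POSE = {"head_pitch": 0.25, "shoulder_pitch": -0.5, "elbow": 1.0}

def posture_for_emotion(e: float) -> dict:
    """Linearly blend between the sad and happy postures.

    t = 0 at e = -1 (fully sad), t = 1 at e = +1 (fully happy),
    so intermediate emotions produce intermediate body language,
    matching the continuous transitions the detector reports.
    """
    e = max(-1.0, min(1.0, e))          # clamp to the valid emotion range
    t = (e + 1.0) / 2.0                 # map [-1, 1] -> [0, 1]
    return {j: SAD_POSE[j] + t * (HAPPY_POSE[j] - SAD_POSE[j]) for j in SAD_POSE}

print(posture_for_emotion(0.0))  # neutral emotion: midway between the two poses
```

As the estimated emotion drifts, the blend factor follows it, so the robot's body language changes smoothly in intensity rather than switching between discrete poses.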