Impact of Movements on Facial Expression Recognition

Abstract

The ability to recognize human emotions can be a useful skill for robots. Emotion recognition can help robots understand our responses to robot movements and actions. Human emotions can be recognized through facial expressions. Facial Expression Recognition (FER) is a well-established research area; however, the majority of prior research is based on static datasets of images. With robots, the subject is often moving, the robot is moving, or both. The purpose of this research is to determine the impact of movement on facial expression recognition. We apply a pre-existing model for FER, which achieves approximately 70.86% accuracy on a given collection of images. We experiment with three different conditions: no motion by subject or robot, motion by either the human or the robot, and both human and robot in motion. We then measure the impact on FER accuracy introduced by these movements. This research relates to Computer Vision, Machine Learning, and Human-Robot Interaction.
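The evaluation protocol implied by the abstract, scoring the same pre-trained FER model separately under each motion condition, could be sketched as follows. This is a minimal illustration only: `predict_expression` is a hypothetical placeholder for the unspecified pre-trained model, and the condition names are assumptions, not labels from the paper.

```python
"""Sketch: per-condition FER accuracy, assuming a fixed pre-trained model.

`predict_expression` and the condition labels below are hypothetical
stand-ins; the abstract does not name the model or its interface.
"""

from collections import defaultdict

# Assumed names for the three experimental conditions described above.
CONDITIONS = ("no_motion", "single_motion", "both_motion")


def predict_expression(frame):
    """Placeholder for the pre-trained FER model's prediction."""
    return "neutral"  # dummy output so the sketch runs end to end


def accuracy_by_condition(samples):
    """Compute accuracy per condition.

    samples: iterable of (frame, true_label, condition) triples.
    Returns a dict mapping each observed condition to its accuracy.
    """
    hits = defaultdict(int)
    totals = defaultdict(int)
    for frame, true_label, condition in samples:
        totals[condition] += 1
        if predict_expression(frame) == true_label:
            hits[condition] += 1
    return {c: hits[c] / totals[c] for c in totals}


if __name__ == "__main__":
    # Tiny synthetic example; real frames would be camera images.
    demo = [
        (None, "neutral", "no_motion"),
        (None, "happy", "single_motion"),
        (None, "neutral", "both_motion"),
    ]
    print(accuracy_by_condition(demo))
```

Comparing the resulting per-condition accuracies against the static-image baseline (about 70.86%) isolates how much each kind of movement degrades recognition.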