Facial Features Tracking for Gross Head Movement Analysis and Expression Recognition

By Dimitris Metaxas


Abstract — We present a real-time framework for Action Unit (AU) and expression recognition based on facial feature tracking and AdaBoost. Accurate feature tracking faces several challenges: changes in illumination, variation in subjects' skin color, large head rotations, partial occlusions, and fast head movements. We use models based on Active Shapes to localize facial features on the face in a generic pose. The shapes of facial features undergo non-linear transformations as the head rotates from frontal view to profile view, so we learn the non-linear shape manifold as multiple overlapping subspaces, with different subspaces representing different head poses. We further use the tracked features to accurately extract bounded faces from a video sequence and use them to recognize facial expressions. Our approach is based on coded dynamic features: in order to capture the dynamic characteristics of facial events, we design the dynamic Haar-like […]
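The abstract names AdaBoost as the classifier driving expression recognition. As an illustrative sketch only (the paper's actual dynamic Haar-like features and training data are not available here), the boosting step over a matrix of hypothetical feature responses can be written as a minimal AdaBoost loop with depth-1 decision stumps:

```python
import numpy as np

def train_adaboost(X, y, n_rounds=5):
    """Minimal AdaBoost with decision stumps.

    X : (n_samples, n_features) array of feature responses
        (standing in for Haar-like feature values).
    y : labels in {-1, +1}.
    Returns a list of (alpha, feature, threshold, polarity) stumps.
    """
    n, d = X.shape
    w = np.full(n, 1.0 / n)                 # sample weights
    stumps = []
    for _ in range(n_rounds):
        best = None
        # Exhaustive stump search over feature, threshold, polarity.
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for pol in (1, -1):
                    pred = pol * np.where(X[:, j] >= thr, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, pol)
        err, j, thr, pol = best
        err = max(err, 1e-10)               # avoid division by zero
        alpha = 0.5 * np.log((1 - err) / err)
        pred = pol * np.where(X[:, j] >= thr, 1, -1)
        w *= np.exp(-alpha * y * pred)      # up-weight mistakes
        w /= w.sum()
        stumps.append((alpha, j, thr, pol))
    return stumps

def predict(stumps, X):
    """Sign of the weighted vote of all stumps."""
    score = np.zeros(len(X))
    for alpha, j, thr, pol in stumps:
        score += alpha * pol * np.where(X[:, j] >= thr, 1, -1)
    return np.sign(score)
```

In a full pipeline along the lines the abstract describes, each column of `X` would be one dynamic Haar-like feature response computed from the tracked, bounded face region, and one such binary classifier would be trained per AU or expression class (e.g. one-vs-rest); those details are assumptions here, not taken from the paper.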

Year: 2009
OAI identifier: oai:CiteSeerX.psu:
Provided by: CiteSeerX
