Micro-facial expressions are fast, subtle movements of facial muscles that occur
when someone is attempting to conceal their true emotion. Detecting these movements
is difficult for a human, as a movement can appear and disappear within
half a second. Recently, research into detecting micro-facial movements using
computer vision and other techniques has emerged with the aim of outperforming
a human. Much of this research is motivated by potential applications
in security, healthcare and emotion-based training. The research has also
raised ethical concerns about whether it is acceptable to detect micro-movements
when people do not know they are showing them.
The main aim of this thesis is to investigate and develop novel ways of detecting
micro-facial movements using features in the spatial and temporal domains.
The contributions towards this aim are: an extended feature descriptor for
micro-facial movement, namely Local Binary Patterns on Three Orthogonal
Planes (LBP-TOP) combined with Gaussian Derivatives (GD); a dataset
of spontaneously induced micro-facial movements, namely Spontaneous Activity
of Micro-Movements (SAMM); an individualised baseline method for micro-movement
detection that forms an Adaptive Baseline Threshold (ABT); and Facial
Action Coding System (FACS)-based regions that focus on the local
movement of relevant facial areas.
The LBP-TOP with GD feature was developed to improve on an established
descriptor, using GD to enhance the facial features. Using machine learning,
the method performs well, achieving an accuracy of 92.6%.
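To make the descriptor concrete, the following is a minimal sketch of how an LBP-TOP with GD feature could be assembled, assuming a grey-scale clip stored as a (T, H, W) NumPy array; the Gaussian scale, LBP neighbourhood settings and the use of only the three central planes are illustrative simplifications rather than the thesis's exact parameters.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.feature import local_binary_pattern

def lbp_top_gd(clip, sigma=1.0, P=8, R=1):
    """Illustrative LBP-TOP descriptor on Gaussian-derivative-enhanced frames.

    clip : ndarray of shape (T, H, W), a grey-scale video volume.
    Returns a concatenated histogram over the XY, XT and YT planes.
    """
    # Enhance facial detail with first-order Gaussian derivatives in x and y,
    # applied frame by frame (the derivative order and scale are assumptions).
    gx = gaussian_filter(clip, sigma=(0, sigma, sigma), order=(0, 0, 1))
    gy = gaussian_filter(clip, sigma=(0, sigma, sigma), order=(0, 1, 0))
    enhanced = np.hypot(gx, gy)

    T, H, W = enhanced.shape
    # Full LBP-TOP pools LBP codes over every pixel position; only the three
    # central orthogonal planes are used here for brevity.
    planes = [
        enhanced[T // 2],          # XY plane (central frame)
        enhanced[:, H // 2, :],    # XT plane (central row over time)
        enhanced[:, :, W // 2],    # YT plane (central column over time)
    ]

    n_bins = P + 2                 # number of uniform LBP codes
    hists = []
    for plane in planes:
        codes = local_binary_pattern(plane, P, R, method="uniform")
        hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins),
                               density=True)
        hists.append(hist)
    return np.concatenate(hists)   # feature vector passed to a classifier
```

The concatenated histogram is what a machine learning classifier (for example an SVM) would be trained on in this kind of pipeline.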
Next, a new dataset, SAMM, was introduced that addresses the limitations of previous sets,
offering a wider demographic, increased resolution and comprehensive FACS coding.
An individualised baseline method was then introduced and tested using the new
dataset. Using feature difference instead of machine learning, performance increased
to a recall of 0.8429 with maximum thresholding, and further to a recall of
0.9125 when using the ABT.
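The exact thresholding rule is not reproduced here; the sketch below only illustrates the general idea, assuming each frame's feature vector is compared (here by Chi-squared distance) against an individual's neutral baseline, and that the adaptive threshold is derived from that person's own baseline statistics. The mean-plus-k-standard-deviations rule is purely an assumption for illustration.

```python
import numpy as np

def chi_squared(a, b, eps=1e-10):
    """Chi-squared distance between two histogram feature vectors."""
    return 0.5 * np.sum((a - b) ** 2 / (a + b + eps))

def detect_micro_movements(features, baseline_frames=30, k=3.0):
    """Flag frames whose feature difference from an individual's baseline
    exceeds an adaptive, person-specific threshold.

    features        : ndarray (T, D), one feature vector per frame.
    baseline_frames : number of initial neutral frames forming the baseline.
    k               : sensitivity factor for the adaptive threshold (assumed).
    """
    baseline = features[:baseline_frames].mean(axis=0)

    # Per-frame difference from the person's own neutral appearance.
    diffs = np.array([chi_squared(f, baseline) for f in features])

    # Adaptive, individualised threshold derived from the spread of the
    # person's own baseline differences (illustrative formula only).
    abt = diffs[:baseline_frames].mean() + k * diffs[:baseline_frames].std()

    return np.where(diffs > abt)[0]   # indices of candidate micro-movement frames
```

Because the threshold is computed from each person's own neutral frames, a subject with naturally expressive features does not trigger the same number of false detections as they would under a single global threshold.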
To increase the relevance of what is processed on the face, FACS-based regions were created. By focusing
on local regions and individualised baselines, this method outperformed similar
state-of-the-art methods with an Area Under the Curve (AUC) of 0.7513.
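As an illustration only, assuming 68-point facial landmarks are available from a standard landmark detector, FACS-inspired regions could be cropped around the muscle groups most relevant to particular action units; the landmark groupings and padding below are assumptions, not the thesis's exact region definitions.

```python
import numpy as np

# Assumed groupings over a standard 68-point landmark layout; the thesis's
# FACS-based regions will not match these boxes exactly.
FACS_REGIONS = {
    "brows_forehead": list(range(17, 27)),   # AU1/AU2/AU4 area
    "eyes":           list(range(36, 48)),   # AU5/AU6/AU7 area
    "nose_cheeks":    list(range(27, 36)),   # AU9/AU10 area
    "mouth_chin":     list(range(48, 68)),   # AU12/AU14/AU15 area
}

def crop_facs_regions(frame, landmarks, pad=10):
    """Return one image patch per FACS-inspired region.

    frame     : ndarray (H, W), grey-scale face image.
    landmarks : ndarray (68, 2) of (x, y) landmark coordinates.
    pad       : pixels of context around each landmark group (assumed).
    """
    h, w = frame.shape[:2]
    patches = {}
    for name, idx in FACS_REGIONS.items():
        pts = landmarks[idx]
        x0, y0 = np.maximum(pts.min(axis=0) - pad, 0).astype(int)
        x1, y1 = np.minimum(pts.max(axis=0) + pad, [w - 1, h - 1]).astype(int)
        patches[name] = frame[y0:y1 + 1, x0:x1 + 1]
    return patches
```

Each patch can then be described and compared against the individual's baseline independently, so that movement in one facial area is not diluted by the rest of the face.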
The research into detecting micro-movements is still in its infancy, and much
more can be done to advance this field. While machine learning can find patterns
in normal facial expressions, it is the feature difference methods that perform
best when detecting the subtle changes of the face. By using feature difference and comparing
the movement against a person's baseline, micro-movements can finally be
accurately detected.