Mood disorders, including depression and anxiety, often manifest through
facial expressions. While previous research has explored the connection between
facial features and emotions, machine learning algorithms for estimating mood
disorder severity have been hindered by small datasets and limited real-world
applicability. To address this gap, we analyzed facial videos of 11,427 participants, a dataset two orders of magnitude larger than those used in previous studies.
This comprehensive collection includes standardized facial expression videos
from reading tasks, along with a detailed psychological scale that measures
depression, anxiety, and stress. By examining the relationships among these
emotional states and employing clustering analysis, we identified distinct subgroups, each with a different emotional profile.
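For concreteness, here is a minimal sketch of such a clustering step, assuming per-participant depression, anxiety, and stress subscale scores and scikit-learn's KMeans; the synthetic scores, the preprocessing, and the cluster count of four are illustrative assumptions, not the study's actual configuration.

```python
# Hedged sketch of the clustering step described above (assumed, not the
# authors' exact pipeline): group participants by their depression, anxiety,
# and stress subscale scores and inspect the resulting emotional profiles.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Placeholder for the real data: one row per participant,
# columns = (depression, anxiety, stress) subscale scores.
scores = rng.normal(loc=10.0, scale=4.0, size=(11427, 3)).clip(min=0)

X = StandardScaler().fit_transform(scores)   # put subscales on a common scale
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0)  # k=4 is illustrative
labels = kmeans.fit_predict(X)

# Mean subscale profile per cluster -> candidate "emotional profiles".
for k in range(kmeans.n_clusters):
    print(k, scores[labels == k].mean(axis=0).round(1))
```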
We then trained tree-based classifiers and deep learning models to estimate emotional states from facial features.
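As a hedged sketch of the kind of tree-based baseline mentioned, the snippet below fits a scikit-learn GradientBoostingClassifier to synthetic per-video facial features and scores it with AUC; the feature dimensionality, the binary label, and the train/test split are illustrative assumptions, not the study's protocol.

```python
# Hedged sketch of a tree-based baseline: predict a placeholder binary label
# from per-video facial features. Feature count and label are illustrative,
# not taken from the paper.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n_videos, n_features = 11427, 64        # e.g., action units, gaze, pupil stats
X = rng.normal(size=(n_videos, n_features))
y = rng.integers(0, 2, size=n_videos)   # placeholder emotional-state label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```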
Results indicate that models previously effective on small datasets experienced a drop in performance when applied to our large dataset, highlighting the importance of data scale and of mitigating overfitting in practical settings. Notably, our study identified subtle shifts in pupil
dynamics and gaze orientation as potential markers of mood disorders, offering insight into the interplay between facial expressions and mental health.
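To illustrate how pupil and gaze cues of this kind can be quantified, the sketch below derives simple gaze-offset and pupil-drift statistics from per-frame landmark coordinates; the function pupil_gaze_features, its inputs, and the feature definitions are hypothetical stand-ins, since the paper's exact feature set is not specified here.

```python
# Illustrative computation of pupil/gaze cues. Landmark coordinates would come
# from a face-tracking model; here they are stand-in arrays, and the specific
# features are assumptions, not the study's exact set.
import numpy as np

def pupil_gaze_features(iris_xy, eye_corner_left, eye_corner_right):
    """iris_xy: (T, 2) per-frame pupil-center coords; corners: (T, 2) each."""
    eye_width = np.linalg.norm(eye_corner_right - eye_corner_left, axis=1)
    # Horizontal gaze offset: pupil position projected onto the
    # corner-to-corner axis, normalized to roughly 0..1.
    t = ((iris_xy - eye_corner_left) *
         (eye_corner_right - eye_corner_left)).sum(axis=1) / eye_width**2
    return {
        "gaze_offset_mean": float(t.mean()),
        "gaze_offset_std": float(t.std()),   # gaze instability over time
        "pupil_drift": float(np.abs(np.diff(iris_xy, axis=0)).mean()),
    }

T = 300  # e.g., 10 s of 30 fps video
rng = np.random.default_rng(2)
feats = pupil_gaze_features(rng.normal(size=(T, 2)),
                            rng.normal(size=(T, 2)) - 1.0,
                            rng.normal(size=(T, 2)) + 1.0)
print(feats)
```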
This research marks the first large-scale, comprehensive investigation of facial expressions in the context of mental health, laying the groundwork for future data-driven advances in this field.