Missing diversity, equity, and inclusion elements in affective computing
datasets directly affect the accuracy and fairness of emotion recognition
algorithms across different groups. A literature review reveals how affective
computing systems may perform unevenly across groups due to, for
instance, mental health conditions that alter facial expressions and speech, or
age-related changes in facial appearance and health. Our analysis of existing
affective computing datasets highlights a disconcerting lack of diversity
regarding race, sex/gender, age, and
(mental) health representation. By emphasizing the need for more inclusive
sampling strategies and standardized documentation of demographic factors in
datasets, this paper provides recommendations and calls for greater attention
to inclusivity and consideration of societal consequences in affective
computing research to promote ethical and accurate outcomes in this emerging
field.

Comment: 8 pages, 2023 11th International Conference on Affective Computing
and Intelligent Interaction (ACII)