4 research outputs found

    A Weakly Supervised Approach to Emotion-change Prediction and Improved Mood Inference

    Whilst a majority of affective computing research focuses on inferring emotions, examining mood or understanding the mood-emotion interplay has received significantly less attention. Building on prior work, we (a) deduce and incorporate emotion-change (Δ) information for inferring mood, without resorting to annotated labels, and (b) attempt mood prediction for long-duration video clips, in alignment with the characterisation of mood. We generate the emotion-change (Δ) labels via metric learning from a pre-trained Siamese Network, and use these in addition to mood labels for mood classification. Experiments evaluating unimodal (training only using mood labels) vs multimodal (training using mood plus Δ labels) models show that mood prediction benefits from the incorporation of emotion-change information, emphasising the importance of modelling the mood-emotion interplay for effective mood inference.
    Comment: 9 pages, 3 figures, 6 tables, published in IEEE International Conference on Affective Computing and Intelligent Interaction
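    The Δ-label idea summarised above can be sketched as follows. This is a minimal illustration only: the frozen encoder weights, the distance threshold, and all dimensions are hypothetical stand-ins, not the paper's trained Siamese model.

    ```python
    import numpy as np

    # Hypothetical frozen projection standing in for one branch of a
    # pre-trained Siamese network (both inputs share these weights).
    rng = np.random.default_rng(0)
    W = rng.standard_normal((8, 4))

    def embed(x):
        # Shared Siamese branch: map a clip's features to an embedding
        # where distance is meant to track emotional change.
        return x @ W

    def delta_label(x_prev, x_curr, thresh=1.0):
        """Pseudo emotion-change (Δ) label from embedding distance:
        1 = change, 0 = no change. No annotated Δ labels are needed;
        the label is derived purely from the learned metric."""
        d = np.linalg.norm(embed(x_curr) - embed(x_prev))
        return int(d > thresh)

    x1 = rng.standard_normal(8)
    label_same = delta_label(x1, x1)        # identical clips -> distance 0 -> 0
    label_diff = delta_label(x1, x1 + 5.0)  # large feature shift -> likely 1
    ```

    In the paper's setting, such Δ labels would then be used alongside mood labels as an auxiliary training signal for the mood classifier.
    
    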

    Examining the Influence of Personality and Multimodal Behavior on Hireability Impressions

    While personality traits have been traditionally modeled as behavioral constructs, we novelly posit job hireability as a personality construct. To this end, we examine correlates among personality and hireability measures on the First Impressions Candidate Screening dataset. Modeling hireability as both a discrete and a continuous variable, and the big-five OCEAN personality traits as predictors, we utilize (a) multimodal behavioral cues, and (b) personality trait estimates obtained via these cues for hireability prediction (HP). For each of the text, audio and visual modalities, HP via (b) is found to be more effective than (a). Also, superior results are achieved when hireability is modeled as a continuous rather than a categorical variable. Interestingly, eye and bodily visual cues perform comparably to facial cues for predicting personality and hireability. Explanatory analyses reveal that multimodal behaviors impact personality and hireability impressions: e.g., Conscientiousness impressions are impacted by the use of positive adjectives (verbal behavior) and eye movements (non-verbal behavior), confirming prior observations.
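    The (a) direct vs (b) two-stage comparison described above can be sketched roughly as below. Everything here is an assumption for illustration: the features are synthetic, and simple least-squares models stand in for whatever predictors the paper actually uses.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n, d_cues = 100, 16

    # Synthetic multimodal behavioral cues (stand-in for text/audio/visual features).
    cues = rng.standard_normal((n, d_cues))
    # Synthetic big-five (OCEAN) estimates and hireability scores derived from them.
    ocean = cues @ rng.standard_normal((d_cues, 5)) + 0.1 * rng.standard_normal((n, 5))
    hire = ocean @ rng.standard_normal(5) + 0.1 * rng.standard_normal(n)

    def fit_lr(X, y):
        # Least-squares linear fit (illustrative, not the paper's model).
        return np.linalg.lstsq(X, y, rcond=None)[0]

    # (a) direct: behavioral cues -> hireability
    w_a = fit_lr(cues, hire)

    # (b) two-stage: cues -> OCEAN trait estimates -> hireability
    ocean_hat = cues @ fit_lr(cues, ocean)
    w_b = fit_lr(ocean_hat, hire)
    ```

    The abstract's finding is that route (b), predicting hireability from intermediate personality estimates, outperforms the direct route (a) for each modality.
    
    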
