Detecting depression in dyadic conversations with multimodal narratives and visualizations
Conversations contain a wide spectrum of multimodal information that gives us
hints about the emotions and moods of the speaker. In this paper, we developed
a system that helps humans analyze conversations. Our main contribution
is the identification of appropriate multimodal features and the integration of
such features into verbatim conversation transcripts. We demonstrate the
ability of our system to take in a wide range of multimodal information and
automatically generate a prediction score for the depression state of the
individual. Our experiments showed that this approach yielded better
performance than the baseline model. Furthermore, the multimodal narrative
approach makes it easy to integrate insights from other disciplines, such as
conversation analysis and psychology. Lastly, this interdisciplinary and
automated approach is a step towards emulating both how practitioners record
the course of treatment and how conversation analysts have traditionally
analyzed conversations by hand.