In dyadic human interactions, mutual influence (a person's influence on the interacting partner's behaviors) has been shown to be important and can be incorporated into the modeling framework for characterizing, and automatically recognizing, the participants' states. We propose a Dynamic Bayesian Network (DBN) to explicitly model the conditional dependency between two interacting partners' emotion states in a dialog, using data from the IEMOCAP corpus of expressive dyadic spoken interactions. We focus on automatically computing the Valence-Activation emotion attributes to obtain a continuous characterization of the participants' emotion flow. Our proposed DBN models the temporal dynamics of the emotion states as well as the mutual influence between speakers in a dialog. With speech-based features, the proposed network improves classification accuracy by 3.67% absolute and 7.12% relative over the Gaussian Mixture Model (GMM) baseline on isolated turn-by-turn emotion classification.

Index Terms: emotion recognition, mutual influence, Dynamic Bayesian Network, dyadic interaction