
    Emotional Speech Perception Unfolding in Time: The Role of the Basal Ganglia

    The basal ganglia (BG) have repeatedly been linked to emotional speech processing in studies involving patients with neurodegenerative and structural changes of the BG. However, the majority of previous studies did not consider that (i) emotional speech processing entails multiple processing steps, and (ii) the BG may be engaged in some of these steps rather than others. In the present study, we investigate three stages of emotional speech processing (emotional salience detection, meaning-related processing, and identification) in the same patient group to verify whether lesions to the BG affect these stages in qualitatively different ways. Specifically, we explore early implicit emotional speech processing (probe verification) in an ERP experiment, followed by an explicit behavioral emotion recognition task. In both experiments, participants listened to emotional sentences expressing one of four emotions (anger, fear, disgust, happiness) or to neutral sentences. In line with previous evidence, both patients and healthy controls show differentiation of emotional and neutral sentences in the P200 component (emotional salience detection) and in a subsequent negative-going brain wave (meaning-related processing). However, behavioral recognition (identification stage) of emotional sentences was impaired in BG patients but not in healthy controls. The current data provide further support that the BG are involved in late, explicit stages of emotional speech processing rather than in early ones.

    Making social robots more attractive: the effects of voice pitch, humor and empathy

    In this paper we explore how simple auditory/verbal features of spoken language, such as voice characteristics (pitch) and language cues (expression of empathy or humor), influence the quality of interaction with a social robot receptionist. For our experiment, two robot characters were created: Olivia, the more extroverted, exuberant, and humorous robot with a higher voice pitch, and Cynthia, the more introverted, calmer, and more serious robot with a lower voice pitch. Our results showed that voice pitch had a strong influence on how users rated the overall interaction quality, as well as the robot's appeal and overall enjoyment. Further, humor appeared to improve users' perception of task enjoyment, robot personality, and speaking style, while empathy affected how users evaluated the robot's receptive behavior and the ease of interaction. With our study, we would like to stress in particular the importance of voice pitch in human-robot interaction and to encourage further research on this topic.