
    The perception of emotion in artificial agents

    Given recent technological developments in robotics, artificial intelligence and virtual reality, it is perhaps unsurprising that the arrival of emotionally expressive and reactive artificial agents is imminent. However, if such agents are to become integrated into our social milieu, it is imperative to understand whether and how humans perceive emotion in artificial agents. In this review, we incorporate recent findings from social robotics, virtual reality, psychology, and neuroscience to examine how people recognize and respond to emotions displayed by artificial agents. First, we review how people perceive emotions expressed by an artificial agent through cues such as facial and bodily expressions and vocal tone. Second, we evaluate the similarities and differences in the consequences of perceived emotions in artificial compared to human agents. Beyond accurately recognizing the emotional state of an artificial agent, it is critical to understand how humans respond to those emotions. Does interacting with an angry robot induce the same responses in people as interacting with an angry person? Similarly, does watching a robot rejoice when it wins a game elicit similar feelings of elation in the human observer? Here we provide an overview of the current state of emotion expression and perception in social robotics, as well as a clear articulation of the challenges and guiding principles to be addressed as we move ever closer to truly emotional artificial agents.

    The Neurological Traces of Look-Alike Avatars

    We designed an observational study where participants (n = 17) were exposed to photographs and look-alike avatar pictures of themselves, a familiar friend, or an unfamiliar person. By measuring participants’ brain activity with electroencephalography (EEG), we found face-recognition event-related potentials (ERPs) in the visual cortex, around 200–250 ms, to be prominent for the different familiarity levels. A less positive P200 component was found for self-recognized pictures than for pictures of others, showing similar effects for both real faces and look-alike avatars. A rapid adaptation in the same component was found when comparing the neural processing of avatar faces vs. real faces, as if avatars in general were assimilated as real face representations over time. ERP results also showed that in the case of the self-avatar, the P200 component correlated with more complex conscious encodings of self-representation, i.e., the difference in voltage in the P200 between the self-avatar and the self-picture was reduced in participants who felt the avatar looked like them. This study is put into context within the literature on self-recognition and face recognition in the visual cortex. Additionally, the implications of these results for look-alike avatars are discussed for both future virtual reality (VR) and neuroscience studies.
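
    To make the reported analysis concrete, the following is a minimal sketch (not the authors' code) of how the P200 effect described above could be quantified: mean ERP amplitude in the 200–250 ms window, contrasted between self-avatar and self-picture trials, and correlated across participants with a subjective avatar-likeness rating. The sampling rate, epoch onset, rating scale, and all variable and function names are assumptions introduced here for illustration.

    import numpy as np
    from scipy.stats import pearsonr

    SFREQ = 500.0                  # assumed EEG sampling rate (Hz)
    EPOCH_START = -0.2             # assumed epoch onset relative to stimulus (s)
    P200_WINDOW = (0.200, 0.250)   # 200-250 ms window reported in the abstract

    def p200_amplitude(erp, sfreq=SFREQ, epoch_start=EPOCH_START, window=P200_WINDOW):
        """Mean amplitude of a 1-D ERP trace within the P200 window."""
        start = int(round((window[0] - epoch_start) * sfreq))
        stop = int(round((window[1] - epoch_start) * sfreq))
        return erp[start:stop].mean()

    def p200_self_difference(erp_self_avatar, erp_self_picture):
        """Voltage difference between self-avatar and self-picture ERPs."""
        return p200_amplitude(erp_self_avatar) - p200_amplitude(erp_self_picture)

    def correlate_with_likeness(erps_self_avatar, erps_self_picture, likeness_ratings):
        """Correlate per-participant P200 differences with avatar-likeness ratings.

        erps_* are arrays of shape (n_participants, n_times), each row an ERP
        averaged over occipital channels; likeness_ratings is one (assumed)
        "the avatar looks like me" score per participant.
        """
        diffs = np.array([p200_self_difference(a, p)
                          for a, p in zip(erps_self_avatar, erps_self_picture)])
        r, p_value = pearsonr(diffs, likeness_ratings)
        return diffs, r, p_value

    Under this reading, the finding in the abstract corresponds to a negative relationship: participants who rated the avatar as looking like them showed a smaller self-avatar minus self-picture P200 difference.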