38,224 research outputs found

    Holistic gaze strategy to categorize facial expression of varying intensities

    Using faces representing exaggerated emotional expressions, recent behavioural and eye-tracking studies have suggested a dominant role of individual facial features in transmitting diagnostic cues for decoding facial expressions. Considering that in everyday life we frequently view low-intensity expressive faces, in which local facial cues are more ambiguous, we probably need to combine expressive cues from more than one facial feature to reliably decode naturalistic facial affects. In this study we applied a morphing technique to systematically vary the intensities of six basic facial expressions of emotion, and employed a self-paced expression categorization task to measure participants’ categorization performance and associated gaze patterns. The analysis of pooled data from all expressions showed that increasing expression intensity improved categorization accuracy, shortened reaction time and reduced the number of fixations directed at faces. The proportion of fixations and viewing time directed at internal facial features (the eyes, nose and mouth region), however, was not affected by varying levels of intensity. Further comparison between individual facial expressions revealed that although proportional gaze allocation at individual facial features was quantitatively modulated by the viewed expression, the overall gaze distribution in face viewing was qualitatively similar across different facial expressions and different intensities. It seems that we adopt a holistic viewing strategy to extract expressive cues from all internal facial features when processing naturalistic facial expressions.
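    The abstract does not specify how the morphing was implemented; a minimal sketch, assuming a simple pixel-wise linear blend between a neutral face image and a full-intensity expression image (published morphing pipelines typically also warp facial landmarks, which is omitted here), where `alpha` plays the role of expression intensity:

    ```python
    import numpy as np

    def morph_intensity(neutral, full_expression, alpha):
        """Blend a neutral face toward a full-intensity expression.

        alpha in [0, 1] controls expression intensity:
        0.0 returns the neutral face, 1.0 the full expression.
        """
        if not 0.0 <= alpha <= 1.0:
            raise ValueError("alpha must lie in [0, 1]")
        neutral = np.asarray(neutral, dtype=float)
        full_expression = np.asarray(full_expression, dtype=float)
        return (1.0 - alpha) * neutral + alpha * full_expression

    # Generate a graded intensity series from two (toy) grayscale images.
    neutral = np.zeros((2, 2))
    full = np.ones((2, 2))
    series = [morph_intensity(neutral, full, a) for a in (0.25, 0.5, 1.0)]
    ```

    A stimulus set like the one described would be produced by sweeping `alpha` over the chosen intensity levels for each of the six basic expressions.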

    The perception of emotion in artificial agents

    Given recent technological developments in robotics, artificial intelligence and virtual reality, it is perhaps unsurprising that the arrival of emotionally expressive and reactive artificial agents is imminent. However, if such agents are to become integrated into our social milieu, it is imperative to establish an understanding of whether and how humans perceive emotion in artificial agents. In this review, we incorporate recent findings from social robotics, virtual reality, psychology, and neuroscience to examine how people recognize and respond to emotions displayed by artificial agents. First, we review how people perceive emotions expressed by an artificial agent, such as facial and bodily expressions and vocal tone. Second, we evaluate the similarities and differences in the consequences of perceived emotions in artificial compared to human agents. Besides accurately recognizing the emotional state of an artificial agent, it is critical to understand how humans respond to those emotions. Does interacting with an angry robot induce the same responses in people as interacting with an angry person? Similarly, does watching a robot rejoice when it wins a game elicit similar feelings of elation in the human observer? Here we provide an overview of the current state of emotion expression and perception in social robotics, as well as a clear articulation of the challenges and guiding principles to be addressed as we move ever closer to truly emotional artificial agents.

    Machine Understanding of Human Behavior

    A widely accepted prediction is that computing will move to the background, weaving itself into the fabric of our everyday living spaces and projecting the human user into the foreground. If this prediction is to come true, then next-generation computing, which we will call human computing, should be about anticipatory user interfaces that are human-centered, built for humans based on human models. They should transcend the traditional keyboard and mouse to include natural, human-like interactive functions, including understanding and emulating certain human behaviors such as affective and social signaling. This article discusses a number of components of human behavior, how they might be integrated into computers, and how far we are from realizing the front end of human computing, that is, how far we are from enabling computers to understand human behavior.

    Generation of Whole-Body Expressive Movement Based on Somatical Theories

    An automatic choreography method to generate lifelike body movements is proposed. The method is based on somatic theories that are conventionally used to assess a person’s psychological and developmental state by analyzing body movement. The idea of this paper is to use these theories in the inverse way: to facilitate the generation of artificial body movements that are plausible with respect to the evolutionary, developmental and emotional states of robots or other non-living movers. This paper reviews somatic theories and describes a strategy for implementing automatic body movement generation. In addition, a psychological experiment is reported to verify the expressive ability of body movement rhythm. The method facilitates choreographing the body movements of humanoids, animal-shaped robots, and computer graphics characters in video games.

    A Conceptual Framework for Motion Based Music Applications

    Imaginary projections are the core of the framework for motion-based music applications presented in this paper. Their design depends on the space covered by the motion tracking device, but also on the musical feature involved in the application. They can be considered a very powerful tool because they allow one not only to project the image of a traditional acoustic instrument into the virtual environment, but also to express any spatially defined abstract concept. The system pipeline starts from the musical content and, through a geometrical interpretation, arrives at its projection in the physical space. Three case studies involving different motion tracking devices and different musical concepts are analyzed. The three examined applications have been programmed and already tested by the authors. They aim respectively at expressive musical interaction (Disembodied Voices), tonal music knowledge (Harmonic Walk) and twentieth-century music composition (Hand Composer).

    Emotion capture based on body postures and movements

    In this paper we present a preliminary study for designing interactive systems that are sensitive to human emotions as conveyed by body movements. To do so, we first review the literature on the various approaches for defining and characterizing human emotions. After justifying the adopted characterization space for emotions, we then focus on the movement characteristics that must be captured by the system to be able to recognize human emotions.

    Stephen Davies on the Issue of Literalism

    In this paper I discuss Stephen Davies’s defence of literalism about emotional descriptions of music. According to literalism, a piece of music literally possesses the expressive properties we attribute to it when we describe it as ‘sad’, ‘happy’, etc. Davies’s literalist strategy exploits the concept of polysemy: the meaning of emotion words in descriptions of expressive music is related to the meaning of those words when used in their primary psychological sense. The relation between the two meanings is identified by Davies in music’s presentation of emotion-characteristics-in-appearance. I contend that there is a class of polysemous uses of emotion terms in descriptions of music that is not included in Davies’s characterization of the link between emotions in music and emotions as psychological states. I conclude by indicating the consequences of my claim for the phenomenology of expressive music.