
    Pupil dilation as an index of preferred mutual gaze duration

    Most animals look at each other to signal threat or interest. In humans, this social interaction is usually punctuated with brief periods of mutual eye contact. Deviations from this pattern of gazing behaviour generally make us feel uncomfortable and are a defining characteristic of clinical conditions such as autism or schizophrenia, yet it is unclear what constitutes normal eye contact. Here, we measured, across a wide range of ages, cultures and personality types, the period of direct gaze that feels comfortable and examined whether autonomic factors linked to arousal were indicative of people’s preferred amount of eye contact. Surprisingly, we find that the preferred duration of gaze is not dependent on fundamental characteristics such as gender, personality traits or attractiveness. However, we do find that subtle pupillary changes, indicative of physiological arousal, correlate with the amount of eye contact people find comfortable. Specifically, people preferring longer durations of eye contact display faster increases in pupil size when viewing another person than those preferring shorter durations. These results reveal that a person’s preferred duration of eye contact is signalled by physiological indices (pupil dilation) beyond volitional control that may play a modulatory role in gaze behaviour.
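
    To illustrate the kind of analysis the abstract describes, the sketch below (Python) estimates each participant's rate of pupil dilation with a linear fit and correlates those slopes with the preferred gaze durations. It is a minimal, hedged reconstruction, not the authors' analysis pipeline; the inputs (per-participant pupil-diameter traces and separately measured preferred durations in seconds) and the 60 Hz sampling rate are assumptions for illustration only.

    # Illustrative sketch only, not the study's actual pipeline.
    import numpy as np
    from scipy.stats import pearsonr

    def dilation_slope(pupil_trace, sample_rate_hz=60.0):
        # Rate of pupil-size change (units per second) from a linear fit over time.
        t = np.arange(len(pupil_trace)) / sample_rate_hz
        slope, _intercept = np.polyfit(t, pupil_trace, deg=1)
        return slope

    def slope_preference_correlation(pupil_traces, preferred_durations_s, sample_rate_hz=60.0):
        # Pearson correlation between per-participant dilation slopes and
        # preferred mutual-gaze durations; both inputs are hypothetical here.
        slopes = [dilation_slope(trace, sample_rate_hz) for trace in pupil_traces]
        return pearsonr(np.asarray(slopes), np.asarray(preferred_durations_s))

    A positive correlation from such an analysis would mirror the reported finding that people who prefer longer eye contact show faster increases in pupil size.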

    Facial communicative signals: valence recognition in task-oriented human-robot interaction

    From the issue entitled "Measuring Human-Robots Interactions". This paper investigates facial communicative signals (head gestures, eye gaze, and facial expressions) as nonverbal feedback in human-robot interaction. Motivated by a discussion of the literature, we suggest scenario-specific investigations due to the complex nature of these signals and present an object-teaching scenario where subjects teach the names of objects to a robot, which in turn shall term these objects correctly afterwards. The robot’s verbal answers are intended to elicit facial communicative signals from its interaction partners. We investigated the human ability to recognize this spontaneous facial feedback and also the performance of two automatic recognition approaches. The first is a static approach yielding baseline results, whereas the second takes the temporal dynamics of the signals into account; we report the classification rates achieved by each.
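
    To make the contrast between the two automatic recognition approaches concrete, here is a small, hedged sketch in Python; it is not the paper's actual classifiers. It assumes hypothetical inputs: sequences is a list of frames-by-features arrays of facial-signal measurements, one per teaching episode, and labels encodes the valence category to be recognised. The static variant summarises each episode by its mean frame, while the temporal variant additionally encodes average frame-to-frame change before feeding both to an off-the-shelf SVM.

    # Hedged illustration of a static vs. temporal feature contrast, not the paper's method.
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    def static_features(sequences):
        # Static baseline: ignore ordering and summarise each sequence by its mean frame.
        return np.stack([seq.mean(axis=0) for seq in sequences])

    def temporal_features(sequences):
        # Temporal variant: append the mean frame-to-frame change so the classifier
        # can also see how the facial signals evolve over time.
        feats = []
        for seq in sequences:
            mean = seq.mean(axis=0)
            delta = np.diff(seq, axis=0).mean(axis=0) if len(seq) > 1 else np.zeros_like(mean)
            feats.append(np.concatenate([mean, delta]))
        return np.stack(feats)

    def compare_approaches(sequences, labels, folds=5):
        # Cross-validated accuracy of an SVM on each feature set.
        y = np.asarray(labels)
        results = {}
        for name, X in (("static", static_features(sequences)),
                        ("temporal", temporal_features(sequences))):
            results[name] = cross_val_score(SVC(kernel="rbf"), X, y, cv=folds).mean()
        return results

    The split mirrors the distinction drawn in the abstract: the static features serve only as a baseline, while the temporal features expose the dynamics that the second approach exploits.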