53 research outputs found

    Spatial selectivity in adaptation to gaze direction

    A person's focus of attention is conveyed by the direction of their eyes and face, providing a simple visual cue fundamental to social interaction. A growing body of research examines the visual mechanisms that encode the direction of another person's gaze as we observe them. Here we investigate the spatial receptive field properties of these mechanisms, by testing the spatial selectivity of sensory adaptation to gaze direction. Human observers were adapted to faces with averted gaze presented in one visual hemifield, then tested in their perception of gaze direction for faces presented in the same or opposite hemifield. Adaptation caused strong, repulsive perceptual aftereffects, but only for faces presented in the same hemifield as the adapter. This occurred even when adapting and test stimuli were in the same external location across saccades. Hence, there was clear evidence for retinotopic adaptation and a relative lack of either spatiotopic or spatially invariant adaptation. These results indicate that adaptable representations of gaze direction in the human visual system have retinotopic spatial receptive fields. This strategy of coding others' direction of gaze with positional specificity relative to one's own eye position may facilitate key functions of gaze perception, such as socially cued shifts in visual attention.

    Gaze constancy in upright and inverted faces

    This work is supported by Australian Research Council Discovery Project [DP120102589]; CC is supported by an Australian Research Council Future Fellowship.

    Testing the dual-route model of perceived gaze direction: Linear combination of eye and head cues

    This work is supported by Australian Research Council Discovery Project [DP120102589 & DP160102239] to CC. IM is supported by a Leverhulme Project Grant RPG-2013-218. YO is supported by a Grant-in-Aid for Research Activity Start-up [15H06456] from the Japan Society for the Promotion of Science. We thank Matthew Patten for his help in data collection.

    Gaze Behavior as a Visual Cue to Animacy

    A characteristic that distinguishes biological agents from inanimate objects is that the former can have a direction of attention. While it is natural to associate a person’s direction of attention with the appearance of their face, attentional behaviors are also a kind of relational motion, in which an entity rotates a specific axis of its form in relation to an independent feature of its environment. Here, we investigated the role of gaze-like motion in providing a visual cue to animacy independent of the human form. We generated animations in which the rotation of a geometric object (the agent) was dependent on the movement of a target. Participants made judgements about how creature-like the objects appeared; these judgements were highly sensitive to the correspondence between the objects over and above their individual motion. We varied the dependence between agent rotation and target motion in terms of temporal synchrony, temporal order, cross-correlation, and trajectory complexity. These affected perceptions of animacy to differing extents. When the behavior of the agent was driven by a model of predictive tracking with a sensory sampling delay, perceived animacy was broadly tuned across changes in rotational behavior induced by the sampling delay of the agent. Overall, the tracking relationship provides a salient cue to animacy independent of biological form, provided that temporal synchrony between objects is within a certain range. This motion relationship may be one to which the visual system is highly attuned, due to its association with attentional behavior and the presence of other minds in our environment.
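    The predictive-tracking model described in this abstract can be illustrated with a minimal sketch: the agent orients toward the target's position linearly extrapolated from a delayed sensory sample. The function name, the linear extrapolation rule, and the agent-at-origin convention are illustrative assumptions, not the study's actual implementation.

    ```python
    import math

    def agent_heading(track, t, delay):
        """Predictive tracking with a sensory sampling delay: the agent
        orients toward the target's position linearly extrapolated from a
        sample taken `delay` time steps in the past. The agent sits at the
        origin; `track` is a list of (x, y) target positions over time.
        (Hypothetical sketch, not the study's actual model.)"""
        s = max(t - delay, 1)            # index of the delayed sample
        x0, y0 = track[s - 1]
        x1, y1 = track[s]
        px = x1 + (x1 - x0) * delay      # extrapolate forward by the delay
        py = y1 + (y1 - y0) * delay
        return math.atan2(py, px)        # heading the agent rotates toward
    ```

    For a target moving on a straight line the extrapolation is exact, so the agent's heading matches the target's true position; longer delays degrade tracking for more complex trajectories, consistent with the graded effects of sampling delay on perceived animacy reported above.
    
    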

    Adaptation to Walking Direction in Biological Motion

    The direction that we see another person walking provides us with an important cue to their intentions, but little is known about how the brain encodes walking direction across a neuronal population. The current study used an adaptation technique to investigate the sensory coding of perceived walking direction. We measured perceived walking direction of point-light stimuli before and after adaptation, and found that adaptation to a specific walking direction resulted in repulsive perceptual aftereffects. The magnitude of these aftereffects was tuned to the walking direction of the adaptor relative to the test, with local repulsion of perceived walking direction for test stimuli oriented on either side of the adapted walking direction. The specific tuning profiles that we observed are well explained by a population-coding model, in which perceived walking direction is coded in terms of the relative activity across a bank of sensory channels with peak tuning distributed across the full 360° range of walking directions. Further experiments showed specificity in how horizontal (azimuth) walking direction is coded when moving away from the observer compared to when moving toward the observer. Moreover, there was clear specificity in these perceptual aftereffects for walking direction compared to a nonbiological form of 3D motion (a rotating sphere). These results indicate the existence of neural mechanisms in the human visual system tuned to specific walking directions, provide insight into the number of sensory channels and how their responses are combined to encode walking direction, and demonstrate the specificity of adaptation to biological motion.
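    The population-coding account sketched in this abstract can be illustrated with a minimal simulation: a bank of direction-tuned channels whose gains are reduced near an adapted direction, so that a nearby test direction is decoded as repelled away from the adaptor. The channel count, tuning bandwidth, and adaptation strength below are hypothetical choices, not the values fitted in the study.

    ```python
    import numpy as np

    def channel_responses(direction_deg, prefs_deg, sigma_deg=40.0, gains=None):
        """Responses of a bank of direction-tuned channels with circular
        Gaussian tuning. Bandwidth and channel spacing are illustrative."""
        diff = np.deg2rad(direction_deg - prefs_deg)
        resp = np.exp((np.cos(diff) - 1.0) / np.deg2rad(sigma_deg) ** 2)
        return resp if gains is None else resp * gains

    def decode(resp, prefs_deg):
        """Read out perceived direction as the population vector average."""
        p = np.deg2rad(prefs_deg)
        return np.rad2deg(np.arctan2((resp * np.sin(p)).sum(),
                                     (resp * np.cos(p)).sum())) % 360

    prefs = np.arange(0.0, 360.0, 15.0)   # channels spanning the full 360 deg

    # Adaptation reduces the gain of channels tuned near the adapted direction.
    gains = 1.0 - 0.4 * channel_responses(0.0, prefs)

    # A nearby test direction is decoded as repelled away from the adaptor.
    perceived = decode(channel_responses(20.0, prefs, gains=gains), prefs)
    ```

    Without adaptation the decoder recovers the test direction veridically; with the adapted gains it overshoots away from the adaptor, reproducing the local repulsive aftereffect described above.
    
    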

    Is there a ‘zone of eye contact’ within the borders of the face?

    Eye contact is a salient feature of everyday interactions, yet it is not obvious what the physical conditions are under which we feel that we have eye contact with another person. Here we measure the range of locations that gaze can fall on a person's face to elicit a sense of eye contact. Participants made judgements about eye contact while viewing rendered images of faces with finely-varying gaze direction at a close interpersonal distance (50 cm). The ‘zone of eye contact’ tends to peak between the two eyes and is often surprisingly narrower than the observer's actual eye region. Indeed, the zone tends to extend further across the face in height than in width. This shares an interesting parallel with the ‘cyclopean eye’ of visual perspective – our sense of looking out from a single point in space despite the physical separation of our two eyes. The distribution of eye-contact strength across the face can be modelled at the individual-subject level as a 2D Gaussian function. Perception of eye contact is more precise than the sense of having one's face looked at, which captures a wider range of gaze locations in both the horizontal and vertical dimensions, at least at the close viewing distance used in the present study. These features of eye-contact perception are very similar cross-culturally, tested here in Australian and Japanese university students. However, the shape and position of the zone of eye contact does vary depending on recent sensory experience: adaptation to faces with averted gaze causes a pronounced shift and widening of the zone across the face, and judgements about eye contact also show a positive serial dependence. Together, these results provide insight into the conditions under which eye contact is felt, with respect to face morphology, culture, and sensory context.
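    The 2D Gaussian model mentioned in this abstract can be written down directly: eye-contact strength falls off as a Gaussian of the gaze landing point relative to the zone's peak, with a larger vertical than horizontal spread. The parameter values below are hypothetical, for illustration only, not the fitted individual-subject values.

    ```python
    import numpy as np

    def eye_contact_strength(dx, dy, sigma_x=1.0, sigma_y=2.0):
        """2D Gaussian model of eye-contact strength across the face.
        dx, dy: gaze landing point in cm relative to the zone's peak
        (between the two eyes). sigma_y > sigma_x gives a zone extending
        further in height than in width, as described above.
        (Parameter values are hypothetical.)"""
        return np.exp(-0.5 * ((dx / sigma_x) ** 2 + (dy / sigma_y) ** 2))

    # The zone's extent at half maximum is proportional to each SD:
    fwhm_x = 2 * np.sqrt(2 * np.log(2)) * 1.0   # horizontal extent
    fwhm_y = 2 * np.sqrt(2 * np.log(2)) * 2.0   # vertical extent
    ```

    A fit of this function per subject yields the peak location and the horizontal/vertical extents of the zone, which is how a taller-than-wide zone can be quantified.
    
    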

    Autistic adults show preserved normalisation of sensory responses in gaze processing

    Progress in our understanding of autism spectrum disorder (ASD) has recently been sought by characterising how systematic differences in canonical neural computations employed across the sensory cortex might contribute to clinical symptoms in diverse sensory, cognitive, and social domains. A key proposal is that ASD is characterised by reduced divisive normalisation of sensory responses. This provides a bridge between genetic and molecular evidence for an increased ratio of cortical excitation to inhibition in ASD and the functional characteristics of sensory coding that are relevant for understanding perception and behaviour. Here we tested this hypothesis in the context of gaze processing (i.e., the perception of other people's direction of gaze), a domain with direct relevance to the core diagnostic features of ASD. We show that reduced divisive normalisation in gaze processing is associated with specific predictions regarding the psychophysical effects of sensory adaptation to gaze direction, and test these predictions in adults with ASD. We report compelling evidence that both divisive normalisation and sensory adaptation occur robustly in adults with ASD in the context of gaze processing. These results have important theoretical implications for defining the types of divisive computations that are likely to be intact or compromised in this condition (e.g., relating to local vs distal control of cortical gain). These results are also a strong testament to the typical sensory coding of gaze direction in ASD, despite the atypical responses to others' gaze that are a hallmark feature of this diagnosis. This research was supported by a Wellcome Trust Senior Clinical Research Fellowship (100227) awarded to GR and Australian Research Council Discovery Project (DP160102239) awarded to CC. This work was enabled partly by a study visit grant to CP from the Experimental Psychology Society. We thank all the participants who gave up their time to take part in this research.
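    The divisive-normalisation computation at the heart of this hypothesis has a standard form (the canonical Carandini–Heeger equation): each unit's driven input is divided by a pooled sum of population activity. The `strength` knob below is a hypothetical parameterisation of the reduced-normalisation account, not the study's actual model.

    ```python
    import numpy as np

    def divisive_normalization(drive, sigma=1.0, n=2.0, strength=1.0):
        """Canonical divisive normalisation: each unit's exponentiated
        drive is divided by a semi-saturation constant plus a pooled sum
        of population activity. `strength` scales the normalisation pool;
        the reduced-normalisation account of ASD corresponds to
        strength < 1, which inflates responses to strong inputs.
        (Illustrative sketch; parameters are hypothetical.)"""
        d = np.asarray(drive, dtype=float) ** n
        return d / (sigma ** n + strength * d.sum())

    typical = divisive_normalization([1.0, 2.0, 3.0])
    reduced = divisive_normalization([1.0, 2.0, 3.0], strength=0.5)
    ```

    Weakening the pool (`strength < 1`) raises every normalised response, which is one way an increased excitation-to-inhibition ratio could alter sensory gain control.
    
    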

    A Bayesian approach to person perception

    Here we propose a Bayesian approach to person perception, outlining the theoretical position and a methodological framework for testing the predictions experimentally. We use the term person perception to refer not only to the perception of others' personal attributes such as age and sex but also to the perception of social signals such as direction of gaze and emotional expression. The Bayesian approach provides a formal description of the way in which our perception combines current sensory evidence with prior expectations about the structure of the environment. Such expectations can lead to unconscious biases in our perception that are particularly evident when sensory evidence is uncertain. We illustrate the ideas with reference to our recent studies on gaze perception which show that people have a bias to perceive the gaze of others as directed towards themselves. We also describe a potential application to the study of the perception of a person's sex, in which a bias towards perceiving males is typically observed. This work is supported by Australian Research Council Discovery Project DP120102589. CC is supported by Australian Research Council Future Fellowship FT110100150.
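    The core Bayesian claim, that prior expectations bias perception most when sensory evidence is uncertain, follows directly from combining a Gaussian likelihood with a Gaussian prior. The sketch below assumes a prior centred on direct gaze (0 degrees); all numerical values are hypothetical.

    ```python
    def posterior_mean(measured_deg, sensory_sd, prior_sd, prior_mean=0.0):
        """Gaussian prior x Gaussian likelihood: the posterior mean is a
        reliability-weighted average of the noisy measurement and the
        prior. Here the prior is centred on direct gaze (0 deg).
        (Illustrative sketch with hypothetical parameter values.)"""
        w = prior_sd ** 2 / (prior_sd ** 2 + sensory_sd ** 2)  # weight on measurement
        return w * measured_deg + (1.0 - w) * prior_mean

    # Gaze measured as 10 deg averted, under a direct-gaze prior:
    precise = posterior_mean(10.0, sensory_sd=2.0, prior_sd=10.0)
    uncertain = posterior_mean(10.0, sensory_sd=10.0, prior_sd=10.0)
    ```

    With precise sensory evidence the percept stays close to the measurement; as sensory noise grows, the percept is pulled towards direct gaze, which is the direction of the bias reported in the gaze studies above.
    
    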