
    Culture shapes how we look at faces

    Background: Face processing, amongst many basic visual skills, is thought to be invariant across all humans. From as early as 1965, studies of eye movements have consistently revealed a systematic triangular sequence of fixations over the eyes and the mouth, suggesting that faces elicit a universal, biologically determined information extraction pattern. Methodology/Principal Findings: Here we monitored the eye movements of Western Caucasian and East Asian observers while they learned, recognized, and categorized by race Western Caucasian and East Asian faces. Western Caucasian observers reproduced a scattered triangular pattern of fixations for faces of both races and across tasks. Contrary to intuition, East Asian observers focused more on the central region of the face. Conclusions/Significance: These results demonstrate that face processing can no longer be considered as arising from a universal series of perceptual events. The strategy employed to extract visual information from faces differs across cultures.

    Speaking and Listening with the Eyes: Gaze Signaling during Dyadic Interactions

    Cognitive scientists have long been interested in the role that eye gaze plays in social interactions. Previous research suggests that gaze acts as a signaling mechanism and can be used to control turn-taking behaviour. However, early research on this topic employed methods of analysis that aggregated gaze information across an entire trial (or trials), which masks any temporal dynamics that may exist in social interactions. More recently, attempts have been made to understand the temporal characteristics of social gaze, but little research has been conducted in a natural setting with two interacting participants. The present study combines a temporally sensitive analysis technique with modern eye-tracking technology to (1) validate the overall results from earlier aggregated analyses and (2) provide insight into the specific moment-to-moment temporal characteristics of turn-taking behaviour in a natural setting. Dyads played two social guessing games (20 Questions and Heads Up) while their eyes were tracked. Our general results are in line with past aggregated data, and using cross-correlational analysis on the specific gaze and speech signals of both participants, we found that (1) speakers end their turn with direct gaze at the listener and (2) the listener in turn begins to speak with averted gaze. Convergent with theoretical models of social interaction, our data suggest that eye gaze can be used to signal both the end and the beginning of a speaking turn during a social interaction. The present study offers insight into the temporal dynamics of live dyadic interactions and also provides a new method of analysis for eye gaze data when temporal relationships are of interest.
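    The cross-correlational approach named in this abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' actual analysis pipeline: the function name and the synthetic gaze/speech signals are assumptions. The idea is to code each signal as a binary time series (e.g. 1 = gazing at partner, 1 = speaking), then correlate one signal against time-shifted copies of the other; the lag at which correlation peaks indicates which signal leads the other and by how much.

    ```python
    import numpy as np

    def cross_correlation(x, y, max_lag):
        """Normalized cross-correlation of two equal-length signals at
        integer lags from -max_lag to +max_lag. A peak at a positive lag
        means events in y tend to follow events in x by that many samples."""
        x = np.asarray(x, dtype=float) - np.mean(x)
        y = np.asarray(y, dtype=float) - np.mean(y)
        denom = np.std(x) * np.std(y) * len(x)
        lags = list(range(-max_lag, max_lag + 1))
        corr = []
        for lag in lags:
            if lag < 0:
                # compare x[t] with y[t + lag], lag negative
                r = np.sum(x[-lag:] * y[:lag]) / denom
            elif lag > 0:
                # compare x[t] with y[t + lag], lag positive
                r = np.sum(x[:-lag] * y[lag:]) / denom
            else:
                r = np.sum(x * y) / denom
            corr.append(r)
        return lags, corr

    # Synthetic example: a "speech offset" signal and a "gaze" signal that
    # switches on three samples later, mimicking a lagged relationship.
    speech = np.zeros(50)
    speech[10:20] = 1
    gaze = np.zeros(50)
    gaze[13:23] = 1

    lags, corr = cross_correlation(speech, gaze, max_lag=5)
    peak_lag = lags[int(np.argmax(corr))]  # gaze follows speech by 3 samples
    ```

    In the study's framing, a peak at a positive lag for the speaker's gaze-at-listener signal relative to their speech-offset signal would correspond to the reported pattern of speakers ending their turn with direct gaze.
    
    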

    Road users rarely use explicit communication when interacting in today’s traffic: Implications for Automated Vehicles

    To be successful, automated vehicles (AVs) need to be able to manoeuvre in mixed traffic in a way that will be accepted by road users, and maximises traffic safety and efficiency. A likely prerequisite for this success is for AVs to be able to communicate effectively with other road users in a complex traffic environment. The current study, conducted as part of the European project interACT, investigates the communication strategies used by drivers and pedestrians while crossing the road at six observed locations, across three European countries. In total, 701 road user interactions were observed and annotated, using an observation protocol developed for this purpose. The observation protocol identified 20 event categories, observed from the approaching vehicles/drivers and pedestrians. These included information about movement, looking behaviour, hand gestures, and signals used, as well as some demographic data. These observations illustrated that explicit communication techniques, such as honking, flashing headlights by drivers, or hand gestures by drivers and pedestrians, rarely occurred. This observation was consistent across sites. In addition, a follow-on questionnaire, administered to a sub-set of the observed pedestrians after crossing the road, found that when contemplating a crossing, pedestrians were more likely to use vehicle-based behaviour, rather than communication cues from the driver. Overall, the findings suggest that vehicle-based movement information, such as yielding cues, is more likely to be used by pedestrians while crossing the road than explicit communication cues from drivers, although some cultural differences were observed. The implications of these findings are discussed with respect to the design of suitable external interfaces and communication of intent by future automated vehicles.

    What Affects Social Attention? Social Presence, Eye Contact and Autistic Traits

    Social understanding is facilitated by effectively attending to other people and the subtle social cues they generate. In order to more fully appreciate the nature of social attention and what drives people to attend to social aspects of the world, one must investigate the factors that influence social attention. This is especially important when attempting to create models of disordered social attention, e.g. a model of social attention in autism. Here we analysed participants' viewing behaviour during one-to-one social interactions with an experimenter. Interactions were conducted either live or via video (social presence manipulation). Participants were asked questions and then required to answer them. Experimenter eye contact was either direct or averted. Additionally, the influence of participants' self-reported autistic traits was also investigated. We found that regardless of whether the interaction was conducted live or via a video, participants frequently looked at the experimenter's face, and they did this more often when being asked a question than when answering. Critical differences in social attention between the live and video interactions were also observed. Modifications of experimenter eye contact influenced participants' eye movements in the live interaction only, and increased autistic traits were associated with less looking at the experimenter for video interactions only. We conclude that analysing patterns of eye movements in response to both strictly controlled video stimuli and natural real-world stimuli furthers the field's understanding of the factors that influence social attention.

    Attention to Speech-Accompanying Gestures: Eye Movements and Information Uptake

    There is growing evidence that addressees in interaction integrate the semantic information conveyed by speakers’ gestures. Little is known, however, about whether and how addressees’ attention to gestures and the integration of gestural information can be modulated. This study examines the influence of a social factor (speakers’ gaze to their own gestures) and two physical factors (the gesture’s location in gesture space and gestural holds) on addressees’ overt visual attention to gestures (direct fixations of gestures) and their uptake of gestural information. It also examines the relationship between gaze and uptake. The results indicate that addressees’ overt visual attention to gestures is affected by both speakers’ gaze and holds, but for different reasons, whereas location in gesture space plays no role. Addressees’ uptake of gestural information is only influenced by speakers’ gaze. There is little evidence of a direct relationship between addressees’ direct fixations of gestures and their uptake.

    Resource security impacts men’s female breast size preferences

    It has been suggested that human female breast size may act as a signal of fat reserves, which in turn indicates access to resources. Based on this perspective, two studies were conducted to test the hypothesis that men experiencing relative resource insecurity should perceive larger breast size as more physically attractive than men experiencing resource security. In Study 1, 266 men from three sites in Malaysia varying in relative socioeconomic status (high to low) rated a series of animated figures varying in breast size for physical attractiveness. Results showed that men from the low socioeconomic context rated larger breasts as more attractive than did men from the medium socioeconomic context, who in turn perceived larger breasts as more attractive than did men from the high socioeconomic context. Study 2 compared the breast size judgements of 66 hungry versus 58 satiated men within the same environmental context in Britain. Results showed that hungry men rated larger breasts as significantly more attractive than satiated men did. Taken together, these studies provide evidence that resource security impacts men’s attractiveness ratings based on women’s breast size.

    Men’s oppressive beliefs predict their breast size preferences in women

    Previous studies of men’s breast size preferences have yielded equivocal findings, with studies variously indicating a preference for small, medium, or large breasts. Here, we examined the impact of men’s oppressive beliefs in shaping their female breast size ideals. British White men from the community in London, England (N = 361) viewed figures of women that rotated 360° and varied in breast size along five levels. They then rated the figure that they found most physically attractive and also completed measures assessing their sexist attitudes and tendency to objectify women. Results showed that medium breasts were most frequently rated as attractive (32.7 %), followed by large (24.4 %) and very large (19.1 %) breasts. Further analyses showed that men’s preferences for larger female breasts were significantly associated with a greater tendency to be benevolently sexist, to objectify women, and to be hostile towards women. These results are discussed in relation to feminist theories, which postulate that beauty ideals and practices in contemporary societies serve to maintain the domination of one sex over the other.

    Cues for Early Social Skills: Direct Gaze Modulates Newborns' Recognition of Talking Faces

    Previous studies have shown that, from birth, speech and eye gaze are two important cues guiding early face processing and social cognition. These studies tested the role of each cue independently; however, infants normally perceive speech and eye gaze together. Using a familiarization-test procedure, we first familiarized newborn infants (n = 24) with videos of unfamiliar talking faces with either direct gaze or averted gaze. Newborns were then tested with photographs of the previously seen face and of a new one. The newborns looked longer at the face that previously talked to them, but only in the direct gaze condition. These results highlight the importance of both speech and eye gaze as socio-communicative cues by which infants identify others. They suggest that gaze and infant-directed speech, experienced together, are powerful cues for the development of early social skills.