4 research outputs found

    Autistic young people adaptively use gaze to facilitate joint attention during multi-gestural dyadic interactions

    Autistic people often experience difficulties navigating face-to-face social interactions. Historically, the empirical literature has characterised these difficulties as cognitive ‘deficits’ in social information processing. However, the empirical basis for such claims is lacking, with most studies failing to capture the complexity of social interactions, often distilling them into singular communicative modalities (e.g. gaze-based communication) that are rarely used in isolation in daily interactions. The current study examined how gaze was used in concert with communicative hand gestures during joint attention interactions. We employed an immersive virtual reality paradigm, where autistic (n = 22) and non-autistic (n = 22) young people completed a collaborative task with a non-autistic confederate. Integrated eye-, head- and hand-motion-tracking enabled dyads to communicate naturally with each other while offering objective measures of attention and behaviour. Autistic people in our sample were similarly effective, if not more so, in responding to hand-cued joint attention bids compared with non-autistic people. Moreover, both autistic and non-autistic people demonstrated an ability to adaptively use gaze information to aid coordination. Our findings suggest that the intersecting fields of autism and social neuroscience research may have overstated the role of eye gaze during coordinated social interactions.

    Lay abstract: Autistic people have been said to have ‘problems’ with joint attention, that is, looking where someone else is looking. Past studies of joint attention have used tasks that require autistic people to continuously look at and respond to eye-gaze cues. But joint attention can also be achieved using other social cues, like pointing. This study looked at whether autistic and non-autistic young people use another person’s eye gaze during joint attention in a task that did not require them to look at their partner’s face. In the task, each participant worked together with their partner to find a computer-generated object in virtual reality. Sometimes the participant had to help guide their partner to the object, and other times, they followed their partner’s lead. Participants were told to point to guide one another but were not told to use eye gaze. Both autistic and non-autistic participants often looked at their partner’s face during joint attention interactions and were faster to respond to their partner’s hand-pointing when the partner also looked at the object before pointing. This shows that autistic people can and do use information from another person’s eyes, even when they don’t have to. It is possible that, by not forcing autistic young people to look at their partner’s face and eyes, they were better able to gather information from their partner’s face when needed, without being overwhelmed. These findings show how important it is to design tasks that provide autistic people with opportunities to show what they can do.

    sj-docx-2-aut-10.1177_13623613231211967 – Supplemental material for Autistic young people adaptively use gaze to facilitate joint attention during multi-gestural dyadic interactions

    No full text
    Supplemental material, sj-docx-2-aut-10.1177_13623613231211967 for Autistic young people adaptively use gaze to facilitate joint attention during multi-gestural dyadic interactions by Nathan Caruana, Patrick Nalepka, Glicyr A Perez, Christine Inkley, Courtney Munro, Hannah Rapaport, Simon Brett, David M Kaplan, Michael J Richardson and Elizabeth Pellicano in Autism

    sj-docx-1-aut-10.1177_13623613231211967 – Supplemental material for Autistic young people adaptively use gaze to facilitate joint attention during multi-gestural dyadic interactions

    No full text
    Supplemental material, sj-docx-1-aut-10.1177_13623613231211967 for Autistic young people adaptively use gaze to facilitate joint attention during multi-gestural dyadic interactions by Nathan Caruana, Patrick Nalepka, Glicyr A Perez, Christine Inkley, Courtney Munro, Hannah Rapaport, Simon Brett, David M Kaplan, Michael J Richardson and Elizabeth Pellicano in Autism