
    Feasibility of undertaking off-site infant eye-tracking assessments of neuro-cognitive functioning in early-intervention centres

    Recent work suggests that differences in functional brain development are already identifiable in 6- to 9-month-old infants from low socio-economic status (SES) backgrounds. Investigating early SES-related differences in neuro-cognitive functioning requires the recruitment of large and diverse samples of infants, yet it is often difficult to persuade low-SES parents to come to a university setting. One solution is to recruit infants through early intervention children’s centres (CCs), which are often located in areas of high relative deprivation to support young children. Given the increasing portability of eye-tracking equipment, assessment of large clusters of infants could be undertaken in centres by suitably trained early intervention staff. Here we report on a study involving 174 infants and their parents, carried out in partnership with CCs, exploring the feasibility of this approach. We report the process of setting up the project and recruiting participants, the diversity of the sample obtained, the engagement of CC staff in training and in the assessments themselves, the quality of the data obtained, and the levels of engagement of parents and infants. We conclude that this approach has great potential for recruiting large and diverse samples worldwide, provides sufficiently reliable data, and is engaging for staff, parents and infants.
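
    As a minimal sketch of one way the quality of off-site eye-tracking data might be summarised, the snippet below computes the proportion of valid gaze samples per infant. The file name, column names, and the 50% usability threshold are illustrative assumptions, not details taken from the study.

        # Illustrative sketch: summarise eye-tracking data quality per infant.
        # Assumes a CSV with columns 'participant' and 'valid' (1 = gaze detected,
        # 0 = track lost); the names and 50% threshold are assumptions.
        import pandas as pd

        samples = pd.read_csv("gaze_samples.csv")

        quality = (
            samples.groupby("participant")["valid"]
            .mean()                      # proportion of valid gaze samples
            .rename("valid_proportion")
            .reset_index()
        )

        # Flag recordings that might be too sparse to analyse reliably.
        quality["usable"] = quality["valid_proportion"] >= 0.50
        print(quality.describe())
        print(f"{quality['usable'].mean():.0%} of recordings meet the threshold")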

    Efficiency of scanning and attention to faces in infancy independently predict language development in a multiethnic and bilingual sample of 2-year-olds

    Efficient visual exploration in infancy is essential for cognitive and language development. It allows infants to participate in social interactions by attending to faces and learning about objects of interest. Visual scanning of scenes depends on a number of factors, and early differences in efficiency are likely to contribute to differences in learning and language development during subsequent years. Predicting language development in diverse samples is particularly challenging, as multiple additional sources of variability affect infant performance. In this study we tested how the complexity of visual scanning in the presence or absence of a face at 6-7 months of age relates to language development at 2 years of age in a multi-ethnic and predominantly bilingual sample from diverse socio-economic backgrounds. We used Recurrence Quantification Analysis to measure the temporal and spatial distribution of fixations recurring in the same area of a visual scene. We found that in the absence of a face, the temporal distribution of re-fixations on selected (but not all) objects of interest significantly predicted both receptive and expressive language scores, explaining 16-20% of the variance. A lower rate of recurring re-fixations in the presence of a face also predicted higher receptive language scores, suggesting a larger vocabulary in infants who effectively disengage from faces. Altogether, our results suggest that dynamic measures quantifying the complexity of visual scanning can reliably and robustly predict language development in highly diverse samples, and that selective attending to objects predicts language independently of attention to faces. As eye-tracking and language assessments were carried out in early intervention centres, our study demonstrates the utility of mobile eye-tracking setups for early detection of risk in attention and language development.
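
    As a rough illustration of how a scanning measure could be related to later language scores, the sketch below fits an ordinary-least-squares regression and reports explained variance. The variable names and simulated data are hypothetical; the 16-20% of variance quoted above refers to the authors' own models, not to this toy example.

        # Sketch: regress 2-year language scores on an infant scanning measure
        # and report R^2. Data are simulated for illustration only.
        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(0)
        refixation_rate = rng.uniform(0.1, 0.6, size=60)   # hypothetical RQA measure
        receptive_score = 100 - 25 * refixation_rate + rng.normal(0, 5, 60)

        X = refixation_rate.reshape(-1, 1)
        model = LinearRegression().fit(X, receptive_score)
        r_squared = model.score(X, receptive_score)
        print(f"variance explained: {r_squared:.2f}")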

    Beyond fixation durations: Recurrence quantification analysis reveals spatiotemporal dynamics of infant visual scanning

    Standard looking-duration measures in eye-tracking data provide only general quantitative indices, while details of the spatiotemporal structuring of fixation sequences are lost. To overcome this, various tools have been developed to measure the dynamics of fixations. However, these analyses are only useful when stimuli have high perceptual similarity, and they require the prior definition of areas of interest (AOIs). Although these methods have been widely applied in adult studies, relatively little is known about the temporal structuring of infant gaze-foraging behaviors, such as variability of scanning over time or individual scanning patterns. Thus, to shed more light on the spatiotemporal characteristics of infant fixation sequences, we apply for the first time a methodology for nonlinear time-series analysis: recurrence quantification analysis (RQA). We show how the dynamics of infant scanning vary depending on the scene content during a "pop-out" search task. Moreover, we show how normalizing RQA measures by average fixation durations provides a more detailed account of the dynamics of fixation sequences. Finally, we link the RQA measures of the temporal dynamics of scanning with spatial information about the stimuli using heat maps of recurrences, without the need to define AOIs a priori, and show how infants’ foraging strategies are driven by image content. We conclude that the RQA methodology has potential applications in the analysis of the temporal dynamics of infant visual foraging, offering advantages over existing methods.
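
    A minimal sketch of the core RQA step for fixation data, assuming fixations are given as (x, y) coordinates and two fixations "recur" when they fall within a fixed radius of each other. The radius value and the recurrence-rate summary follow generic RQA conventions rather than the paper's exact parameters.

        # Sketch: build a recurrence matrix from a fixation sequence and
        # compute the recurrence rate. The radius is an assumed free parameter.
        import numpy as np

        def recurrence_matrix(fixations, radius=50.0):
            """fixations: (n, 2) array of x, y coordinates in pixels."""
            diffs = fixations[:, None, :] - fixations[None, :, :]
            dists = np.linalg.norm(diffs, axis=-1)
            return dists <= radius

        def recurrence_rate(rec):
            """Proportion of recurrent fixation pairs, excluding the diagonal."""
            n = rec.shape[0]
            off_diag = rec.sum() - n      # diagonal entries are trivially recurrent
            return off_diag / (n * (n - 1))

        fix = np.array([[100, 120], [105, 118], [400, 300], [102, 121], [398, 305]], float)
        rec = recurrence_matrix(fix)
        print(f"recurrence rate: {recurrence_rate(rec):.2f}")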

    Differential habituation to repeated sounds in infants at high risk for autism

    It has been suggested that poor habituation to stimuli might explain atypical sensory behaviours in autism, i.e. over-responsiveness to some stimuli and under-sensitivity to others. We investigated habituation to repeated sounds using an oddball paradigm in 9-month-old infants with an older sibling with autism, and hence at high risk of developing autism. Auditory evoked responses to repeated sounds in control infants (at low risk of developing autism) decreased over time, demonstrating habituation, and their responses to deviant sounds were larger than responses to standard sounds, indicating discrimination. In contrast, neural responses in infants at high risk showed no habituation and reduced sensitivity to changes in frequency. Reduced sensory habituation may thus be present at a younger age than the emergence of autistic behaviour in some individuals, and we propose that it could play a role in the sensory atypicalities observed in autism.
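
    One generic way to quantify habituation to repeated standards is to fit a linear trend to evoked-response amplitudes across successive presentations and compare slopes between groups. The arrays below are simulated placeholders, not the study's data, and the slope measure is a simplification of the analyses typically used.

        # Sketch: habituation as the slope of ERP amplitude over repeated
        # standard sounds; a negative slope indicates decreasing responses.
        import numpy as np

        rng = np.random.default_rng(1)
        n_trials = 40
        trial_index = np.arange(n_trials)

        # Simulated amplitudes (microvolts): low-risk infants habituate,
        # high-risk infants do not -- purely illustrative values.
        low_risk = 6.0 - 0.05 * trial_index + rng.normal(0, 0.8, n_trials)
        high_risk = 6.0 + 0.00 * trial_index + rng.normal(0, 0.8, n_trials)

        for label, amps in [("low-risk", low_risk), ("high-risk", high_risk)]:
            slope, intercept = np.polyfit(trial_index, amps, deg=1)
            print(f"{label}: habituation slope = {slope:.3f} uV/trial")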

    Brain responses and looking behavior during audiovisual speech integration in infants predict auditory speech comprehension in the second year of life

    The use of visual cues during the processing of audiovisual (AV) speech is known to be less efficient in children and adults with language difficulties, and such difficulties are more prevalent in children from low-income populations. In the present study, we followed an economically diverse group of thirty-seven infants longitudinally from 6–9 months to 14–16 months of age. We used eye-tracking to examine whether individual differences in visual attention during AV speech processing at 6–9 months, particularly when processing congruent and incongruent auditory and visual speech cues, might be indicative of later language development. Twenty-two of these 6–9-month-old infants also participated in an event-related potential (ERP) AV task within the same experimental session. Language development was then followed up at 14–16 months using two measures, the Preschool Language Scale and the Oxford Communicative Development Inventory. The results show that infants who were less efficient in auditory speech processing at 6–9 months had lower receptive language scores at 14–16 months. A correlational analysis revealed that the pattern of face scanning and the ERP responses to audiovisually incongruent stimuli at 6–9 months were both significantly associated with language development at 14–16 months. These findings add to the understanding of individual differences in neural signatures of AV processing and the associated looking behavior in infants.
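
    The correlational part of such a longitudinal design can be illustrated with a Pearson correlation between a 6–9-month measure and a 14–16-month language score. The variable names and values below are simulated assumptions, not the study's data or its exact statistical pipeline.

        # Sketch: correlate an infant AV-processing measure with a later
        # receptive language score (simulated values, illustrative only).
        import numpy as np
        from scipy.stats import pearsonr

        rng = np.random.default_rng(2)
        n = 37
        mouth_looking_incongruent = rng.uniform(0.2, 0.8, n)  # proportion of looking time
        receptive_language = 90 + 20 * mouth_looking_incongruent + rng.normal(0, 6, n)

        r, p = pearsonr(mouth_looking_incongruent, receptive_language)
        print(f"r = {r:.2f}, p = {p:.3f}")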

    Exploring early developmental changes in face scanning patterns during the perception of audiovisual mismatch of speech cues

    Young infants are capable of integrating auditory and visual information, and their speech perception can be influenced by visual cues, while 5-month-olds detect mismatch between mouth articulations and speech sounds. From 6 months of age, infants gradually shift their attention away from the eyes and towards the mouth in articulating faces, potentially to benefit from the intersensory redundancy of audiovisual (AV) cues. Using eye tracking, we investigated whether 6- to 9-month-olds showed a similar age-related increase in looking to the mouth while observing congruent and redundant versus mismatched and non-redundant speech cues. Participants distinguished between congruent and incongruent AV cues, as reflected by the amount of looking to the mouth. They showed an age-related increase in attention to the mouth, but only for non-redundant, mismatched AV speech cues. Our results highlight the role of intersensory redundancy and audiovisual mismatch mechanisms in facilitating the development of speech processing in infants under 12 months of age.
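
    A minimal sketch of the kind of looking-time measure such studies rely on: the proportion of total looking time spent on a mouth area of interest, computed from fixation durations per condition. The AOI rectangle, column names, and example values are assumptions for illustration.

        # Sketch: proportion of looking time on a mouth AOI per condition.
        # The AOI bounds and fixation table layout are assumptions.
        import pandas as pd

        MOUTH_AOI = dict(x_min=300, x_max=500, y_min=400, y_max=520)  # pixels, hypothetical

        fixations = pd.DataFrame({
            "condition": ["congruent", "congruent", "incongruent", "incongruent"],
            "x": [350, 120, 420, 410],
            "y": [450, 200, 480, 470],
            "duration_ms": [400, 300, 650, 500],
        })

        in_mouth = (
            fixations["x"].between(MOUTH_AOI["x_min"], MOUTH_AOI["x_max"])
            & fixations["y"].between(MOUTH_AOI["y_min"], MOUTH_AOI["y_max"])
        )

        summary = (
            fixations.assign(mouth_ms=fixations["duration_ms"].where(in_mouth, 0))
            .groupby("condition")[["mouth_ms", "duration_ms"]]
            .sum()
        )
        summary["mouth_proportion"] = summary["mouth_ms"] / summary["duration_ms"]
        print(summary)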

    Atypical audiovisual speech integration in infants at risk for autism

    The language difficulties often seen in individuals with autism might stem from an inability to integrate audiovisual information, a skill important for language development. We investigated whether 9-month-old siblings of older children with autism, who are at an increased risk of developing autism, are able to integrate audiovisual speech cues. We used an eye-tracker to record where infants looked when shown a screen displaying two faces of the same model, where one face articulates /ba/ and the other /ga/, with one face congruent with the syllable sound presented simultaneously and the other face incongruent. This method was successful in showing that infants at low risk can integrate audiovisual speech: they looked for the same amount of time at the mouths in both the fusible visual /ga/ - audio /ba/ and the congruent visual /ba/ - audio /ba/ displays, indicating that the auditory and visual streams fuse into a McGurk-type syllabic percept in the incongruent condition. It also showed that low-risk infants could perceive a mismatch between auditory and visual cues: they looked longer at the mouth in the mismatched, non-fusible visual /ba/ - audio /ga/ display than in the congruent visual /ga/ - audio /ga/ display, demonstrating that they perceive an uncommon, and therefore interesting, speech-like percept when looking at the incongruent mouth (repeated-measures ANOVA, display × fusion/mismatch condition interaction: F(1,16) = 17.153, p = 0.001). The looking behaviour of high-risk infants did not differ according to the type of display, suggesting difficulties in matching auditory and visual information (repeated-measures ANOVA, display × condition interaction: F(1,25) = 0.09, p = 0.767), in contrast to low-risk infants (repeated-measures ANOVA, display × condition × risk group interaction: F(1,41) = 4.466, p = 0.041). In some cases this reduced ability might lead to the poor communication skills characteristic of autism.
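
    The reported F-tests are interactions from repeated-measures ANOVAs. A generic within-subject ANOVA of this shape can be run with statsmodels' AnovaRM, as sketched below on simulated long-format data; the factor names and values are illustrative, not the study's dataset.

        # Sketch: 2 (display) x 2 (condition) repeated-measures ANOVA on
        # mouth-looking times, using simulated data in long format.
        import numpy as np
        import pandas as pd
        from statsmodels.stats.anova import AnovaRM

        rng = np.random.default_rng(3)
        subjects = range(20)
        displays = ["ba_face", "ga_face"]
        conditions = ["fusible", "mismatch"]

        rows = [
            {"subject": s, "display": d, "condition": c,
             "looking_ms": 500 + rng.normal(0, 50)}
            for s in subjects for d in displays for c in conditions
        ]
        data = pd.DataFrame(rows)

        result = AnovaRM(data, depvar="looking_ms", subject="subject",
                         within=["display", "condition"]).fit()
        print(result)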

    Socioeconomic status and functional brain development - associations in early infancy

    Socioeconomic status (SES) affects both structural and functional brain development in childhood, but how early its effects can be demonstrated is unknown. In this study we measured resting baseline EEG activity in the gamma frequency range in awake 6–9-month-olds from areas of East London with high socioeconomic deprivation. Between-subject comparisons of infants from low- and high-income families revealed significantly lower frontal gamma power in infants from low-income homes. Similar power differences were found when comparing infants according to maternal occupation, with lower occupational status groups yielding lower power. Infant sleep, maternal education, length of gestation, and birth weight, as well as smoke exposure and bilingualism, did not explain these differences. Our results show that the effects of socioeconomic disparities on brain activity can already be detected in early infancy, potentially pointing to very early risk for language and attention difficulties. This is the first study to reveal region-selective differences in functional brain development associated with low family income in early infancy.
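
    Gamma-band power at a frontal channel can be estimated from resting EEG with a Welch power spectral density, as sketched below. The sampling rate, the band limits (20–45 Hz here), and the simulated signal are assumptions, not the study's exact acquisition or analysis parameters.

        # Sketch: estimate gamma-band power from a resting EEG channel using
        # Welch's method. Band limits and sampling rate are assumptions.
        import numpy as np
        from scipy.signal import welch

        fs = 250                      # Hz, assumed sampling rate
        gamma_band = (20.0, 45.0)     # Hz, assumed infant gamma range

        rng = np.random.default_rng(4)
        eeg = rng.normal(0, 1, fs * 60)   # 60 s of simulated frontal-channel data

        freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)
        mask = (freqs >= gamma_band[0]) & (freqs <= gamma_band[1])
        gamma_power = psd[mask].mean()
        print(f"frontal gamma power: {gamma_power:.4f} (arbitrary units)")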

    Functional near infrared spectroscopy (fNIRS) to assess cognitive function in infants in rural Africa

    Cortical mapping of cognitive function during infancy is poorly understood in low-income countries due to the lack of transportable neuroimaging methods. We have successfully piloted functional near infrared spectroscopy (fNIRS) as a neuroimaging tool in rural Gambia. Four- to eight-month-old infants watched videos of Gambian adults performing social movements, while haemodynamic responses were recorded using fNIRS. We found distinct regions of the posterior superior temporal and inferior frontal cortex that evidenced either visual-social activation or vocally selective activation (vocal > non-vocal). The patterns of selective cortical activation in Gambian infants replicated those observed in similarly aged infants in the UK. These are the first reported data on the measurement of localized functional brain activity in young infants in Africa and demonstrate the potential that fNIRS offers for field-based neuroimaging research of cognitive function in resource-poor rural communities.
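
    A generic sketch of the kind of contrast described (vocal > non-vocal): the mean oxyhaemoglobin change in a response window is computed per channel and condition, and the difference identifies vocally selective channels. The epoch shapes, window, sampling rate, and simulated values are all assumptions for illustration.

        # Sketch: per-channel vocal > non-vocal contrast on simulated fNIRS
        # oxyhaemoglobin epochs. Shapes, window, and units are assumptions.
        import numpy as np

        rng = np.random.default_rng(5)
        fs = 10                                   # Hz, assumed fNIRS sampling rate
        n_trials, n_channels, n_samples = 12, 20, fs * 20   # 20 s epochs

        # Simulated epochs (trials x channels x time), in micromolar change.
        vocal = rng.normal(0.2, 0.5, (n_trials, n_channels, n_samples))
        non_vocal = rng.normal(0.0, 0.5, (n_trials, n_channels, n_samples))

        window = slice(5 * fs, 15 * fs)           # assumed 5-15 s response window
        vocal_mean = vocal[:, :, window].mean(axis=(0, 2))
        non_vocal_mean = non_vocal[:, :, window].mean(axis=(0, 2))

        contrast = vocal_mean - non_vocal_mean    # positive = vocally selective channel
        print("most vocally selective channel:", int(contrast.argmax()))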