142 research outputs found
No Own-Age Bias in Children's Gaze-Cueing Effects
Sensitivity to another person's eye gaze is vital for social and language development. In this eye-tracking study, a group of 74 children (6–14 years old) performed a gaze-cueing experiment in which another person's shift in eye gaze potentially cued the location of a peripheral target. The aim of the present study was to investigate whether children's gaze-cueing effects are modulated by the other person's age. In half of the trials, the gaze cue was given by adult models; in the other half, by child models. Regardless of the models' ages, children displayed an overall gaze-cueing effect. However, results showed no indication of an own-age bias in performance on the gaze-cueing task; the gaze-cueing effect was similar for child and adult face cues. These results did not change when we looked at the performance of a subsample of participants (n = 23) who closely matched the age of the child models. Our results do not allow us to determine whether children are insensitive to a model's age or whether they consider models of either age to be equally informative. Future research should aim to disentangle these two possibilities.
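For readers unfamiliar with the paradigm, a gaze-cueing effect is typically quantified as the reaction-time advantage for targets appearing at the gazed-at (congruent) location over the non-gazed-at (incongruent) location. The sketch below illustrates that computation per cue-model age group; the column names and CSV layout are assumptions for illustration, not the study's actual data format.

```python
# Minimal sketch: gaze-cueing effect = mean RT(incongruent) - mean RT(congruent),
# computed separately for child-model and adult-model cues.
# Column names ('correct', 'model_age_group', 'congruency', 'rt_ms') are assumed.
import pandas as pd

def gaze_cueing_effect(trials: pd.DataFrame) -> pd.Series:
    """Return the cueing effect (ms) per model age group (child vs. adult cue)."""
    correct = trials[trials["correct"]]                      # keep correct responses only
    mean_rt = (correct
               .groupby(["model_age_group", "congruency"])["rt_ms"]
               .mean()
               .unstack("congruency"))
    return mean_rt["incongruent"] - mean_rt["congruent"]     # positive value = cueing effect

# Example usage with hypothetical data:
# effects = gaze_cueing_effect(pd.read_csv("gaze_cueing_trials.csv"))
```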
Brief Report: Eye Movements During Visual Search Tasks Indicate Enhanced Stimulus Discriminability in Subjects with PDD
Subjects with PDD excel at certain visuo-spatial tasks, among them visual search, and this has been attributed to enhanced perceptual discrimination. However, an alternative explanation is that subjects with PDD use a different, more effective search strategy. The present study aimed to test both hypotheses by measuring eye movements during visual search tasks in high-functioning adult men with PDD and a control group. Subjects with PDD were significantly faster than controls on these tasks, replicating earlier findings in children. Eye movement data showed that subjects with PDD made fewer eye movements than controls. No evidence was found for a different search strategy between the groups. The data indicate an enhanced ability to discriminate between stimulus elements in PDD.
Block design reconstruction skills: not a good candidate for an endophenotypic marker in autism research
Superior performance on block design tasks has been reported in autistic individuals, although it is not consistently found in high-functioning individuals or individuals with Asperger Syndrome. It is assumed to reflect weak central coherence: an underlying cognitive deficit that might also be part of the genetic makeup of the disorder. We assessed block design reconstruction skills in high-functioning individuals with autism spectrum disorders (ASD) from multi-incidence families and in their parents, and compared performance to that of relevant matched control groups. We used a task assumed to be highly sensitive to subtle performance differences. Individuals with ASD were not significantly faster on this task than the matched control group, not even when the difference between reconstruction times for segmented and pre-segmented designs was compared. However, individuals with ASD made fewer errors during the reconstruction process, which might indicate some facility in mental segmentation. Parents of individuals with ASD, on the other hand, did not perform better on the task than control parents. Based on our data, we therefore conclude that mental segmentation ability as measured with a block design reconstruction task is not a neurocognitive marker or endophenotype useful in genetic studies.
Brain Responses to Faces and Facial Expressions in 5-Month-Olds: An fNIRS Study
Processing faces and understanding facial expressions are crucial skills for social communication. In adults, basic face processing and facial emotion processing rely on specific interacting brain networks. In infancy, however, little is known about when and how these networks develop. The current study uses functional near-infrared spectroscopy (fNIRS) to measure differences in 5-month-olds' brain activity in response to fearful and happy facial expressions. Our results show that the right occipital region responds to faces, indicating that the face processing network is active at 5 months. Yet sensitivity to facial emotions appears to be still immature at this age: exploratory analyses suggest that if the facial emotion processing network were active, this would be visible mainly in the temporal cortex. Together, these results indicate that at 5 months, occipital areas already show sensitivity to faces, while the facial emotion processing network does not yet seem fully developed.
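As a rough illustration of the kind of condition contrast such an fNIRS study relies on, the sketch below block-averages oxygenated-haemoglobin (HbO) epochs per condition and compares the mean response in a post-stimulus window for one channel. The sampling rate, response window, and array layout are assumptions, not the parameters used in the study.

```python
# Hedged sketch of an fNIRS block-average contrast (illustrative, not the study's pipeline).
import numpy as np

FS = 10.0                      # fNIRS sampling rate in Hz (assumed)
RESPONSE_WIN_S = (4.0, 12.0)   # haemodynamic response window after stimulus onset (assumed)

def condition_response(hbo_epochs: np.ndarray) -> float:
    """hbo_epochs: (n_trials, n_samples) HbO epochs for one channel, time-locked to onset."""
    block_average = hbo_epochs.mean(axis=0)          # average over trials
    i0, i1 = (int(t * FS) for t in RESPONSE_WIN_S)
    return float(block_average[i0:i1].mean())        # mean HbO change in the response window

# Condition difference for one occipital channel (hypothetical epoch arrays):
# emotion_effect = condition_response(fearful_epochs) - condition_response(happy_epochs)
```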
Noise-robust fixation detection in eye movement data: Identification by two-means clustering (I2MC)
Eye-tracking research in infants and older children has gained a lot of momentum over the last decades. Although eye-tracking research in these participant groups has become easier with the advance of the remote eye-tracker, this often comes at the cost of poorer data quality than in research with well-trained adults (Hessels, Andersson, Hooge, Nyström, & Kemner, Infancy, 20, 601–633, 2015; Wass, Forssman, & Leppänen, Infancy, 19, 427–460, 2014). Current fixation detection algorithms are not built for data from infants and young children. As a result, some researchers have even turned to hand correction of fixation detections (Saez de Urabain, Johnson, & Smith, Behavior Research Methods, 47, 53–72, 2015). Here we introduce a fixation detection algorithm, identification by two-means clustering (I2MC), built specifically for data across a wide range of noise levels and for data containing periods of data loss. We evaluated the I2MC algorithm against seven state-of-the-art event detection algorithms and report that the I2MC algorithm's output is the most robust to high noise and data loss levels. The algorithm is automatic, works offline, and is suitable for eye-tracking data recorded with remote or tower-mounted eye-trackers using static stimuli. In addition to its application in eye-tracking research with infants, school children, and certain patient groups, the I2MC algorithm may also be useful when noise and data loss levels differ markedly between trials, participants, or time points (e.g., in longitudinal research).
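The sketch below illustrates the two-means-clustering idea behind I2MC; it is not the authors' published implementation (window handling, downsampling, interpolation of data loss, and thresholding differ in detail). Within a sliding window, gaze samples are split into two clusters; windows straddling a saccade separate cleanly into two distant clusters, so the samples there receive a high clustering weight, and stretches of low weight are kept as fixation candidates.

```python
# Illustrative sketch of sliding-window two-means clustering for fixation detection.
# Parameters (window length, threshold rule) are assumptions, not the published defaults.
import numpy as np

def two_means(points, n_iter=10):
    """Plain 2-means clustering of (n, 2) gaze positions, initialised on first/last sample."""
    c0, c1 = points[0].astype(float), points[-1].astype(float)
    labels = np.zeros(len(points), dtype=int)
    for _ in range(n_iter):
        d0 = np.linalg.norm(points - c0, axis=1)
        d1 = np.linalg.norm(points - c1, axis=1)
        labels = (d1 < d0).astype(int)
        if labels.all() or not labels.any():          # degenerate split: stop early
            break
        c0, c1 = points[labels == 0].mean(axis=0), points[labels == 1].mean(axis=0)
    return labels, c0, c1

def clustering_weights(x, y, fs, window_ms=200):
    """Per-sample clustering weight from sliding-window two-means clustering."""
    n = len(x)
    win = max(2, int(round(window_ms / 1000 * fs)))
    weights, counts = np.zeros(n), np.zeros(n)
    for start in range(0, n - win + 1):
        seg = np.column_stack((x[start:start + win], y[start:start + win]))
        labels, c0, c1 = two_means(seg)
        transitions = max(1, np.count_nonzero(np.diff(labels)))
        w = np.linalg.norm(c0 - c1) / transitions     # clean, distant split -> high weight
        weights[start:start + win] += w
        counts[start:start + win] += 1
    return weights / np.maximum(counts, 1)            # average over overlapping windows

def fixation_candidates(weights, n_sd=2.0):
    """Mark samples below mean + n_sd*SD of the weights as belonging to fixations."""
    return weights < weights.mean() + n_sd * weights.std()
```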
Multisensory Integration and Attention in Autism Spectrum Disorder: Evidence from Event-Related Potentials
Successful integration of simultaneously perceived perceptual signals is crucial for social behavior. Recent findings indicate that this multisensory integration (MSI) can be modulated by attention. Theories of autism spectrum disorders (ASD) suggest that MSI is affected in this population, while it remains unclear to what extent this is related to impairments in attentional capacity. In the present study, event-related potentials (ERPs) following emotionally congruent and incongruent face-voice pairs were measured in 23 high-functioning adult individuals with ASD and 24 age- and IQ-matched controls. MSI was studied while the attention of the participants was manipulated. ERPs were measured at typical auditory and visual processing peaks, namely the P2 and N170. While controls showed MSI during both divided-attention and easy selective-attention tasks, individuals with ASD showed MSI during easy selective-attention tasks only. We conclude that individuals with ASD are able to process multisensory emotional stimuli, but that this processing is modulated differently by attention mechanisms in these participants, especially those associated with divided attention. This atypical interaction between attention and MSI is also relevant to treatment strategies, with training of multisensory attentional control possibly being more beneficial than conventional sensory integration therapy.
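As background on the analysis, ERP components such as the N170 and P2 are usually read out as peak amplitudes of the trial-averaged waveform within typical latency windows, which can then be contrasted between congruent and incongruent pairs. The sketch below shows that step; the windows, sampling rate, and array layout are assumptions for illustration, not the study's actual parameters.

```python
# Minimal sketch of ERP peak extraction for one channel and one condition.
# Latency windows and sampling rate are assumed, typical values only.
import numpy as np

FS = 500                    # sampling rate in Hz (assumed)
BASELINE_S = 0.2            # epochs assumed to start 200 ms before stimulus onset
WINDOWS = {"N170": (0.13, 0.20), "P2": (0.15, 0.25)}   # seconds post-onset (typical ranges)

def peak_amplitudes(epochs: np.ndarray) -> dict:
    """epochs: (n_trials, n_samples) single-channel EEG epochs for one condition."""
    erp = epochs.mean(axis=0)                           # average over trials -> ERP
    out = {}
    for name, (t0, t1) in WINDOWS.items():
        i0, i1 = int((BASELINE_S + t0) * FS), int((BASELINE_S + t1) * FS)
        seg = erp[i0:i1]
        # N170 is a negative deflection, P2 a positive one
        out[name] = seg.min() if name.startswith("N") else seg.max()
    return out

# Congruence effect per component, with hypothetical epoch arrays:
# effect = {k: peak_amplitudes(congruent)[k] - peak_amplitudes(incongruent)[k] for k in WINDOWS}
```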
Two-year-olds at elevated risk for ASD can learn novel words from their parents
Children diagnosed with autism spectrum disorder (ASD) often have smaller vocabularies in infancy than typically developing children. To understand whether their smaller vocabularies stem from problems in learning, our study compared a prospective risk sample of 18 elevated-risk and 11 lower-risk 24-month-olds on current vocabulary size and word learning ability, using a paradigm in which parents teach their child words. Results revealed that both groups learned novel words, even though parents indicated that infants at elevated risk of ASD knew fewer words. This suggests that these early compromised vocabularies cannot be solely linked to difficulties in word learning.
The direct and indirect effects of parenting behaviors and functional brain network efficiency on self-regulation from infancy to early childhood: A longitudinal mediation model
There is growing interest in the hypothesis that early parenting behaviors affect children's self-regulation by shaping children's developing brain networks. Yet most prior research on the development of self-regulation has focused on either environmental or neurobiological factors. The aim of the current study was to expand the literature by examining direct and indirect effects of variations in parenting behaviors (support and stimulation) and efficiency of functional brain networks (small-worldness) on individual differences in child self-regulation, using a three-wave longitudinal model in a sample of 109 infants and their mothers. Results revealed that parental support predicted child self-regulation at 5 months, 10 months, and 3 years of age. This effect was not mediated by infants' small-worldness within the alpha and theta rhythms. Parental stimulation predicted higher levels of infants' alpha small-worldness, whereas parental support predicted lower levels of infants' theta small-worldness. Thus, parents may need to stimulate their infants to explore the environment autonomously in order for them to develop more efficient functional brain networks. The findings of the current study highlight potential influences of both extrinsic environmental factors and intrinsic neurobiological factors on child self-regulation, emphasizing the role of parental support as a form of external regulation during infancy, when the brain is not yet sufficiently developed to perform self-regulation itself.
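For context, small-worldness is commonly expressed as the index sigma = (C / C_rand) / (L / L_rand), where C is the average clustering coefficient and L the characteristic path length, each normalised by the values of size-matched random graphs. The sketch below shows that computation for a thresholded binary connectivity matrix; the thresholding step and any band-specific details of the study's EEG pipeline are not reproduced here and are assumptions.

```python
# Hedged sketch of a small-world index for one (assumed binary, connected) network.
import networkx as nx
import numpy as np

def small_world_index(adjacency: np.ndarray, n_random: int = 20, seed: int = 0) -> float:
    """adjacency: symmetric 0/1 connectivity matrix of one frequency band."""
    g = nx.from_numpy_array(adjacency)
    c = nx.average_clustering(g)
    l = nx.average_shortest_path_length(g)            # requires a connected graph
    n, m = g.number_of_nodes(), g.number_of_edges()
    rng = np.random.default_rng(seed)
    c_rand, l_rand = [], []
    for _ in range(n_random):                          # size-matched random reference graphs
        r = nx.gnm_random_graph(n, m, seed=int(rng.integers(1_000_000)))
        if not nx.is_connected(r):
            continue                                   # skip disconnected reference graphs
        c_rand.append(nx.average_clustering(r))
        l_rand.append(nx.average_shortest_path_length(r))
    return (c / np.mean(c_rand)) / (l / np.mean(l_rand))
```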
Gaze and speech behavior in parent–child interactions: The role of conflict and cooperation
A primary mode of human social behavior is face-to-face interaction. In this study, we investigated the characteristics of gaze and its relation to speech behavior during video-mediated face-to-face interactions between parents and their preadolescent children. Eighty-one parent–child dyads engaged in conversations about cooperative and conflictive family topics. We used a dual eye-tracking setup capable of concurrently recording eye movements, frontal video, and audio from two conversational partners. Our results show that children spoke more in the cooperation scenario, whereas parents spoke more in the conflict scenario. Parents gazed slightly more at the eyes of their children in the conflict scenario than in the cooperation scenario. Both parents and children looked more at the other's mouth region while listening than while speaking. Results are discussed in terms of the roles that parents and children take during cooperative and conflictive interactions and how gaze behavior may support and coordinate such interactions.
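One common way to relate gaze to speech in such dual eye-tracking data is to split each participant's gaze samples into speaking and listening periods and compute the proportion of time spent on each area of interest (eyes, mouth). The sketch below shows that step; the column names and annotation format are assumptions for illustration, not the study's actual pipeline.

```python
# Minimal sketch: proportion of gaze time per AOI, split by speaking vs. listening.
# Assumed columns: 'speech_state' ('speaking'/'listening') and 'aoi' ('eyes'/'mouth'/'other').
import pandas as pd

def gaze_by_speech_state(samples: pd.DataFrame) -> pd.DataFrame:
    """samples: one row per gaze sample of one participant in one conversation."""
    counts = samples.groupby(["speech_state", "aoi"]).size()
    props = counts / counts.groupby(level="speech_state").transform("sum")
    return props.unstack("aoi")                        # rows: speech state, columns: AOI

# Example with hypothetical data:
# props = gaze_by_speech_state(pd.read_csv("dyad_012_parent_samples.csv"))
# props.loc["listening", "mouth"]   # proportion of listening time spent on the mouth
```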
- …