
    Inferring emotion without language: Comparing canines and prelinguistic infants

    Research on canine emotions faces challenges quite similar to those of psychological research on social and emotional development in human infants. In both cases, verbal reports are unattainable, and behavioral and physiological methods have to be adjusted to the specific population. I will argue that, regarding both empirical approaches and conceptual work, advances in research on social-cognitive development in human infants can inform the study of canine emotions.

    Young Infants' Neural Processing of Objects Is Affected by Eye Gaze Direction and Emotional Expression

    Eye gaze is an important social cue which is used to determine another person's focus of attention and intention to communicate. In combination with a fearful facial expression, eye gaze can also signal threat in the environment. The ability to detect and understand others' social signals is essential in order to avoid danger and enable social evaluation. It has been a matter of debate when infants are able to use gaze cues and emotional facial expressions in reference to external objects. Here we demonstrate that by 3 months of age the infant brain differentially responds to objects as a function of how other people are reacting to them. Using event-related electrical brain potentials (ERPs), we show that an indicator of infants' attention is enhanced by an adult's expression of fear toward an unfamiliar object. The infant brain showed an increased Negative central (Nc) component toward objects that had been previously cued by an adult's eye gaze and frightened facial expression. Our results further suggest that infants' sensitivity cannot be due to a general arousal elicited by a frightened face with eye gaze directed at an object. The neural attention system of 3-month-old infants is sensitive to an adult's eye gaze direction in combination with a fearful expression. This early capacity may lay the foundation for the development of more sophisticated social skills such as social referencing, language, and theory of mind.
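    The Nc analysis described above rests on averaging stimulus-locked EEG epochs and comparing mean amplitude in a time window. A minimal Python sketch, using synthetic data and an illustrative 300–700 ms window (the paper's exact electrodes, window, and amplitudes are not given here):

    ```python
    import numpy as np

    def nc_mean_amplitude(epochs, times, window=(0.3, 0.7)):
        """Mean amplitude of the trial-averaged ERP within an Nc time window.

        epochs : array (n_trials, n_samples), EEG at a fronto-central site
        times  : array (n_samples,), epoch time points in seconds
        window : (start, end) of the analysis window in seconds
        """
        erp = epochs.mean(axis=0)                       # average across trials
        mask = (times >= window[0]) & (times <= window[1])
        return erp[mask].mean()

    # Synthetic demo: two conditions, one with a larger negative deflection.
    rng = np.random.default_rng(0)
    times = np.linspace(-0.2, 1.0, 601)

    def simulate(amp, n_trials=30):
        nc = amp * np.exp(-((times - 0.5) ** 2) / (2 * 0.1 ** 2))  # Gaussian "Nc"
        return nc + rng.normal(0, 1.0, (n_trials, times.size))

    fear_cued = simulate(-8.0)   # object previously cued by fearful gaze
    neutral = simulate(-4.0)
    print(nc_mean_amplitude(fear_cued, times) < nc_mean_amplitude(neutral, times))
    # -> True: fear-cued objects elicit a larger (more negative) Nc
    ```

    Averaging across trials cancels activity not phase-locked to stimulus onset, which is why the condition difference survives single-trial noise.
    
    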

    Communicative signals during joint attention promote neural processes of infants and caregivers

    Communicative signals such as eye contact increase infants’ brain activation to visual stimuli and promote joint attention. Our study assessed whether communicative signals during joint attention enhance infant-caregiver dyads’ neural responses to objects, and their neural synchrony. To track mutual attention processes, we applied rhythmic visual stimulation (RVS), presenting images of objects to 12-month-old infants and their mothers (n = 37 dyads), while we recorded dyads’ brain activity (i.e., steady-state visual evoked potentials, SSVEPs) with electroencephalography (EEG) hyperscanning. Within dyads, mothers either communicatively showed the images to their infant or watched the images without communicative engagement. Communicative cues increased infants’ and mothers’ SSVEPs at central-occipital-parietal and central electrode sites, respectively. Infants showed significantly more gaze behaviour to images during communicative engagement. Dyadic neural synchrony (SSVEP amplitude envelope correlations, AECs) was not modulated by communicative cues. Taken together, maternal communicative cues in joint attention increase infants’ neural responses to objects, and shape mothers’ own attention processes. We show that communicative cues enhance cortical visual processing and thus play an essential role in social learning. Future studies need to elucidate the effect of communicative cues on neural synchrony during joint attention. Finally, our study introduces RVS to study infant-caregiver neural dynamics in social contexts.
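    The amplitude envelope correlation (AEC) used above as the synchrony measure can be sketched in a few lines: band-pass each signal around the stimulation frequency, take the Hilbert envelope, and correlate the envelopes. The sampling rate, stimulation frequency, and noise levels below are illustrative, not the study's actual parameters:

    ```python
    import numpy as np
    from scipy.signal import hilbert, butter, filtfilt

    def envelope(signal, fs, f0, bw=1.0):
        """Amplitude envelope of a signal band-passed around the stimulation
        frequency f0 (Hz), via the Hilbert analytic signal."""
        b, a = butter(4, [(f0 - bw) / (fs / 2), (f0 + bw) / (fs / 2)], btype="band")
        return np.abs(hilbert(filtfilt(b, a, signal)))

    def aec(x, y, fs, f0):
        """Amplitude envelope correlation between two participants' signals."""
        ex, ey = envelope(x, fs, f0), envelope(y, fs, f0)
        return np.corrcoef(ex, ey)[0, 1]

    # Synthetic demo: infant and mother share a slow modulation of a 6 Hz
    # steady-state response (an illustrative, hypothetical frequency).
    rng = np.random.default_rng(1)
    fs = 250
    t = np.arange(0, 20, 1 / fs)
    mod = 1.0 + 0.5 * np.sin(2 * np.pi * 0.2 * t)   # shared attention rhythm
    infant = mod * np.sin(2 * np.pi * 6 * t) + 0.3 * rng.normal(size=t.size)
    mother = mod * np.sin(2 * np.pi * 6 * t + 1.0) + 0.3 * rng.normal(size=t.size)
    print(aec(infant, mother, fs, 6))
    ```

    Because the envelope discards phase, AEC captures co-fluctuation of response strength rather than phase locking, which suits slow, attention-driven coupling between two brains.
    
    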

    Corrigendum: Reduced Mu Power in Response to Unusual Actions Is Context-Dependent in 1-Year-Olds

    During social interactions infants predict and evaluate other people’s actions. Previous behavioral research found that infants’ imitation of others’ actions depends on these evaluations and is context-dependent: 1-year-olds predominantly imitated an unusual action (turning on a lamp with one’s forehead) when the model’s hands were free compared to when the model’s hands were occupied or restrained. In the present study, we adapted this behavioral paradigm to a neurophysiological study measuring infants’ brain activity while observing usual and unusual actions via electroencephalography. In particular, we measured differences in mu power (6–8 Hz) associated with motor activation. In a between-subjects design, 12- to 14-month-old infants watched videos of adult models demonstrating that their hands were either free or restrained. Subsequent test frames showed the models turning on a lamp or a soundbox by using their head or their hand. Results in the hands-free condition revealed that 12- to 14-month-olds displayed a reduction of mu power in frontal regions in response to unusual and thus unexpected actions (head touch) compared to usual and expected actions (hand touch). This may be explained by increased motor activation required for updating prior action predictions in response to unusual actions, though alternative explanations in terms of general attention or cognitive control processes may also be considered. In the hands-restrained condition, responses in the mu frequency band did not differ between action outcomes. This implies that unusual head-touch actions compared to hand-touch actions do not necessarily evoke a reduction of mu power. Thus, we conclude that reduction of mu frequency power is context-dependent during infants’ action perception. Our results are interpreted in terms of motor system activity, measured via changes in the mu frequency band, as being one important neural mechanism involved in action prediction and evaluation from early on.

    A Guide to Parent-Child fNIRS Hyperscanning Data Processing and Analysis

    The use of functional near-infrared spectroscopy (fNIRS) hyperscanning during naturalistic interactions in parent–child dyads has substantially advanced our understanding of the neurobiological underpinnings of human social interaction. However, despite the rise of developmental hyperscanning studies in recent years, analysis procedures have not yet been standardized and are often individually developed by each research team. This article offers a guide on parent–child fNIRS hyperscanning data analysis in MATLAB and R. We provide an example dataset of 20 dyads assessed during a cooperative versus individual problem-solving task, with brain signal acquired using 16 channels located over bilateral frontal and temporo-parietal areas. We use the MATLAB toolboxes Homer2 and SPM for fNIRS to preprocess the acquired brain signal data and suggest a standardized procedure. Next, we calculate interpersonal neural synchrony between dyads using Wavelet Transform Coherence (WTC) and illustrate how to run a random pair analysis to control for spurious correlations in the signal. We then use RStudio to estimate Generalized Linear Mixed Models (GLMM) to account for the bounded distribution of coherence values for interpersonal neural synchrony analyses. With this guide, we hope to offer advice for future parent–child fNIRS hyperscanning investigations and to enhance replicability within the field.
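    The random-pair control mentioned above compares synchrony in true dyads against synchrony in randomly re-paired parent/child signals. A minimal Python sketch of that logic, using absolute zero-lag correlation as a stand-in synchrony measure (the guide itself computes wavelet transform coherence in MATLAB) and fully synthetic signals with illustrative parameters:

    ```python
    import numpy as np

    def dyad_synchrony(x, y):
        """Simplified synchrony stand-in: absolute zero-lag correlation."""
        return abs(np.corrcoef(x, y)[0, 1])

    def random_pair_control(parents, children, n_perm=50, seed=0):
        """Mean synchrony of true dyads vs. randomly re-paired signals,
        a control for coupling driven by the shared task structure alone.

        parents, children : arrays (n_dyads, n_samples); row i is dyad i.
        Returns (true_mean, permuted_mean).
        """
        rng = np.random.default_rng(seed)
        n = len(parents)
        true = np.mean([dyad_synchrony(parents[i], children[i]) for i in range(n)])
        perm = []
        for _ in range(n_perm):
            idx = rng.permutation(n)
            perm.extend(dyad_synchrony(parents[i], children[idx[i]])
                        for i in range(n) if idx[i] != i)
        return float(true), float(np.mean(perm))

    # Synthetic demo: each dyad shares its own slow oscillation (hypothetical
    # frequencies, not taken from the guide's dataset).
    rng = np.random.default_rng(2)
    t = np.arange(0, 60, 0.1)                  # 10 Hz sampling, 60 s
    freqs = 0.05 + 0.03 * np.arange(6)         # one frequency per dyad
    parents = np.array([np.sin(2 * np.pi * f * t) + 0.5 * rng.normal(size=t.size)
                        for f in freqs])
    children = np.array([np.sin(2 * np.pi * f * t) + 0.5 * rng.normal(size=t.size)
                         for f in freqs])
    true_sync, perm_sync = random_pair_control(parents, children)
    print(true_sync > perm_sync)               # -> True
    ```

    If true-dyad synchrony did not exceed the shuffled-pair baseline, the observed coherence could be attributed to stimulus- or task-locked activity common to all participants rather than to genuine dyadic coupling.
    
    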

    Theta- and alpha-band EEG activity in response to eye gaze cues in early infancy

    In order to elucidate the development of how infants use eye gaze as a referential cue, we investigated theta and alpha oscillations in response to object-directed and object-averted eye gaze in infants aged 2, 4, 5, and 9 months. At 2 months of age, no difference between conditions was found. In 4- and 9-month-olds, alpha-band activity desynchronized more in response to faces looking at objects compared to faces looking away from objects. Theta activity in 5-month-old infants differed between conditions, with more theta synchronization for object-averted eye gaze. Whereas alpha desynchronization might reflect mechanisms of early social object learning, theta is proposed to reflect activity in the executive attention network. The interplay between alpha and theta activity represents developmental changes in both kinds of processes during early infancy.
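    Desynchronization in a frequency band is conventionally quantified as the relative band-power change from a baseline period, with negative values indicating desynchronization. A short Python sketch with synthetic data; the 6–9 Hz infant-alpha range and all signal parameters here are illustrative assumptions, not the study's values:

    ```python
    import numpy as np
    from scipy.signal import welch

    def band_power(x, fs, band):
        """Average Welch spectral power of x within a frequency band (Hz)."""
        f, pxx = welch(x, fs=fs, nperseg=fs)   # ~1 Hz frequency resolution
        mask = (f >= band[0]) & (f <= band[1])
        return pxx[mask].mean()

    def erd(event, baseline, fs, band):
        """Event-related (de)synchronization: relative band-power change
        from baseline. Negative values indicate desynchronization."""
        pb = band_power(baseline, fs, band)
        return (band_power(event, fs, band) - pb) / pb

    # Synthetic demo: alpha-range activity is attenuated during the event
    # epoch relative to baseline.
    rng = np.random.default_rng(3)
    fs = 250
    t = np.arange(0, 5, 1 / fs)
    baseline = 2.0 * np.sin(2 * np.pi * 8 * t) + rng.normal(size=t.size)
    event = 0.5 * np.sin(2 * np.pi * 8 * t) + rng.normal(size=t.size)
    print(erd(event, baseline, fs, (6, 9)))    # negative -> desynchronization
    ```

    The same machinery applies to theta by swapping the band limits, which is why studies like this one can report both measures from one recording.
    
    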

    The effects of interaction quality on neural synchrony during mother-child problem solving.

    Understanding others is fundamental to interpersonal coordination and successful cooperation. One mechanism posited to underlie both effective communication and behavioral coordination is interpersonal neural synchrony. Although presumably foundational for children's social development, research on neural synchrony in naturalistic caregiver-child interactions is lacking. Using dual-functional near-infrared spectroscopy (fNIRS), we examined the effects of interaction quality on neural synchrony during a problem-solving task in 42 dyads of mothers and their preschool children. In a cooperation condition, mothers and children were instructed to solve a tangram puzzle together. In an individual condition, mothers and children performed the same task alone with an opaque screen between them. Wavelet transform coherence (WTC) was used to assess the cross-correlation between the two fNIRS time series. Results revealed increased neural synchrony in bilateral prefrontal cortex and temporo-parietal areas during cooperative as compared to individual problem solving. Higher neural synchrony during cooperation correlated with higher behavioral reciprocity, and neural synchrony predicted the dyad's problem-solving success beyond reciprocal behavior between mothers and children. State-like factors, such as maternal stress and child agency during the task, played a greater role in neural synchronization than trait-like factors, such as child temperament. Our results emphasize neural synchrony as a biomarker for mother-child interaction quality. These findings further highlight the role of state-like factors in interpersonal synchronization processes linked to successful coordination with others and, in the long term, may improve our understanding of others.

    Using Augmented Reality Toward Improving Social Skills: Scoping Review

    Background: Augmented reality (AR) has emerged as a promising technology in educational settings owing to its engaging nature. However, apart from applications aimed at the autism spectrum disorder population, the potential of AR in social-emotional learning has received less attention. Objective: This scoping review aims to map the range of AR applications that improve social skills and the characteristics of such applications. Methods: In total, 2 independent researchers screened 2748 records derived from 3 databases in December 2021—PubMed, IEEE Xplore, and ACM Guide to Computing Literature. In addition, the reference lists of all the included records and existing reviews were screened. Records that had developed a prototype with the main outcome of improving social skills were included in the scoping review. Included records were narratively described for their content regarding AR and social skills, their target populations, and their outcomes. Evaluation studies were assessed for methodological quality. Results: A total of 17 records met the inclusion criteria for this study. Overall, 10 records describe applications for children with autism, primarily teaching about reading emotions in facial expressions; 7 records describe applications for a general population, targeting both children and adults, with a diverse range of outcome goals. The methodological quality of evaluation studies was found to be weak. Conclusions: Most applications are designed to be used alone, although AR is well suited to facilitating real-world interactions during a digital experience, including interactions with other people. Therefore, future AR applications could foster social skills in a general population in more complex group settings.

    The role of social signals in segmenting observed actions in eighteen-month-old children.

    Learning about actions requires children to identify the boundaries of an action and its units. Whereas some action units are easily identified, parents can support children's action learning by adjusting the presentation and using social signals. However, little is currently understood about how children use these signals to learn actions. In the current study we investigate the possibility that communicative signals are a particularly suitable cue for segmenting events. We investigated this hypothesis by presenting 18-month-old children (N = 60) with short action sequences consisting of toy animals either hopping or sliding across a board into a house, but interrupting this two-step sequence either (a) using an ostensive signal as a segmentation cue, (b) using a non-ostensive segmentation cue, or (c) without additional segmentation information between the actions. Marking the boundary using communicative signals increased children's imitation of the less salient sliding action. Imitation of the hopping action remained unaffected. Crucially, marking the boundary of both actions with a non-communicative cue did not increase imitation of either action. Communicative signals might be particularly suitable for segmenting non-salient actions that would otherwise be perceived as part of another action or as non-intentional. These results provide evidence of the importance of ostensive signals at event boundaries in scaffolding children's learning.