
    Data-driven approaches in the investigation of social perception

    The complexity of social perception poses a challenge to traditional approaches to understanding its psychological and neurobiological underpinnings. Data-driven methods are particularly well suited to tackling the often high-dimensional nature of the stimulus spaces and neural representations that characterize social perception. Such methods are more exploratory, capitalize on rich and large datasets, and attempt to discover patterns, often without strict hypothesis testing. We present four case studies here: behavioural studies on face judgements, two neuroimaging studies of movies, and eye-tracking studies in autism. We conclude with suggestions for particular topics that seem ripe for data-driven approaches, as well as caveats and limitations.

    Effects of conversation content on viewing dyadic conversations

    People typically follow conversations closely with their gaze. We asked whether this viewing is influenced by what is actually said in the conversation and by the viewer’s psychological condition. We recorded the eye movements of healthy (N = 16) and depressed (N = 25) participants while they were viewing video clips. Each video showed two people, each speaking one line of dialogue about socio-emotionally important (i.e., personal) or unimportant (matter-of-fact) topics. Between the spoken lines, viewers made more saccadic shifts between the discussants and looked more at the second speaker in personal than in matter-of-fact conversations. Higher depression scores were correlated with less looking at the currently speaking discussant. We conclude that subtle social attention dynamics can be detected from eye movements and that these dynamics are sensitive to the observer’s psychological condition, such as depression.
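
    The published abstract does not include analysis code; purely as an illustration, the sketch below shows one generic way to count gaze shifts between two speakers' areas of interest (AOIs) in a fixation log. The data layout and column names ("x", "y") are assumptions, not taken from the study.

        # Minimal sketch (not the authors' pipeline): count fixation-to-fixation
        # transitions between two speakers' AOIs. Assumes a pandas DataFrame of
        # fixations with hypothetical "x" and "y" screen-coordinate columns.
        import pandas as pd

        def in_aoi(x, y, box):
            """box = (x_min, y_min, x_max, y_max) in screen pixels."""
            return box[0] <= x <= box[2] and box[1] <= y <= box[3]

        def count_speaker_shifts(fixations: pd.DataFrame, box_a, box_b) -> int:
            """Number of gaze shifts from one speaker's AOI to the other's."""
            labels = [
                "A" if in_aoi(f["x"], f["y"], box_a)
                else "B" if in_aoi(f["x"], f["y"], box_b)
                else None
                for _, f in fixations.iterrows()
            ]
            on_speakers = [l for l in labels if l is not None]
            return sum(prev != cur for prev, cur in zip(on_speakers, on_speakers[1:]))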

    Mental Action Simulation Synchronizes Action-Observation Circuits across Individuals

    A frontoparietal action–observation network (AON) has been proposed to support understanding others' actions and goals. We show that the AON "ticks together" in human subjects who are sharing a third person's feelings. During functional magnetic resonance imaging, 20 volunteers watched movies depicting boxing matches either passively or while simulating a prespecified boxer's feelings. Instantaneous intersubject phase synchronization (ISPS) was computed to derive multisubject voxelwise similarity of hemodynamic activity and inter-area functional connectivity. During passive viewing, subjects' brain activity was synchronized in sensory projection and posterior temporal cortices. Simulation induced a widespread increase of ISPS in the AON (premotor, posterior parietal, and superior temporal cortices), primary and secondary somatosensory cortices, and the dorsal attention circuits (frontal eye fields, intraparietal sulcus). Moreover, interconnectivity of these regions strengthened during simulation. We propose that sharing a third person's feelings synchronizes the observer's own brain mechanisms supporting sensations and motor planning, thereby likely promoting mutual understanding.
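
    The core measure named in this abstract, instantaneous intersubject phase synchronization (ISPS), can be illustrated with a short generic sketch. This is not the authors' implementation; it assumes band-pass-filtered BOLD time series for a single voxel are already arranged as a subjects-by-time array.

        # Generic ISPS sketch for one voxel (not the authors' code).
        # `signals` is assumed to be an (n_subjects, n_timepoints) array of
        # band-pass-filtered BOLD time series.
        import numpy as np
        from scipy.signal import hilbert

        def isps(signals: np.ndarray) -> np.ndarray:
            """Synchronization over time: 1 = all subjects in phase, 0 = phases uniformly scattered."""
            phases = np.angle(hilbert(signals, axis=1))           # instantaneous phase per subject
            return np.abs(np.mean(np.exp(1j * phases), axis=0))   # length of the mean phase vector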

    What we observe is biased by what other people tell us: beliefs about the reliability of gaze behavior modulate attentional orienting to gaze cues

    For effective social interactions with other people, information about the physical environment must be integrated with information about the interaction partner. In order to achieve this, processing of social information is guided by two components: a bottom-up mechanism reflexively triggered by stimulus-related information in the social scene and a top-down mechanism activated by task-related context information. In the present study, we investigated whether these components interact during attentional orienting to gaze direction. In particular, we examined whether the spatial specificity of gaze cueing is modulated by expectations about the reliability of gaze behavior. Expectations were either induced by instruction or could be derived from experience with the displayed gaze behavior. Spatially specific cueing effects were observed with highly predictive gaze cues, but also when participants merely believed that actually non-predictive cues were highly predictive. Conversely, cueing effects for the whole gazed-at hemifield were observed with non-predictive gaze cues, and spatially specific cueing effects were attenuated when actually predictive gaze cues were believed to be non-predictive. This pattern indicates that (i) information about cue predictivity gained from sampling gaze behavior across social episodes can be incorporated in attentional orienting to social cues, and that (ii) beliefs about gaze behavior modulate attentional orienting to gaze direction even when they contradict information available from social episodes.
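
    As a toy illustration of how such cueing effects are commonly quantified (this is not taken from the study), the sketch below computes the gaze-cueing effect as the mean reaction-time difference between invalidly and validly cued trials, split by the instructed belief about cue predictivity. All column names are hypothetical.

        # Toy sketch: cueing effect = mean RT (invalid trials) - mean RT (valid trials),
        # computed separately for each belief condition. Column names ("rt_ms",
        # "validity", "belief") are hypothetical, not from the study.
        import pandas as pd

        def cueing_effects(trials: pd.DataFrame) -> pd.Series:
            mean_rt = trials.groupby(["belief", "validity"])["rt_ms"].mean().unstack("validity")
            return mean_rt["invalid"] - mean_rt["valid"]   # positive = facilitation by the gaze cue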

    Emotions in a Web-based Learning Environment

    The aim of this thesis was to examine emotions in a web-based learning environment (WBLE). Theoretically, the thesis was grounded in the dimensional model of emotions. Four empirical studies were conducted. Study I focused on students’ anxiety and their self-efficacy in computer-using situations. Studies II and III examined the influence of experienced emotions on students’ collaborative visible and non-collaborative invisible activities and on lurking in a WBLE; Study II also addressed the antecedents of the emotions students experience in such an environment. Study IV concentrated on clarifying the differences between emotions experienced in face-to-face and web-based collaborative learning. The results are reported in four original research articles published in scientific journals.

    The studies demonstrate that emotions are important determinants of student behaviour in web-based learning and justify the conclusion that interactions on the web can and do have an emotional content. The emotions students experience during web-based learning result mostly from social interaction rather than from the technological context; the technology itself is not the only antecedent of students’ emotional reactions in collaborative web-based learning situations, although it did influence students’ behaviour. For example, students’ computer anxiety was associated with negative expectations about the consequences of using technology-based learning environments in their studies. The results also indicated that student behaviours in a WBLE can be divided into three partially overlapping classes: i) collaborative visible activities, ii) non-collaborative invisible activities, and iii) lurking. Moreover, the emotions students experienced during web-based learning affected how actively they participated in these activities. Lurkers in particular, i.e. students who seldom participated in discussions but frequently visited the online environment, experienced more negatively valenced emotions during the courses than the other students did. Such negatively toned experiences may make lurking individuals less eager to participate in other WBLE courses in the future, so future research should examine more precisely the reasons that cause individuals to lurk in online learning groups and the development of learning tasks that do not encourage or permit lurking or inactivity.

    Finally, the study comparing emotional reactions in web-based and face-to-face collaborative learning indicated that learning via web-based communication produced greater affective reactivity: students in the web-based group experienced more intense emotions than students in the face-to-face group. The likely interpretation is that the lack of means for expressing emotional reactions and for perceiving others’ emotions increased affectivity in the web-based learning groups. Such heightened affective reactivity could, for example, impair an individual’s learning performance, especially in complex learning tasks. It is therefore recommended that future studies examine ways of expressing emotions in a text-based web environment, to provide better means for communicating emotions and thereby possibly reduce the high level of affectivity. However, we do not yet know whether means of communicating emotional expressions via the web (for example, “smileys” or “emoticons”) would be beneficial or disadvantageous in formal learning situations. Future studies should therefore also assess how the use of such symbols for expressing emotions in a text-based web environment would affect students’ and teachers’ behaviour and emotional states in web-based learning environments.

    The eye contact effect: mechanisms and development

    The ‘eye contact effect’ is the phenomenon that perceived eye contact with another human face modulates certain aspects of the concurrent and/or immediately following cognitive processing. In addition, functional imaging studies in adults have revealed that eye contact can modulate activity in structures in the social brain network, and developmental studies show evidence for preferential orienting towards, and processing of, faces with direct gaze from early in life. We review different theories of the eye contact effect and advance a ‘fast-track modulator’ model. Specifically, we hypothesize that perceived eye contact is initially detected by a subcortical route, which then modulates the activation of the social brain as it processes the accompanying detailed sensory information.

    Why I tense up when you watch me: inferior parietal cortex mediates an audience’s influence on motor performance

    The presence of an evaluative audience can alter skilled motor performance through changes in force output. To investigate how this is mediated within the brain, we emulated real-time social monitoring of participants’ performance of a fine grip task during functional magnetic resonance neuroimaging. We observed an increase in force output during social evaluation that was accompanied by focal reductions in activity within bilateral inferior parietal cortex. Moreover, deactivation of the left inferior parietal cortex predicted both inter- and intra-individual differences in socially induced change in grip force. Social evaluation also enhanced activation within the posterior superior temporal sulcus, which conveys visual information about others’ actions to the inferior parietal cortex. Interestingly, functional connectivity between these two regions was attenuated by social evaluation. Our data suggest that social evaluation can alter force output through the altered engagement of inferior parietal cortex, a region implicated in the sensorimotor integration necessary for object manipulation and a component of the action-observation network, which integrates and facilitates performance of observed actions. Social-evaluative situations may induce high-level representational incoherence between one’s own intended action and the perceived intention of others which, by uncoupling the dynamics of sensorimotor facilitation, could ultimately perturb motor output.
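
    The connectivity result can be pictured with a generic sketch (not the study's analysis): correlate the posterior superior temporal sulcus and inferior parietal ROI time series separately within socially evaluated and non-evaluated blocks and compare the coefficients. The variable names below are assumptions.

        # Generic sketch of condition-wise ROI-to-ROI functional connectivity
        # (not the authors' pipeline). `psts` and `ipl` are assumed 1-D ROI time
        # series; `evaluated` is a boolean mask over volumes acquired during
        # social evaluation.
        import numpy as np

        def connectivity_by_condition(psts, ipl, evaluated):
            r_eval   = np.corrcoef(psts[evaluated],  ipl[evaluated])[0, 1]
            r_noeval = np.corrcoef(psts[~evaluated], ipl[~evaluated])[0, 1]
            return r_eval, r_noeval   # attenuation would appear as r_eval < r_noeval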