263 research outputs found

    Auditory affective processing requires awareness

    Recent work has challenged the previously widely accepted belief that affective processing does not require awareness and can be carried out with more limited resources than semantic processing. This debate has focused exclusively on visual perception, even though evidence from both human and animal studies suggests that nonconscious affective processing would be physiologically more feasible in the auditory system. Here we contrast affective and semantic processing of nonverbal emotional vocalizations under different levels of awareness in three experiments, using explicit (two-alternative forced-choice masked affective and semantic categorization tasks, Experiments 1 and 2) and implicit (masked affective and semantic priming, Experiment 3) measures. Identical stimuli and design were used in the semantic and affective tasks. Awareness was manipulated by altering the stimulus-mask signal-to-noise ratio during continuous auditory masking. Stimulus awareness was measured on each trial using a four-point perceptual awareness scale. In the explicit tasks, neither affective nor semantic categorization could be performed in the complete absence of awareness, while both tasks could be performed above chance level when stimuli were consciously perceived. Semantic categorization was faster than affective evaluation. When the stimuli were partially perceived, semantic categorization accuracy exceeded affective evaluation accuracy. In the implicit tasks, neither affective nor semantic priming occurred in the complete absence of awareness, whereas both affective and semantic priming emerged when participants were aware of the primes. We conclude that auditory semantic processing is faster than affective processing, and that both affective and semantic auditory processing are dependent on awareness.
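    The stimulus-mask signal-to-noise manipulation described in this abstract can be illustrated numerically. Below is a minimal sketch, assuming white-noise signals and a dB-scaled SNR; the function name and all values are hypothetical and not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(3)

def mask_stimulus(stimulus, mask, snr_db):
    """Mix a target vocalization into continuous noise at a given SNR (toy sketch)."""
    p_s = np.mean(stimulus ** 2)
    p_m = np.mean(mask ** 2)
    # Scale the mask so that 10*log10(p_s / p_mask_scaled) equals snr_db
    gain = np.sqrt(p_s / (p_m * 10 ** (snr_db / 10)))
    return stimulus + gain * mask

stimulus = rng.standard_normal(1000)
mask = rng.standard_normal(1000)
mixed = mask_stimulus(stimulus, mask, snr_db=-10.0)  # mask energy 10 dB above the stimulus
```

Lowering `snr_db` buries the stimulus deeper in the mask, which is the lever the experiments use to move participants along the perceptual awareness scale.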

    Carnal pleasures

    Pleasures are tightly intertwined with the body. Enjoyment derived from sex, feeding and social touch originates from somatosensory and gustatory processing, and pleasant emotions also markedly influence bodily states tied to the reproductive, digestive, skeletomuscular, and endocrine systems. Here, we review recent research on bodily pleasures, focussing on consummatory sensory pleasures. We discuss how different pleasures have distinct sensory inputs and behavioural outputs and review the data on the role of the somatosensory and interoceptive systems in social bonding. Finally, we review the role of gustatory pleasures in feeding and obesity, and discuss the underlying pathophysiological mechanisms. We conclude that different pleasures have distinct inputs and specific outputs, and that their regulatory functions should be understood in light of these specific profiles in addition to generic reward mechanisms.

    Connectivity Analysis Reveals a Cortical Network for Eye Gaze Perception

    Haxby et al. (Haxby JV, Hoffman EA, Gobbini MI. 2000. The distributed human neural system for face perception. Trends Cogn Sci. 4:223–233.) proposed that eye gaze processing results from an interaction between a “core” face-specific system involved in visual analysis and an “extended” system involved more generally in spatial attention. However, the full gaze perception network has remained poorly specified. In the context of a functional magnetic resonance imaging study, we used psychophysiological interactions (PPIs) to identify brain regions that showed differential connectivity (correlation) with core face perception structures (posterior superior temporal sulcus [pSTS] and fusiform gyrus [FG]) when viewing gaze shifts relative to control eye movements (opening/closing the eyes). The PPIs identified altered connectivity between the pSTS and MT/V5, intraparietal sulcus, frontal eye fields, superior temporal gyrus (STG), supramarginal gyrus, and middle frontal gyrus (MFG). The FG showed altered connectivity with the same areas of the STG and MFG, demonstrating the contribution of both dorsal and ventral core face areas to gaze perception. We propose that this network provides an interactive system that alerts us to seen changes in other agents’ gaze direction, makes us aware of their altered focus of spatial attention, and prepares a corresponding shift in our own attention.
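    A PPI analysis tests whether the coupling between a seed region (e.g. the pSTS) and a target region changes with the psychological condition, by regressing the target time course on the seed, the task, and their product. The sketch below demonstrates the logic on synthetic time series; all signal names, coefficients, and noise levels are illustrative, not from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200  # number of fMRI volumes (illustrative)

seed = rng.standard_normal(n)         # seed time course (e.g. pSTS)
task = np.repeat([0.0, 1.0], n // 2)  # psychological regressor: gaze-shift blocks

# Simulate a target region whose coupling with the seed strengthens during the task
target = 0.5 * seed + 1.0 * seed * task + 0.1 * rng.standard_normal(n)

# PPI design matrix: intercept, seed (physiological), task (psychological),
# and their interaction (the PPI regressor of interest)
X = np.column_stack([np.ones(n), seed, task, seed * task])
beta, *_ = np.linalg.lstsq(X, target, rcond=None)

print(round(beta[3], 2))  # interaction weight: the task-dependent change in coupling (≈ 1.0 here)
```

A reliably nonzero interaction weight is what the study reports as "altered connectivity" between a core face area and a target region during gaze shifts.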

    Eye Contact Judgment Is Influenced by Perceivers' Social Anxiety But Not by Their Affective State

    Fast and accurate judgment of whether another person is making eye contact or not is crucial for our social interaction. As affective states have been shown to influence social perceptions and judgments, we investigated the influence of observers' own affective states and trait anxiety on their eye contact judgments. In two experiments, participants were required to judge whether animated faces (Experiment 1) and real faces (Experiment 2) with varying gaze angles were looking at them or not. Participants performed the task in pleasant, neutral, and unpleasant odor conditions. The results from the two experiments showed that eye contact judgments were not modulated by observers' affective state, yet participants with higher levels of social anxiety accepted a wider range of gaze deviations from the direct gaze as eye contact. We conclude that gaze direction judgments depend on individual differences in affective predispositions, yet they are not amenable to situational affective influences.
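    The "wider range of gaze deviations accepted as eye contact" is commonly quantified as a cone of direct gaze: the angular range over which the observer judges the face to be looking at them at least half the time. A toy sketch with made-up response proportions (the angles and rates below are invented for illustration):

```python
# "Looking at me?" response rates as a function of gaze angle (degrees);
# all values are illustrative placeholders, not data from the study.
angles = [-8, -6, -4, -2, 0, 2, 4, 6, 8]
p_contact = [0.05, 0.2, 0.6, 0.9, 1.0, 0.9, 0.55, 0.15, 0.05]

# Cone of direct gaze: span of angles judged as eye contact on >= 50% of trials
accepted = [a for a, p in zip(angles, p_contact) if p >= 0.5]
cone_width = max(accepted) - min(accepted)

print(cone_width)  # 8
```

Under this measure, the study's finding amounts to a wider `cone_width` for observers with higher social anxiety, with no shift of the curve across odor conditions.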

    Maps of subjective feelings

    Subjective feelings are a central feature of human life. We defined the organization and determinants of a feeling space involving 100 core feelings that ranged from cognitive and affective processes to somatic sensations and common illnesses. The feeling space was determined by a combination of basic dimension rating, similarity mapping, bodily sensation mapping, and neuroimaging meta-analysis. A total of 1,026 participants took part in online surveys where we assessed (i) for each feeling, the intensity of four hypothesized basic dimensions (mental experience, bodily sensation, emotion, and controllability), (ii) subjectively experienced similarity of the 100 feelings, and (iii) topography of bodily sensations associated with each feeling. Neural similarity between a subset of the feeling states was derived from the NeuroSynth meta-analysis database based on the data from 9,821 brain-imaging studies. All feelings were emotionally valenced and the saliency of bodily sensations correlated with the saliency of mental experiences associated with each feeling. Nonlinear dimensionality reduction revealed five feeling clusters: positive emotions, negative emotions, cognitive processes, somatic states and illnesses, and homeostatic states. Organization of the feeling space was best explained by basic dimensions of emotional valence, mental experiences, and bodily sensations. Subjectively felt similarity of feelings was associated with basic feeling dimensions and the topography of the corresponding bodily sensations. These findings reveal a map of subjective feelings that are categorical, emotional, and embodied.
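    The similarity-mapping step — embedding the 100 feelings in a low-dimensional space from their pairwise dissimilarities — can be sketched with classical multidimensional scaling. Note that the study itself used nonlinear dimensionality reduction; this linear sketch only illustrates the general idea, and the ratings below are random placeholders:

```python
import numpy as np

rng = np.random.default_rng(1)

# Placeholder ratings: 100 "feelings" scored on 4 hypothesized basic dimensions
# (mental experience, bodily sensation, emotion, controllability) -- illustrative only.
ratings = rng.uniform(0, 10, size=(100, 4))

# Pairwise dissimilarities between feelings
diff = ratings[:, None, :] - ratings[None, :, :]
D = np.sqrt((diff ** 2).sum(-1))

# Classical MDS: double-centre the squared distances, then eigendecompose
J = np.eye(100) - np.ones((100, 100)) / 100
B = -0.5 * J @ (D ** 2) @ J
vals, vecs = np.linalg.eigh(B)
order = np.argsort(vals)[::-1]                          # eigenvalues, largest first
coords = vecs[:, order[:2]] * np.sqrt(vals[order[:2]])  # 2-D "feeling space" map

print(coords.shape)  # (100, 2)
```

Clustering the resulting coordinates (or the raw ratings) is then what yields discrete feeling categories such as the five clusters reported in the abstract.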

    Decoding brain basis of laughter and crying in natural scenes

    Laughter and crying are universal signals of prosociality and distress, respectively. Here we investigated the functional brain basis of perceiving laughter and crying using a naturalistic functional magnetic resonance imaging (fMRI) approach. We measured haemodynamic brain activity evoked by laughter and crying in three experiments with 100 subjects in each. The subjects i) viewed a 20-minute medley of short video clips, ii) viewed 30 min of a full-length feature film, and iii) listened to 13.5 min of a radio play, all of which contained bursts of laughter and crying. Intensity of laughing and crying in the videos and radio play was annotated by independent observers, and the resulting time series were used to predict hemodynamic activity to laughter and crying episodes. Multivariate pattern analysis (MVPA) was used to test for regional selectivity in laughter- and crying-evoked activations. Laughter induced widespread activity in ventral visual cortex and superior and middle temporal and motor cortices. Crying activated the thalamus, cingulate cortex along the anterior-posterior axis, insula and orbitofrontal cortex. Both laughter and crying could be decoded accurately (66–77% depending on the experiment) from the BOLD signal, and the voxels contributing most significantly to classification were in superior temporal cortex. These results suggest that perceiving laughter and crying engages distinct neural networks, whose activity suppresses each other to manage appropriate behavioral responses to others’ bonding and distress signals.
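    The MVPA decoding reported above boils down to training a classifier to tell laughter trials from crying trials using voxel-wise activity patterns, and reporting cross-validated accuracy against the 50% chance level. A self-contained sketch on synthetic "BOLD patterns" (trial counts, voxel counts, and effect sizes are invented; this is not the study's pipeline):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)

# Synthetic patterns: 60 laughter and 60 crying trials over 50 "voxels",
# with a mean shift in a subset of voxels standing in for regional selectivity.
n_trials, n_voxels = 60, 50
laughter = rng.standard_normal((n_trials, n_voxels))
crying = rng.standard_normal((n_trials, n_voxels))
crying[:, :10] += 1.0  # class-specific signal in 10 voxels

X = np.vstack([laughter, crying])
y = np.array([0] * n_trials + [1] * n_trials)

# Cross-validated decoding accuracy, the usual MVPA summary statistic
acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()
print(acc > 0.5)  # decodable well above the 50% chance level
```

Inspecting the fitted classifier weights is the analogue of the study's step of locating the voxels that contribute most to the laughter/crying classification.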

    Neurons in the human amygdala encode face identity, but not gaze direction

    The amygdala is important for face processing, and direction of eye gaze is one of the most socially salient facial signals. Recording from over 200 neurons in the amygdala of neurosurgical patients, we found robust encoding of the identity of neutral-expression faces, but not of their direction of gaze. Processing of gaze direction may rely on a predominantly cortical network rather than the amygdala.

    What we observe is biased by what other people tell us: beliefs about the reliability of gaze behavior modulate attentional orienting to gaze cues

    For effective social interactions with other people, information about the physical environment must be integrated with information about the interaction partner. In order to achieve this, processing of social information is guided by two components: a bottom-up mechanism reflexively triggered by stimulus-related information in the social scene and a top-down mechanism activated by task-related context information. In the present study, we investigated whether these components interact during attentional orienting to gaze direction. In particular, we examined whether the spatial specificity of gaze cueing is modulated by expectations about the reliability of gaze behavior. Expectations were either induced by instruction or could be derived from experience with the displayed gaze behavior. Spatially specific cueing effects were observed with highly predictive gaze cues, but also when participants merely believed that actually non-predictive cues were highly predictive. Conversely, cueing effects for the whole gazed-at hemifield were observed with non-predictive gaze cues, and spatially specific cueing effects were attenuated when actually predictive gaze cues were believed to be non-predictive. This pattern indicates that (i) information about cue predictivity gained from sampling gaze behavior across social episodes can be incorporated into attentional orienting to social cues, and that (ii) beliefs about gaze behavior modulate attentional orienting to gaze direction even when they contradict information available from social episodes.
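    The two kinds of cueing effects contrasted in this abstract — spatially specific versus hemifield-wide — are typically quantified as reaction-time differences between target locations. A toy sketch with invented reaction times (the numbers are placeholders, not data from the study):

```python
# Mean reaction times (ms) for targets at the exact gazed-at location,
# elsewhere within the gazed-at hemifield, and in the opposite hemifield.
# All values are illustrative placeholders.
rt = {"cued_location": 320.0, "cued_hemifield": 340.0, "uncued_hemifield": 360.0}

# Hemifield-wide cueing effect: opposite vs. gazed-at hemifield
hemifield_effect = rt["uncued_hemifield"] - rt["cued_hemifield"]

# Spatially specific effect: advantage for the exact gazed-at location
# over other locations within the same hemifield
specific_effect = rt["cued_hemifield"] - rt["cued_location"]

print(hemifield_effect, specific_effect)  # 20.0 20.0
```

In these terms, the study's finding is that `specific_effect` shrinks toward zero when predictive cues are believed to be non-predictive, while `hemifield_effect` persists with non-predictive cues.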