
    Unseen Affective Faces Influence Person Perception Judgments in Schizophrenia.

    To demonstrate the influence of unconscious affective processing on consciously processed information among people with and without schizophrenia, we used a continuous flash suppression (CFS) paradigm to examine whether early and rapid processing of affective information influences first impressions of structurally neutral faces. People with and without schizophrenia rated visible neutral faces as more or less trustworthy, warm, and competent when paired with unseen smiling or scowling faces than when paired with unseen neutral faces. Yet people with schizophrenia also exhibited a deficit in explicit affect perception. These findings indicate that early processing of affective information is intact in schizophrenia, but that the integration of this information with semantic context is problematic. Furthermore, people with schizophrenia who were more influenced by smiling faces presented outside awareness reported experiencing more anticipatory pleasure, suggesting that the ability to rapidly process affective information is important for the anticipation of future pleasurable events.

    Situating emotional experience

    Psychological construction approaches to emotion suggest that emotional experience is situated and dynamic. Fear, for example, is typically studied in a physical danger context (e.g., threatening snake), but in the real world, it often occurs in social contexts, especially those involving social evaluation (e.g., public speaking). Understanding situated emotional experience is critical because adaptive responding is guided by situational context (e.g., inferring the intention of another in a social evaluation situation vs. monitoring the environment in a physical danger situation). In an fMRI study, we assessed situated emotional experience using a newly developed paradigm in which participants vividly imagine different scenarios from a first-person perspective, in this case scenarios involving either social evaluation or physical danger. We hypothesized that distributed neural patterns would underlie immersion in social evaluation and physical danger situations, with shared activity patterns across both situations in multiple sensory modalities and in circuitry involved in integrating salient sensory information, and with unique activity patterns for each situation type in coordinated large-scale networks that reflect situated responding. More specifically, we predicted that networks underlying the social inference and mentalizing involved in responding to a social threat (in regions that make up the “default mode” network) would be reliably more active during social evaluation situations. In contrast, networks underlying the visuospatial attention and action planning involved in responding to a physical threat would be reliably more active during physical danger situations. The results supported these hypotheses. In line with emerging psychological construction approaches, the findings suggest that coordinated brain networks offer a systematic way to interpret the distributed patterns that underlie the diverse situational contexts characterizing emotional life.

    Primary interoceptive cortex activity during simulated experiences of the body

    Studies of the classic exteroceptive sensory systems (e.g., vision, touch) consistently demonstrate that vividly imagining a sensory experience of the world – simulating it – is associated with increased activity in the corresponding primary sensory cortex. We hypothesized, analogously, that simulating internal bodily sensations would be associated with increased neural activity in primary interoceptive cortex. An immersive, language-based mental imagery paradigm was used to test this hypothesis (e.g., imagine your heart pounding during a roller coaster ride, your face drenched in sweat during a workout). During two neuroimaging experiments, participants listened to vividly described situations and imagined “being there” in each scenario. In Study 1, we observed significantly heightened activity in primary interoceptive cortex (of dorsal posterior insula) during imagined experiences involving vivid internal sensations. This effect was specific to interoceptive simulation: it was not observed during a separate affect focus condition in Study 1, nor during an independent Study 2 that did not involve detailed simulation of internal sensations (instead involving simulation of other sensory experiences). These findings underscore the large-scale predictive architecture of the brain and reveal that words can be powerful drivers of bodily experiences.

    Micro-Valences: Perceiving Affective Valence in Everyday Objects

    Perceiving the affective valence of objects influences how we think about and react to the world around us. Conversely, the speed and quality with which we visually recognize objects in a visual scene can vary dramatically depending on that scene’s affective content. Although typical visual scenes contain mostly “everyday” objects, affect perception in visual objects has been studied using somewhat atypical stimuli with strong affective valences (e.g., guns or roses). Here we explore whether affective valence must be strong or overt to exert an effect on our visual perception. We conclude that everyday objects carry subtle affective valences – “micro-valences” – which are intrinsic to their perceptual representation.

    A functional architecture of the human brain: emerging insights from the science of emotion

    The ‘faculty psychology’ approach to the mind, which attempts to explain mental function in terms of categories that reflect modular ‘faculties’, such as emotions, cognitions, and perceptions, has dominated research into the mind and its physical correlates. In this paper, we argue that brain organization does not respect the commonsense categories belonging to the faculty psychology approach. We review recent research from the science of emotion demonstrating that the human brain contains broadly distributed functional networks that can each be re-described as basic psychological operations that interact to produce a range of mental states, including, but not limited to, anger, sadness, fear, and disgust. When compared to the faculty psychology approach, this ‘constructionist’ approach provides an alternative functional architecture to guide the design and interpretation of experiments in cognitive neuroscience.

    Amygdala and fusiform gyrus temporal dynamics: Responses to negative facial expressions

    Background: The amygdala habituates in response to repeated human facial expressions; however, it is unclear whether this brain region also habituates to schematic faces (i.e., simple line drawings or caricatures of faces). Using an fMRI block design, 16 healthy participants passively viewed repeated presentations of schematic and human neutral and negative facial expressions. Percent signal changes within anatomic regions of interest (amygdala and fusiform gyrus) were calculated to examine the temporal dynamics of the neural response and any response differences based on face type. Results: The amygdala and fusiform gyrus showed a within-run "U" response pattern of activity to facial expression blocks: the initial block within each run elicited the greatest activation (relative to baseline), and the final block elicited greater activation than the preceding block. No significant differences between schematic and human faces were detected in the amygdala or fusiform gyrus. Conclusion: The "U" pattern of response in the amygdala and fusiform gyrus to facial expressions suggests an initial orienting response, habituation, and then recovery of activation in these regions. Furthermore, this study is the first to directly compare brain responses to schematic and human facial expressions, and the similarity in responses suggests that schematic faces may be useful in studying amygdala activation.
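
    The percent-signal-change measure used above is a standard ROI summary: it compares the mean BOLD signal during a task block against a baseline period. The sketch below is a rough illustration only, not the authors' actual pipeline, and assumes hypothetical ROI-averaged signal arrays.

    import numpy as np

    def percent_signal_change(task_signal, baseline_signal):
        # Percent signal change for one ROI: 100 * (task mean - baseline mean) / baseline mean.
        # Inputs are hypothetical 1-D arrays of ROI-averaged BOLD values; the study's
        # preprocessing (motion correction, drift removal, etc.) is not reproduced here.
        task_mean = np.mean(task_signal)
        baseline_mean = np.mean(baseline_signal)
        return 100.0 * (task_mean - baseline_mean) / baseline_mean

    # Example: compare one facial-expression block against a fixation baseline.
    baseline = np.array([100.2, 99.8, 100.1, 100.0])
    block_1 = np.array([101.5, 101.9, 101.7, 101.6])
    print(percent_signal_change(block_1, baseline))  # about a 1.6% signal change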

    When Words Hurt: Affective Word Use in Daily News Coverage Impacts Mental Health

    Media exposure influences mental health symptomatology in response to salient aversive events, like terrorist attacks, but little has been done to explore the impact of news coverage that varies more subtly in affective content. Here, we utilized an existing data set in which participants self-reported physical symptoms, depressive symptoms, and anxiety symptoms, and completed a potentiated startle task assessing their physiological reactivity to aversive stimuli at three time points (waves) over a 9-month period. Using a computational linguistics approach, we then calculated an average ratio of words with positive vs. negative affective connotations for only those articles from news sources to which each participant self-reported being exposed over the prior 2 weeks at each wave of data collection. As hypothesized, individuals exposed to news coverage with more negative affective tone over the prior 2 weeks reported significantly greater physical and depressive symptoms, and had significantly greater physiological reactivity to aversive stimuli.
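
    The affective-tone measure described above reduces to a ratio of positive to negative words per article, averaged over the articles a participant reported reading. The following is a minimal sketch assuming a hypothetical toy valence lexicon; the study's actual computational linguistics tool and word lists are not specified here.

    import re
    from collections import Counter

    # Hypothetical toy lexicon; a real analysis would use a validated affective word list.
    VALENCE = {"joy": "positive", "hope": "positive", "calm": "positive",
               "attack": "negative", "fear": "negative", "loss": "negative"}

    def affective_tone_ratio(article_text):
        # Ratio of positive to negative words in one article (illustrative only).
        words = re.findall(r"[a-z']+", article_text.lower())
        counts = Counter(VALENCE[w] for w in words if w in VALENCE)
        return counts["positive"] / max(counts["negative"], 1)  # guard against divide-by-zero

    # Average the per-article ratio over the articles a participant read in the prior 2 weeks.
    articles = ["Markets calm as fear of loss recedes", "Attack fuels fear across the city"]
    print(sum(affective_tone_ratio(a) for a in articles) / len(articles))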

    Emotional expressions reconsidered: challenges to inferring emotion from human facial movements

    It is commonly assumed that a person’s emotional state can be readily inferred from his or her facial movements, typically called emotional expressions or facial expressions. This assumption influences legal judgments, policy decisions, national security protocols, and educational practices; guides the diagnosis and treatment of psychiatric illness, as well as the development of commercial applications; and pervades everyday social interactions as well as research in other scientific fields such as artificial intelligence, neuroscience, and computer vision. In this article, we survey examples of this widespread assumption, which we refer to as the common view, and we then examine the scientific evidence that tests this view, focusing on the six most popular emotion categories used by consumers of emotion research: anger, disgust, fear, happiness, sadness, and surprise. The available scientific evidence suggests that people do sometimes smile when happy, frown when sad, scowl when angry, and so on, as proposed by the common view, more than would be expected by chance. Yet how people communicate anger, disgust, fear, happiness, sadness, and surprise varies substantially across cultures, situations, and even across people within a single situation. Furthermore, similar configurations of facial movements variably express instances of more than one emotion category. In fact, a given configuration of facial movements, such as a scowl, often communicates something other than an emotional state. Scientists agree that facial movements convey a range of information and are important for social communication, emotional or otherwise. But our review suggests an urgent need for research that examines how people actually move their faces to express emotions and other social information in the variety of contexts that make up everyday life, as well as careful study of the mechanisms by which people perceive instances of emotion in one another. We make specific research recommendations that will yield a more valid picture of how people move their faces to express emotions and how they infer emotional meaning from facial movements in situations of everyday life. This research is crucial to provide consumers of emotion research with the translational information they require.