Human Amygdala in Sensory and Attentional Unawareness: Neural Pathways and Behavioural Outcomes
One of the neural structures most often implicated in the processing of emotional signals in the absence of visual awareness is the amygdala. In this chapter, we review current evidence from human neuroscience, in healthy participants and brain-damaged patients, on the role of the amygdala during non-conscious (visual) perception of emotional stimuli. Nevertheless, there is as yet no consensus on the limits and conditions that affect the extent of the amygdala's response without focused attention or awareness. We propose to distinguish between attentional unawareness, a condition wherein the stimulus is potentially accessible to enter visual awareness but fails to do so because attention is diverted, and sensory unawareness, in which the stimulus fails to enter awareness because its normal processing in the visual cortex is suppressed. Within this conceptual framework, some of the apparently contradictory findings gain new coherence and converge on the role of the amygdala in supporting different types of non-conscious emotion processing. Amygdala responses in the absence of awareness are linked to different functional mechanisms and are driven by more complex neural networks than commonly assumed. Acknowledging this complexity can help foster new studies on amygdala functions without awareness and their impact on human behaviour.
Amygdala response to emotional stimuli without awareness: Facts and interpretations
Over the past two decades, evidence has accumulated that the human amygdala exerts some of its functions even when the observer is not aware of the content, or even the presence, of the triggering emotional stimulus. Nevertheless, there is as yet no consensus on the limits and conditions that affect the extent of the amygdala's response without focused attention or awareness. Here we review past and recent studies on this subject, examining the neuroimaging literature on healthy participants as well as brain-damaged patients, and we comment on their strengths and limits. We propose a theoretical distinction between processes involved in attentional unawareness, wherein the stimulus is potentially accessible to enter visual awareness but fails to do so because attention is diverted, and in sensory unawareness, wherein the stimulus fails to enter awareness because its normal processing in the visual cortex is suppressed. We argue that this distinction, along with data sampling amygdala responses at high temporal resolution, helps to appreciate the multiplicity of functional and anatomical mechanisms centered on the amygdala and supporting its role in non-conscious emotion processing. Separate, but interacting, networks relay visual information to the amygdala, exploiting the different computational properties of subcortical and cortical routes and thereby supporting amygdala functions at different stages of emotion processing. This view reconciles some apparent contradictions in the literature, as well as seemingly contrasting proposals such as the dual-stage and dual-route models. We conclude that the evidence in favor of amygdala responses without awareness is solid, albeit such responses originate from different functional mechanisms and are driven by more complex neural networks than commonly assumed. Acknowledging the complexity of these mechanisms can foster new insights into the varieties of amygdala functions without awareness and their impact on human behavior.
The neuroethology of spontaneous mimicry and emotional contagion in human and non-human animals
Spontaneous mimicry appears fundamental to emotional perception and contagion, especially when it involves facial emotional expressions. Here we cover recent evidence on spontaneous mimicry from ethology, psychology and neuroscience, in non-human and human animals. We first consider how mimicry unfolds in non-human animals (particularly primates) and how it relates to emotional contagion. We focus on two forms of mimicry-related phenomena: facial mimicry and yawn contagion, which are largely conserved across mammals and useful for drawing evolutionary scenarios. Next, we expand on the psychological evidence from humans that bears on current theoretical debates and also informs non-human animal research. Finally, we cover the neural bases of facial mimicry and yawn contagion. We move beyond the perception/expression/experience trichotomy and from correlational to causal evidence linking facial mimicry to emotional contagion, presenting findings from neuroimaging, direct manipulation, neuro-stimulation and lesion studies. In conclusion, this review proposes a bottom-up, multidisciplinary approach to the study of spontaneous mimicry that accounts for the evolutionary continuity linking non-human and human animals.
A subcortical network for implicit visuo-spatial attention: Implications for Parkinson's disease
Recent studies in humans and animal models suggest a primary role of the basal ganglia in the extraction of stimulus-value regularities, which are then exploited to orient attentional shifts and build up sensorimotor memories. The tail of the caudate and the posterior putamen both receive early visual input from the superficial layers of the superior colliculus, thus forming a closed loop. We contend that the functional value of this circuit is to manage the selection of visual stimuli in a rapid and automatic way, once sensorimotor associations are formed and stored in the posterior striatum. In Parkinson's disease, nigrostriatal dopamine depletion starts in, and tends to be more pronounced in, the posterior putamen. Thus, at least some aspects of the visuospatial attention deficits observed from the early stages of the disease could be the behavioral consequences of a cognitive system that has lost the ability to translate high-level processing into stable sensorimotor memories.
From affective blindsight to emotional consciousness
Following destruction or denervation of the primary visual cortex (V1), cortical blindness ensues. Affective blindsight refers to the uncanny ability of such patients to respond correctly, or above chance level, to visual emotional expressions presented to their blind fields. Fifteen years after its original discovery, affective blindsight still fascinates neuroscientists and philosophers alike, as it offers a unique window on the vestigial properties of our visual system that, though present in the intact brain, tend to be unnoticed or even actively inhibited by conscious processes. Here we review available studies on affective blindsight with the intent to clarify its functional properties, neural bases and theoretical implications. Evidence converges on the role of subcortical structures of old evolutionary origin, such as the superior colliculus, the pulvinar and the amygdala, in mediating affective blindsight and non-conscious perception of emotions. We conclude that approaching consciousness, and its absence, from the vantage point of emotion processing may uncover important relations between the two phenomena, as consciousness may have evolved as a specialization for interacting with others and becoming aware of their social and emotional expressions.
A deep neural network model of the primate superior colliculus for emotion recognition
Although sensory processing is pivotal to nearly every theory of emotion, the evaluation of the visual input as ‘emotional’ (e.g. a smile as signalling happiness) has been traditionally assumed to take place in supramodal ‘limbic’ brain regions. Accordingly, subcortical structures of ancient evolutionary origin that receive direct input from the retina, such as the superior colliculus (SC), are traditionally conceptualized as passive relay centres. However, mounting evidence suggests that the SC is endowed with the necessary infrastructure and computational capabilities for the innate recognition and initial categorization of emotionally salient features from retinal information. Here, we built a neurobiologically inspired convolutional deep neural network (DNN) model that approximates physiological, anatomical and connectional properties of the retino-collicular circuit. This enabled us to characterize and isolate the initial computations and discriminations that the DNN model of the SC can perform on facial expressions, based uniquely on the information it directly receives from the virtual retina. Trained to discriminate facial expressions of basic emotions, our model matches human error patterns and above chance, yet suboptimal, classification accuracy analogous to that reported in patients with V1 damage, who rely on retino-collicular pathways for non-conscious vision of emotional attributes. When presented with gratings of different spatial frequencies and orientations never ‘seen’ before, the SC model exhibits spontaneous tuning to low spatial frequencies and reduced orientation discrimination, as can be expected from the prevalence of the magnocellular (M) over parvocellular (P) projections. 
Likewise, face manipulations that bias processing towards the M or P pathway affect expression recognition in the SC model accordingly, an effect that dovetails with variations of activity in the human SC measured purposely with ultra-high-field functional magnetic resonance imaging. Lastly, the DNN generates saliency maps and extracts visual features, demonstrating that certain face parts, like the mouth or the eyes, provide higher discriminative information than others as a function of emotional expression (e.g. happiness versus sadness). The present findings support the contention that the SC possesses the necessary infrastructure to analyse the visual features that define facial emotional stimuli, even without additional processing stages in the visual cortex or in 'limbic' areas.
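The key architectural choice described above, a front end dominated by magnocellular (low-spatial-frequency) input feeding a compact read-out, can be illustrated in miniature. The following is a hedged sketch, not the authors' actual model: it uses plain numpy, a Gaussian low-pass stage standing in for the coarse retino-collicular channel, and a random linear read-out over hypothetical emotion classes, purely to show the information flow.

```python
import numpy as np

def gaussian_kernel(size=7, sigma=2.0):
    # 1-D Gaussian, normalized; crudely models the low-spatial-frequency
    # tuning attributed to the magnocellular-dominated collicular input
    x = np.arange(size) - size // 2
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def lowpass_retina(img, size=7, sigma=2.0):
    # Separable blur: filter each row, then each column ("same" padding)
    k = gaussian_kernel(size, sigma)
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, rows)

def sc_forward(img, weights, biases):
    # Toy "SC" stage: blur, flatten, linear read-out, softmax over classes.
    # The real model is a trained convolutional network; this only mirrors
    # the blur-then-classify structure of the pipeline.
    x = lowpass_retina(img).ravel()
    logits = weights @ x + biases
    e = np.exp(logits - logits.max())
    return e / e.sum()

rng = np.random.default_rng(0)
face = rng.random((32, 32))            # stand-in for a retinal image
n_classes = 6                          # six basic emotions (assumption)
W = rng.normal(scale=0.01, size=(n_classes, 32 * 32))
b = np.zeros(n_classes)
probs = sc_forward(face, W, b)         # probability over expression classes
```

Because the blur discards high spatial frequencies before any classification happens, fine detail (e.g. wrinkle patterns) cannot influence the read-out, which is the same reason the SC model in the abstract shows reduced orientation discrimination and spontaneous low-frequency tuning.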