Dominance attributions following damage to the ventromedial prefrontal cortex
Damage to the human ventromedial prefrontal cortex (VM) can result in dramatic and maladaptive changes in social behavior despite preservation of most other cognitive abilities. One important aspect of social cognition is the ability to detect social dominance, the process of inferring another person's relative standing in the social world from particular social signals. To test the role of the VM in making attributions of social dominance, we designed two experiments: one requiring dominance judgments from static pictures of faces, the second requiring dominance judgments from film clips. We tested three demographically matched groups of subjects: subjects with focal lesions in the VM (n=15), brain-damaged comparison subjects with lesions excluding the VM (n=11), and a reference group of normal individuals with no history of neurological disease (n=32). Contrary to our expectation, we found that subjects with VM lesions gave dominance judgments on both tasks that did not differ significantly from those given by the other groups. Despite their grossly normal performance, however, subjects with VM lesions showed more subtle impairments specifically when judging static faces: they were less discriminative in their dominance judgments, and did not appear to make normal use of the gender and age of the faces in forming their judgments. The findings suggest that, in the laboratory tasks we used, damage to the VM does not necessarily impair judgments of social dominance, although it appears to result in alterations in strategy that might translate into behavioral impairments in real life.
Panic Anxiety in Humans with Bilateral Amygdala Lesions: Pharmacological Induction via Cardiorespiratory Interoceptive Pathways
We previously demonstrated that carbon dioxide inhalation could induce panic anxiety in a group of rare lesion patients with focal bilateral amygdala damage. To further elucidate the amygdala-independent mechanisms leading to aversive emotional experiences, we retested two of these patients (B.G. and A.M.) to examine whether triggering palpitations and dyspnea via stimulation of non-chemosensory interoceptive channels would be sufficient to elicit panic anxiety. Participants rated their affective and sensory experiences following bolus infusions of either isoproterenol, a rapidly acting peripheral β-adrenergic agonist akin to adrenaline, or saline. Infusions were administered during two separate conditions: a panic induction and an assessment of cardiorespiratory interoception. Isoproterenol infusions induced anxiety in both patients, and full-blown panic in one (patient B.G.). Although both patients demonstrated signs of diminished awareness of cardiac sensation, patient A.M., who did not panic, reported a complete lack of awareness of dyspnea, suggestive of impaired respiratory interoception. These findings indicate that the amygdala may play a role in dynamically detecting changes in cardiorespiratory sensation. The induction of panic anxiety provides further evidence that the amygdala is not required for the conscious experience of fear induced via interoceptive sensory channels.
The gray matter volume of the amygdala is correlated with the perception of melodic intervals: a voxel-based morphometry study
Music is not simply a series of organized pitches, rhythms, and timbres; it is capable of evoking emotions. In the present study, voxel-based morphometry (VBM) was employed to explore the neural basis that may link music to emotion. To do this, we identified the neuroanatomical correlates of the ability to extract pitch interval size in a music segment (i.e., interval perception) in a large population of healthy young adults (N = 264). Behaviorally, we found that interval perception was correlated with daily emotional experiences, indicating an intrinsic link between music and emotion. Neurally, and as expected, we found that interval perception was positively correlated with the gray matter volume (GMV) of the bilateral temporal cortex. More importantly, a larger GMV of the bilateral amygdala was associated with better interval perception, suggesting that the amygdala, a neural substrate of emotional processing, is also involved in music processing. In sum, our study provides some of the first neuroanatomical evidence of an association between the amygdala and music, which contributes to our understanding of exactly how music evokes emotional responses.
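At its core, a VBM analysis of this kind is a voxelwise regression of gray matter volume on a behavioral score across subjects. The sketch below illustrates only that core step; all array names, shapes, and data are stand-in assumptions, and a real pipeline (e.g., SPM-style VBM) would add smoothing, a brain mask, nuisance covariates, and multiple-comparison correction.

```python
# Minimal sketch of the core VBM step: a voxelwise correlation between gray
# matter volume (GMV) and a behavioral score across subjects. Arrays here are
# random stand-ins; a real pipeline adds smoothing, a brain mask, covariates
# (age, sex, total intracranial volume) and multiple-comparison correction.
import numpy as np
from scipy import stats

n_subjects, n_voxels = 264, 50_000
rng = np.random.default_rng(0)
gmv = rng.random((n_subjects, n_voxels))       # stand-in for preprocessed GMV maps
interval_score = rng.random(n_subjects)        # stand-in for behavioral scores

# Vectorized Pearson r at every voxel: mean product of z-scores
gmv_z = (gmv - gmv.mean(0)) / gmv.std(0)
score_z = (interval_score - interval_score.mean()) / interval_score.std()
r = gmv_z.T @ score_z / n_subjects

# Two-tailed p-values from the t distribution with n-2 degrees of freedom
t = r * np.sqrt((n_subjects - 2) / (1 - r**2))
p = 2 * stats.t.sf(np.abs(t), df=n_subjects - 2)
print(f"{(p < 0.001).sum()} voxels at p < .001 (uncorrected)")
```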
The role of spatial frequency information for ERP components sensitive to faces and emotional facial expression
To investigate the impact of spatial frequency on emotional facial expression analysis, ERPs were recorded in response to low spatial frequency (LSF), high spatial frequency (HSF), and unfiltered broad spatial frequency (BSF) faces with fearful or neutral expressions, as well as houses and chairs. In line with previous findings, BSF fearful facial expressions elicited a greater frontal positivity than BSF neutral facial expressions, starting at about 150 ms after stimulus onset. In contrast, this emotional expression effect was absent for HSF and LSF faces. Given that some brain regions involved in emotion processing, such as the amygdala and connected structures, are selectively tuned to LSF visual inputs, these data suggest that ERP effects of emotional facial expression do not directly reflect activity in these regions. It is argued that higher-order neocortical brain systems are involved in the generation of emotion-specific waveform modulations. The face-sensitive N170 component was affected neither by emotional facial expression nor by spatial frequency information.
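The LSF/HSF manipulation described above amounts to Gaussian low-pass filtering of the broad-spectrum image and taking its high-pass complement. A minimal sketch under assumed parameters (`sigma` as the cutoff, a random array standing in for a face photograph):

```python
# Minimal sketch of building LSF/HSF variants of a broad-spectrum (BSF) image
# by Gaussian low-pass filtering and its complement. The input is a random
# stand-in rather than a face photograph, and `sigma` is an assumed cutoff;
# studies usually specify cutoffs in cycles per image or per degree.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
bsf = rng.random((256, 256))            # stand-in for a grayscale face image

sigma = 6.0                             # assumed blur width (sets the cutoff)
lsf = gaussian_filter(bsf, sigma)       # low-pass: coarse luminance structure
hsf = bsf - lsf + bsf.mean()            # high-pass: fine edges, mean restored

# Equate mean luminance and RMS contrast across conditions (common practice)
for img in (lsf, hsf):
    img -= img.mean()
    img /= img.std()
    img *= bsf.std()
    img += bsf.mean()
```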
Emotion classification in Parkinson's disease by higher-order spectra and power spectrum features using EEG signals: A comparative study
Deficits in the ability to process emotions characterize several neuropsychiatric disorders, including Parkinson's disease (PD), and there is a need for a method of quantifying emotional state, which is currently assessed by clinical diagnosis. Electroencephalogram (EEG) signals, being an activity of the central nervous system (CNS), can reflect the underlying true emotional state of a person. This study applied machine-learning algorithms to classify six basic emotions (happiness, sadness, fear, anger, surprise and disgust) from the EEG of PD patients in comparison with healthy controls (HC). Emotional EEG data were recorded from 20 PD patients and 20 healthy age-, education level- and sex-matched controls using multimodal (audio-visual) stimuli. The use of nonlinear features derived from the higher-order spectra (HOS) has been reported to be a promising approach to classifying emotional states. In this work, we carried out a comparative study of the performance of k-nearest neighbor (kNN) and support vector machine (SVM) classifiers using features derived from the HOS and from the power spectrum. Analysis of variance (ANOVA) showed that power-spectrum and HOS-based features differed significantly among the six emotional states (p < 0.0001). Classification results show that the selected HOS-based features outperformed power-spectrum-based features for all six classes, with overall accuracies of 70.10% ± 2.83% for PD patients and 77.29% ± 1.73% for HC in the beta (13-30 Hz) band using the SVM classifier. Moreover, relative to HC, PD patients showed lower accuracy for negative emotions (sadness, fear, anger and disgust) than for positive emotions (happiness, surprise). These results demonstrate the effectiveness of applying machine-learning techniques to the classification of emotional states in PD patients in a user-independent manner using EEG signals. The accuracy of the system could be improved by investigating other HOS-based features. This study might lead to a practical system for noninvasive assessment of the emotional impairments associated with neurological disorders.
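As a rough illustration of the power-spectrum branch of such a pipeline (the HOS/bispectrum features are not reproduced), the sketch below extracts log beta-band power per channel with Welch's method and scores an RBF SVM by cross-validation. The epoch array, sampling rate, channel count, and labels are simulated assumptions, not the study's data.

```python
# Minimal sketch of the power-spectrum branch of such a pipeline: log
# beta-band (13-30 Hz) power per channel via Welch's method, scored with an
# RBF SVM under cross-validation. Epochs, sampling rate and labels are
# simulated stand-ins; the paper's HOS (bispectrum) features are omitted.
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

fs = 128                                          # assumed sampling rate (Hz)
rng = np.random.default_rng(0)
epochs = rng.standard_normal((120, 14, fs * 6))   # trials x channels x samples
labels = np.repeat(np.arange(6), 20)              # six emotion classes

freqs, psd = welch(epochs, fs=fs, nperseg=fs)     # PSD along the last axis
beta = (freqs >= 13) & (freqs <= 30)
features = np.log(psd[..., beta].mean(-1))        # log beta power per channel

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, features, labels, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```

With real recordings, artifact rejection and per-subject normalization would precede feature extraction.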
Role of quantum coherence in chromophoric energy transport
The role of quantum coherence and the environment in the dynamics of excitation energy transfer is not fully understood. In this work, we introduce the concept of dynamical contributions of various physical processes to the energy transfer efficiency. We develop two complementary approaches, based on a Green's function method and energy transfer susceptibilities, and quantify the importance of the Hamiltonian evolution, phonon-induced decoherence, and spatial relaxation pathways. We investigate the Fenna-Matthews-Olson protein complex, where we find a contribution of coherent dynamics of about 10% and of relaxation of 80%.
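The transfer-efficiency quantity at stake can be illustrated with a toy model: coherent Hamiltonian evolution plus pure dephasing (a Haken-Strobl-type approximation, not the authors' Green's-function or susceptibility machinery) and an irreversible trap, with efficiency defined as the total population absorbed by the trap. All energies, couplings, and rates below are illustrative assumptions.

```python
# Toy model of exciton transport efficiency: three-site Hamiltonian evolution
# with pure dephasing (Haken-Strobl-type) and an irreversible trap on the
# last site. All energies, couplings and rates are illustrative assumptions;
# this is not the paper's Green's-function/susceptibility decomposition.
import numpy as np
from scipy.integrate import solve_ivp

N = 3
H = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.5, 1.0],
              [0.0, 1.0, 0.0]])       # site energies/couplings (arb. units)
gamma = 0.5                           # pure-dephasing rate
kappa = 0.2                           # trapping rate at the last site

def drho_dt(t, y):
    rho = y.reshape(N, N)
    comm = -1j * (H @ rho - rho @ H)               # coherent evolution
    deph = -gamma * (rho - np.diag(np.diag(rho)))  # decay of coherences
    trap = np.zeros_like(rho)
    trap[N - 1, :] -= kappa * rho[N - 1, :]        # anti-Hermitian sink
    trap[:, N - 1] -= kappa * rho[:, N - 1]
    return (comm + deph + trap).ravel()

rho0 = np.zeros((N, N), dtype=complex)
rho0[0, 0] = 1.0                                   # excitation starts at site 1
sol = solve_ivp(drho_dt, (0.0, 50.0), rho0.ravel(),
                dense_output=True, max_step=0.05)

# Efficiency = population absorbed by the trap: eta = 2*kappa * int rho_NN dt
ts = np.linspace(0.0, 50.0, 2001)
rho_NN = np.array([sol.sol(t).reshape(N, N)[N - 1, N - 1].real for t in ts])
eta = 2 * kappa * np.sum(rho_NN) * (ts[1] - ts[0])  # simple Riemann sum
print(f"transfer efficiency eta = {eta:.3f}")
```

In such models, intermediate dephasing rates typically maximize efficiency, which is the environment-assisted transport effect that a dynamical-contribution analysis aims to quantify.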
A double-edged sword: the merits and the policy implications of Google Translate in higher education
Machine translation, specifically Google Translate, is freely available on a number of devices and is improving in its ability to provide grammatically accurate translations. This development has the potential to provoke a major transformation in the internationalisation process at universities, since students may in the future be able to use technology to circumvent traditional language learning processes. While this is a potentially empowering move that may facilitate academic exchange and the diversification of the learner and researcher community at an international level, it is also potentially problematic in two main respects. Firstly, the technology is at present unable to align to the socio-linguistic aspects of university-level writing and may be misunderstood as a remedy for a lack of writer language proficiency, a role it is not able to fulfil. Secondly, it introduces a new dimension to the production of academic work that may clash with Higher Education policy and thus requires legislation, in particular in light of issues such as plagiarism and academic misconduct. This paper considers these issues against the background of English as a Global Lingua Franca and argues two points. The first of these is that HEIs need to develop an understanding of, and a code of practice for, the use of this technology. Secondly, three strands of potential future research will be presented.
Emotional arousal in agenesis of the corpus callosum
While the processing of verbal and psychophysiological indices of emotional arousal has been investigated extensively in relation to the left and right cerebral hemispheres, it remains poorly understood how the two hemispheres normally function together to generate emotional responses to stimuli. Drawing on a unique sample of nine high-functioning subjects with complete agenesis of the corpus callosum (AgCC), we investigated this issue using standardized emotional visual stimuli. Compared to healthy controls, subjects with AgCC showed a larger variance in their cognitive ratings of valence and arousal, and an insensitivity to the emotion category of the stimuli, especially for negatively-valenced stimuli and especially in their arousal ratings. Despite their impaired cognitive ratings of arousal, some subjects with AgCC showed large skin-conductance responses, and in general skin-conductance responses discriminated emotion categories and correlated with stimulus arousal ratings. We suggest that largely intact right-hemisphere mechanisms can support psychophysiological emotional responses, but that the lack of interhemispheric communication, perhaps together with dysfunction of the anterior cingulate cortex, interferes with normal verbal ratings of arousal, a mechanism in line with some models of alexithymia.
Dynamics of trimming the content of face representations for categorization in the brain
To understand visual cognition, it is imperative to determine when, how and with what information the human brain categorizes the visual input. Visual categorization consistently involves at least an early and a late stage: the occipito-temporal N170 event-related potential, related to stimulus encoding, and the parietal P300, involved in perceptual decisions. Here we sought to understand how the brain globally transforms its representations of face categories from their early encoding to the later decision stage over the 400 ms time window encompassing the N170 and P300 brain events. We applied classification image techniques to the behavioral and electroencephalographic data of three observers who categorized seven facial expressions of emotion, and we report two main findings: (1) over the 400 ms time course, processing of facial features initially spreads bilaterally across the left and right occipito-temporal regions before dynamically converging onto the centro-parietal region; (2) concurrently, information processing gradually shifts from encoding common face features across all spatial scales (e.g. the eyes) to representing only the finer scales of the diagnostic features that are richer in useful information for behavior (e.g. the wide-open eyes in 'fear'; the detailed mouth in 'happy'). Our findings suggest that the brain refines its diagnostic representations of visual categories over the first 400 ms of processing by trimming an initially thorough encoding of features over the N170, leaving only the detailed information important for perceptual decisions over the P300.
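Classification-image techniques of the kind applied here reduce to a reverse correlation between random trial-by-trial sampling of the stimulus and the observer's responses. The behavioral-only sketch below uses a simulated observer and an assumed diagnostic region; EEG variants correlate the masks with single-trial voltages rather than button presses.

```python
# Minimal behavioral-only sketch of a classification image: correlate random
# per-trial sampling masks with the observer's responses to reveal which
# pixels drive the categorization. The observer is simulated, driven by an
# assumed "eye region"; real studies use calibrated face stimuli.
import numpy as np

rng = np.random.default_rng(1)
n_trials, h, w = 5000, 64, 64
masks = rng.random((n_trials, h, w))    # random per-trial sampling masks

# Simulated observer: responds "fear" when the eye region is well revealed
eye_region = np.zeros((h, w))
eye_region[20:28, 14:50] = 1.0
drive = (masks * eye_region).sum(axis=(1, 2))
responses = drive > np.median(drive)    # True = "fear" response

# Classification image: mean mask on "fear" trials minus the rest
ci = masks[responses].mean(0) - masks[~responses].mean(0)
print(f"mean CI inside eye region: {ci[eye_region == 1].mean():+.4f}, "
      f"outside: {ci[eye_region == 0].mean():+.4f}")
```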
