
    Recognizing Emotions in a Foreign Language

    Expressions of basic emotions (joy, sadness, anger, fear, disgust) can be recognized pan-culturally from the face and it is assumed that these emotions can be recognized from a speaker's voice, regardless of an individual's culture or linguistic ability. Here, we compared how monolingual speakers of Argentine Spanish recognize basic emotions from pseudo-utterances ("nonsense speech") produced in their native language and in three foreign languages (English, German, Arabic). Results indicated that vocal expressions of basic emotions could be decoded in each language condition at accuracy levels exceeding chance, although Spanish listeners performed significantly better overall in their native language ("in-group advantage"). Our findings argue that the ability to understand vocally-expressed emotions in speech is partly independent of linguistic ability and involves universal principles, although this ability is also shaped by linguistic and cultural variables.
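    The "exceeding chance" claim in this kind of forced-choice recognition task is typically assessed with a one-sided binomial test. A minimal sketch, assuming a five-alternative response set (so chance = 1/5) and invented trial counts; neither comes from the abstract itself:

    ```python
    # Sketch: does recognition accuracy exceed chance in a forced-choice task?
    # Assumptions (illustrative, not from the paper): five response options
    # (joy, sadness, anger, fear, disgust), so chance = 1/5, and made-up counts.
    from scipy.stats import binomtest

    CHANCE = 1 / 5  # five-alternative forced choice (assumed)

    def exceeds_chance(n_correct: int, n_trials: int, alpha: float = 0.05) -> bool:
        """One-sided binomial test of accuracy against the chance level."""
        result = binomtest(n_correct, n_trials, CHANCE, alternative="greater")
        return result.pvalue < alpha

    # 62/120 correct is far above the 20% chance level; 24/120 is right at it
    print(exceeds_chance(62, 120))  # True
    print(exceeds_chance(24, 120))  # False
    ```

    The same test, run per language condition, would distinguish above-chance decoding of the foreign languages from the stronger native-language ("in-group advantage") performance.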

    Seeing Emotion with Your Ears: Emotional Prosody Implicitly Guides Visual Attention to Faces

    Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality), which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior to facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality, while listening to an emotionally-inflected pseudo-utterance ("Someone migged the pazing") uttered in a congruent or incongruent tone. Participants heard the emotional utterance during the first 1250 milliseconds of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows ([0–1250 ms], [1250–2500 ms], [2500–5000 ms]) were analyzed according to the emotional content of faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (emotion congruency effect), although this effect was often emotion-specific (with greatest effects for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and that these cues play a critical role in how humans respond to related visual cues in the environment, such as facial expressions.

    It's not what you say but the way that you say it: an fMRI study of differential lexical and non-lexical prosodic pitch processing

    Background: This study aims to identify the neural substrate involved in prosodic pitch processing. Functional magnetic resonance imaging (fMRI) was used to test the premise that prosodic pitch processing is primarily subserved by the right cortical hemisphere. Two experimental paradigms were used: firstly, pairs of spoken sentences in which the only variation was a single internal phrase pitch change; and secondly, a matched condition utilizing pitch changes within analogous tone-sequence phrases, which removed the potential confounder of lexical evaluation. fMRI images were obtained using these paradigms. Results: Activation was significantly greater within the right frontal and temporal cortices during the tone-sequence stimuli relative to the sentence stimuli. Conclusion: This study showed that pitch changes, stripped of lexical information, are mainly processed by the right cerebral hemisphere, whilst the processing of analogous, matched, lexical pitch change is preferentially left-sided. These findings, showing hemispheric differentiation of processing based on stimulus complexity, are in accord with a 'task-dependent' hypothesis of pitch processing.

    Association of a dietary inflammatory index with cardiometabolic, endocrine, liver, renal and bone biomarkers: cross-sectional analysis of the UK Biobank study

    © 2024 The Author(s). Background and aims: Research into the relationship between the Energy-adjusted Dietary Inflammatory Index (E-DII) and a wider profile of health-related biomarkers is limited. Much of the existing evidence centers on traditional metabolic biomarkers in populations with chronic diseases, with scarce data on healthy individuals. Thus, this study aims to investigate the association between E-DII score and 30 biomarkers spanning metabolic, endocrine, bone, liver, cardiovascular, and renal health in healthy individuals. Methods and results: 66,978 healthy UK Biobank participants (overall mean age 55.3 (SD 7.9) years) were included in this cross-sectional study. E-DII scores, based on 18 food parameters, were categorised as anti-inflammatory (E-DII < -1), neutral (-1 to 1), and pro-inflammatory (> 1). Regression analyses, adjusted for confounding factors, were conducted to investigate the association of the 30 biomarkers with E-DII. Compared to those with an anti-inflammatory diet, individuals with a pro-inflammatory diet had increased levels of 16 biomarkers, including six cardiometabolic, five liver, and four renal markers. The concentration difference ranged from 0.27 SD for creatinine to 0.03 SD for total cholesterol. Conversely, those on a pro-inflammatory diet had decreased concentrations of six biomarkers, including endocrine and cardiometabolic markers; the associations ranged from -0.04 SD for IGF-1 to -0.23 SD for SHBG. Conclusion: This study highlights that a pro-inflammatory diet was associated with an adverse profile of biomarkers linked to cardiometabolic, endocrine, liver, and renal health.
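    The three-way banding of E-DII scores described above is a simple threshold rule. A minimal sketch; the cut-offs come from the abstract, while the example scores are invented for illustration:

    ```python
    # Sketch of the E-DII banding described in the abstract:
    # < -1 anti-inflammatory, -1 to 1 neutral, > 1 pro-inflammatory.
    # The example scores below are made up for illustration.
    def edii_category(score: float) -> str:
        if score < -1:
            return "anti-inflammatory"
        if score > 1:
            return "pro-inflammatory"
        return "neutral"

    for s in (-2.3, 0.4, 1.8):
        print(s, edii_category(s))
    ```

    Note that the boundary values -1 and 1 fall in the neutral band under this reading of the abstract's "(−1 to 1)" range.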

    Inter-hemispheric EEG coherence analysis in Parkinson's disease: Assessing brain activity during emotion processing

    Parkinson’s disease (PD) is not only characterized by its prominent motor symptoms but is also associated with disturbances in cognitive and emotional functioning. The objective of the present study was to investigate the influence of emotion processing on inter-hemispheric electroencephalography (EEG) coherence in PD. Multimodal emotional stimuli (happiness, sadness, fear, anger, surprise, and disgust) were presented to 20 PD patients and 30 age-, education level-, and gender-matched healthy controls (HC) while EEG was recorded. Inter-hemispheric coherence was computed from seven homologous EEG electrode pairs (AF3–AF4, F7–F8, F3–F4, FC5–FC6, T7–T8, P7–P8, and O1–O2) for the delta, theta, alpha, beta, and gamma frequency bands. In addition, subjective ratings were obtained for a representative subset of the emotional stimuli. PD patients showed significantly lower inter-hemispheric coherence than HC in the theta, alpha, beta, and gamma frequency bands during emotion processing; no significant differences were found in the delta band. We also found that PD patients were more impaired in recognizing negative emotions (sadness, fear, anger, and disgust) than relatively positive emotions (happiness and surprise). Behaviorally, PD patients did not show impairment in emotion recognition as measured by subjective ratings. These findings suggest that PD patients may have an impairment of inter-hemispheric functional connectivity (i.e., a decline in cortical connectivity) during emotion processing. This study may increase awareness of EEG emotional-response studies in clinical practice to uncover potential neurophysiological abnormalities.
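    The coherence measure used here, magnitude-squared coherence between homologous left/right electrodes averaged within each frequency band, can be sketched as follows. The electrode names and band limits are from the abstract; the 128 Hz sampling rate and the synthetic signals are illustrative assumptions, not details from the study:

    ```python
    # Sketch: inter-hemispheric coherence for one homologous electrode pair
    # (e.g. AF3-AF4), averaged within standard EEG bands. Sampling rate and
    # the synthetic test signals are assumptions for illustration.
    import numpy as np
    from scipy.signal import coherence

    FS = 128  # Hz (assumed)
    BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
             "beta": (13, 30), "gamma": (30, 45)}

    def band_coherence(x, y, fs=FS):
        """Mean magnitude-squared coherence per frequency band."""
        f, cxy = coherence(x, y, fs=fs, nperseg=fs * 2)
        return {band: float(cxy[(f >= lo) & (f < hi)].mean())
                for band, (lo, hi) in BANDS.items()}

    rng = np.random.default_rng(0)
    t = np.arange(10 * FS) / FS
    shared = np.sin(2 * np.pi * 10 * t)        # common 10 Hz (alpha) source
    left = shared + rng.normal(size=t.size)    # stand-in for AF3
    right = shared + rng.normal(size=t.size)   # stand-in for AF4
    coh = band_coherence(left, right)
    print(coh)
    ```

    Because the two synthetic channels share only a 10 Hz component, the alpha band shows the highest coherence; in the study, the analogous per-band values were compared between PD patients and controls across all seven electrode pairs.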

    Emotional Speech Perception Unfolding in Time: The Role of the Basal Ganglia

    The basal ganglia (BG) have repeatedly been linked to emotional speech processing in studies involving patients with neurodegenerative and structural changes of the BG. However, the majority of previous studies did not consider that (i) emotional speech processing entails multiple processing steps, and the possibility that (ii) the BG may engage in one rather than the other of these processing steps. In the present study we investigate three different stages of emotional speech processing (emotional salience detection, meaning-related processing, and identification) in the same patient group to verify whether lesions to the BG affect these stages in a qualitatively different manner. Specifically, we explore early implicit emotional speech processing (probe verification) in an ERP experiment followed by an explicit behavioral emotion recognition task. In both experiments, participants listened to emotional sentences expressing one of four emotions (anger, fear, disgust, happiness) or neutral sentences. In line with previous evidence, patients and healthy controls show differentiation of emotional and neutral sentences in the P200 component (emotional salience detection) and a following negative-going brain wave (meaning-related processing). However, the behavioral recognition (identification stage) of emotional sentences was impaired in BG patients, but not in healthy controls. The current data provide further support that the BG are involved in late, explicit rather than early emotional speech processing stages.

    The development of cross-cultural recognition of vocal emotion during childhood and adolescence

    Humans have an innate set of emotions recognised universally. However, emotion recognition also depends on socio-cultural rules. Although adults recognise vocal emotions universally, they identify emotions more accurately in their native language. We examined developmental trajectories of universal vocal emotion recognition in children. Eighty native English speakers completed a vocal emotion recognition task in their native language (English) and foreign languages (Spanish, Chinese, and Arabic) expressing anger, happiness, sadness, fear, and neutrality. Emotion recognition was compared across 8-to-10-year-olds, 11-to-13-year-olds, and adults. Measures of behavioural and emotional problems were also taken. Results showed that although emotion recognition was above chance for all languages, native English-speaking children were more accurate in recognising vocal emotions in their native language. There was a larger improvement in recognising vocal emotion from the native language during adolescence. Vocal anger recognition did not improve with age for the non-native languages. This is the first study to demonstrate universality of vocal emotion recognition in children whilst supporting an “in-group advantage” for more accurate recognition in the native language. Findings highlight the role of experience in emotion recognition, have implications for child development in modern multicultural societies and address important theoretical questions about the nature of emotions.

    The relationship between workers' self-reported changes in health and their attitudes towards a workplace intervention: lessons from smoke-free legislation across the UK hospitality industry

    Background: The evaluation of smoke-free legislation (SFL) in the UK examined the impacts on exposure to second-hand smoke, workers’ attitudes and changes in respiratory health. Studies that investigate changes in the health of groups of people often use self-reported symptoms. Due to their subjective nature, it is of interest to determine whether workers’ attitudes towards the change in their working conditions may be linked to the change in health they report. Methods: Bar workers were recruited before the introduction of the SFL in Scotland and England with the aim of investigating changes in their health, attitudes and exposure as a result of the SFL. They were asked about their attitudes towards SFL and the presence of respiratory and sensory symptoms both before SFL and one year later. Here we examine the possibility of a relationship between initial attitudes and changes in reported symptoms, through the use of regression analyses. Results: There was no difference in the initial attitudes towards SFL between those working in Scotland and England. Bar workers who were educated to a higher level tended to be more positive towards SFL. Attitude towards SFL was not found to be related to change in reported symptoms for bar workers in England (respiratory, p = 0.755; sensory, p = 0.910). In Scotland there was a suggestion of a relationship with reporting of respiratory symptoms (p = 0.042), where those who were initially more negative towards SFL experienced a greater improvement in self-reported health. Conclusions: There was no evidence that workers who were more positive towards SFL reported greater improvements in respiratory and sensory symptoms. This may not be the case in all interventions and we recommend examining subjects’ attitudes towards the proposed intervention when evaluating possible health benefits using self-reported methods. Keywords: self-reported health, attitudes, workplace intervention, public health intervention.

    How Psychological Stress Affects Emotional Prosody

    We explored how experimentally induced psychological stress affects the production and recognition of vocal emotions. In Study 1a, we demonstrate that sentences spoken by stressed speakers are judged by naive listeners as sounding more stressed than sentences uttered by non-stressed speakers. In Study 1b, negative emotions produced by stressed speakers are generally less well recognized than the same emotions produced by non-stressed speakers. Multiple mediation analyses suggest this poorer recognition of negative stimuli was due to a mismatch between the variation of volume voiced by speakers and the range of volume expected by listeners. Together, this suggests that the stress level of the speaker affects judgments made by the receiver. In Study 2, we demonstrate that participants who were induced with a feeling of stress before carrying out an emotional prosody recognition task performed worse than non-stressed participants. Overall, findings suggest detrimental effects of induced stress on interpersonal sensitivity.

    The Effect of Lateralization of Motor Onset and Emotional Recognition in PD Patients Using EEG

    The objective of this research was to investigate the relationship between emotion recognition and lateralization of motor onset in Parkinson’s disease (PD) patients using electroencephalogram (EEG) signals. The subject pool consisted of twenty PD patients [ten with predominantly left-sided (LPD) and ten with predominantly right-sided (RPD) motor symptoms] and 20 healthy controls (HC) matched for age and gender. Multimodal stimuli were used to evoke simple emotions, such as happiness, sadness, fear, anger, surprise, and disgust. Artifact-free emotion EEG signals were processed using the autoregressive spectral method and then subjected to repeated-measures ANOVA. No group differences were observed across behavioral measures; however, a significant reduction in EEG spectral power was observed at alpha, beta and gamma frequency oscillations in LPD, compared to RPD and HC participants, suggesting that LPD patients (inferred right-hemisphere pathology) are more impaired than RPD patients in emotional processing. We also found that PD-related emotional processing deficits may be selective to the perception of negative emotions. Previous findings have suggested a hemispheric effect on emotion processing that could be related to emotional response impairment in a subgroup of PD patients. This study may help in clinical practice to uncover potential neurophysiologic abnormalities of emotional changes with respect to PD patients’ motor onset.
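    The band-wise spectral power compared between groups above can be illustrated with a simple relative-band-power computation. The study used an autoregressive spectral estimator; Welch's method is substituted here as a simpler stand-in, and the sampling rate, band limits, and synthetic signal are illustrative assumptions:

    ```python
    # Sketch: relative EEG band power from one channel. Welch's method stands
    # in for the autoregressive spectral estimator used in the study; the
    # 128 Hz sampling rate and synthetic alpha-dominated signal are assumed.
    import numpy as np
    from scipy.signal import welch

    FS = 128  # Hz (assumed)
    BANDS = {"alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

    def relative_band_power(signal, fs=FS):
        """Fraction of total spectral power falling in each band."""
        f, psd = welch(signal, fs=fs, nperseg=fs * 2)
        return {band: float(psd[(f >= lo) & (f < hi)].sum() / psd.sum())
                for band, (lo, hi) in BANDS.items()}

    rng = np.random.default_rng(1)
    t = np.arange(8 * FS) / FS
    eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)
    powers = relative_band_power(eeg)
    print(powers)
    ```

    A group comparison like the one reported (reduced alpha, beta and gamma power in LPD versus RPD and HC) would then feed these per-band values into a repeated-measures ANOVA.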