
    Neurophysiological markers of phrasal verb processing: evidence from L1 and L2 speakers

    Bilingual Figurative Language Processing is a timely book that provides a much-needed bilingual perspective on the broad field of figurative language. This is the first book of its kind to address how bilinguals acquire, store, and process figurative language, such as idiomatic expressions (e.g., kick the bucket), metaphors (e.g., lawyers are sharks), and irony, and how these tropes might interact in real time across the bilingual's two languages. This volume offers the reader and the bilingual student an overview of the major strands of research, both theoretical and empirical, currently being undertaken in this field of inquiry. At the same time, Bilingual Figurative Language Processing provides readers and undergraduate and graduate students with the opportunity to acquire hands-on experience in the development of psycholinguistic experiments in bilingual figurative language. Each chapter includes a section on suggested student research projects. Selected chapters provide detailed procedures on how to design and develop psycholinguistic experiments.

    Emotional Speech Perception Unfolding in Time: The Role of the Basal Ganglia

    The basal ganglia (BG) have repeatedly been linked to emotional speech processing in studies involving patients with neurodegenerative and structural changes of the BG. However, the majority of previous studies did not consider that (i) emotional speech processing entails multiple processing steps, and (ii) that the BG may engage in some of these processing steps rather than others. In the present study, we investigate three different stages of emotional speech processing (emotional salience detection, meaning-related processing, and identification) in the same patient group to verify whether lesions to the BG affect these stages in a qualitatively different manner. Specifically, we explore early implicit emotional speech processing (probe verification) in an ERP experiment followed by an explicit behavioral emotional recognition task. In both experiments, participants listened to emotional sentences expressing one of four emotions (anger, fear, disgust, happiness) or neutral sentences. In line with previous evidence, patients and healthy controls show differentiation of emotional and neutral sentences in the P200 component (emotional salience detection) and a following negative-going brain wave (meaning-related processing). However, the behavioral recognition (identification stage) of emotional sentences was impaired in BG patients, but not in healthy controls. The current data provide further support that the BG are involved in late, explicit emotional speech processing stages rather than early, implicit ones.
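    The P200 effect described above is typically quantified as a mean amplitude over an early post-onset time window. The sketch below illustrates one way such a measure could be computed from already-epoched EEG data; the array shapes, sampling rate, and 150-250 ms window are illustrative assumptions, not parameters reported in the study.

```python
import numpy as np

# Illustrative sketch: quantify a P200-like effect as the mean amplitude
# difference between emotional and neutral sentence epochs.
# Assumed (not from the study): epochs shaped (n_trials, n_channels, n_samples),
# sampled at 500 Hz, with time 0 at sentence onset.
SFREQ = 500        # assumed sampling rate in Hz
EPOCH_START = 0.0  # epoch start relative to onset, in seconds

def mean_amplitude(epochs: np.ndarray, tmin: float, tmax: float) -> np.ndarray:
    """Mean amplitude per trial and channel within the [tmin, tmax) window."""
    start = int((tmin - EPOCH_START) * SFREQ)
    stop = int((tmax - EPOCH_START) * SFREQ)
    return epochs[:, :, start:stop].mean(axis=2)

# Hypothetical data: 40 emotional and 40 neutral trials, 64 channels, 1 s epochs.
rng = np.random.default_rng(0)
emotional = rng.normal(size=(40, 64, 500))
neutral = rng.normal(size=(40, 64, 500))

# P200 window assumed to span 150-250 ms after sentence onset.
diff = mean_amplitude(emotional, 0.150, 0.250).mean() - \
       mean_amplitude(neutral, 0.150, 0.250).mean()
print(f"P200 emotional-minus-neutral difference: {diff:.3f} (arbitrary units)")
```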

    Species-Specific and Distance-Dependent Dispersive Behaviour of Forisomes in Different Legume Species

    Forisomes are giant fusiform protein complexes composed of sieve element occlusion (SEO) protein monomers, exclusively found in sieve elements (SEs) of legumes. Forisomes block the phloem mass flow by a Ca2+-induced conformational change (swelling and rounding). We studied forisome reactivity in four different legume species: Medicago sativa, Pisum sativum, Trifolium pratense and Vicia faba. Depending on the species, we found direct relationships between SE diameter, forisome surface area and distance from the leaf tip, all indicative of a developmentally tuned regulation of SE diameter and forisome size. Heat-induced forisome dispersion occurred later with increasing distance from the stimulus site. In T. pratense and V. faba, dispersion occurred faster for forisomes with a smaller surface area. Near the stimulus site, electropotential waves (EPWs) consisting of overlapping action potentials (APs) and variation potentials (VPs) were linked with high full-dispersion rates of forisomes. Distance-associated reduction of forisome reactivity was attributed to the disintegration of EPWs into APs, VPs and system potentials (SPs). Overall, APs and SPs alone were unable to induce forisome dispersion, and only VPs above a critical threshold were capable of inducing forisome reactions.
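    The distance dependence reported above is a simple quantitative relationship; the sketch below shows how such a relationship could be summarized with a linear fit of dispersion latency against distance from the stimulus. All values are invented for illustration and are not measurements from the study.

```python
import numpy as np

# Illustrative sketch: summarizing the distance dependence of forisome
# dispersion with a linear fit of dispersion latency against distance from
# the heat stimulus. All numbers are invented, not measurements from the study.
distance_cm = np.array([1, 2, 4, 6, 8, 10], dtype=float)
latency_s = np.array([3.0, 4.1, 6.2, 8.4, 10.1, 12.3])  # hypothetical latencies

slope, intercept = np.polyfit(distance_cm, latency_s, 1)
print(f"latency ≈ {intercept:.1f} s + {slope:.2f} s/cm × distance")
```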

    Alternative Splicing and Extensive RNA Editing of Human TPH2 Transcripts

    Brain serotonin (5-HT) neurotransmission plays a key role in the regulation of mood and has been implicated in a variety of neuropsychiatric conditions. Tryptophan hydroxylase (TPH) is the rate-limiting enzyme in the biosynthesis of 5-HT. Recently, we discovered a second TPH isoform (TPH2) in vertebrates, including man, which is predominantly expressed in brain, while the previously known TPH isoform (TPH1) is primarily a non-neuronal enzyme. Overwhelming evidence now points to TPH2 as a candidate gene for 5-HT-related psychiatric disorders. To assess the role of TPH2 gene variability in the etiology of psychiatric diseases, we performed cDNA sequence analysis of TPH2 transcripts from human post-mortem amygdala samples obtained from individuals with psychiatric disorders (drug abuse, schizophrenia, suicide) and controls. Here we show that TPH2 exists in two alternatively spliced variants in the coding region, denoted TPH2a and TPH2b. Moreover, we found evidence that the pre-mRNAs of both splice variants are dynamically RNA-edited in a mutually exclusive manner. Kinetic studies with cell lines expressing recombinant TPH2 variants revealed a higher activity of the novel TPH2B protein compared with the previously known TPH2A, whereas RNA editing was shown to inhibit the enzymatic activity of both TPH2 splice variants. Therefore, our results strongly suggest a complex fine-tuning of central nervous system 5-HT biosynthesis by TPH2 alternative splicing and RNA editing. Finally, we present molecular and large-scale linkage data indicating that deregulated alternative splicing and RNA editing are involved in the etiology of psychiatric diseases, such as suicidal behaviour.
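    The kinetic comparison of the two splice variants rests on standard enzyme kinetics; the sketch below illustrates how variant activities are commonly compared by fitting Michaelis-Menten parameters. The substrate concentrations, rates, and fitted values are synthetic assumptions, not data from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative sketch: comparing the activity of two enzyme variants by fitting
# Michaelis-Menten kinetics to initial-rate data. Substrate concentrations,
# rates and fitted values are synthetic, not data from the study.
def michaelis_menten(s, vmax, km):
    return vmax * s / (km + s)

substrate = np.array([2, 5, 10, 20, 50, 100, 200], dtype=float)  # assumed µM tryptophan

# Hypothetical initial rates for two recombinant variants (arbitrary units).
rates = {
    "TPH2A": np.array([0.8, 1.7, 2.9, 4.2, 5.6, 6.3, 6.7]),
    "TPH2B": np.array([1.2, 2.6, 4.3, 6.1, 8.0, 8.9, 9.4]),
}

for variant, v in rates.items():
    (vmax, km), _ = curve_fit(michaelis_menten, substrate, v, p0=(v.max(), 20.0))
    print(f"{variant}: Vmax ≈ {vmax:.2f}, Km ≈ {km:.1f} µM")
```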

    Recognizing Emotions in a Foreign Language

    Expressions of basic emotions (joy, sadness, anger, fear, disgust) can be recognized pan-culturally from the face, and it is assumed that these emotions can be recognized from a speaker's voice, regardless of an individual's culture or linguistic ability. Here, we compared how monolingual speakers of Argentine Spanish recognize basic emotions from pseudo-utterances ("nonsense speech") produced in their native language and in three foreign languages (English, German, Arabic). Results indicated that vocal expressions of basic emotions could be decoded in each language condition at accuracy levels exceeding chance, although Spanish listeners performed significantly better overall in their native language ("in-group advantage"). Our findings argue that the ability to understand vocally expressed emotions in speech is partly independent of linguistic ability and involves universal principles, although this ability is also shaped by linguistic and cultural variables.
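    Above-chance decoding in each language condition can be assessed with a simple binomial test; the sketch below illustrates the idea. The trial counts, hit counts, and the assumed one-in-five chance level are placeholders, not the study's data or its exact statistical procedure.

```python
from scipy.stats import binomtest

# Illustrative sketch: testing whether emotion recognition accuracy exceeds
# chance in each language condition. Hit counts, trial counts and the
# one-in-five chance level are assumptions, not the study's data.
CHANCE = 0.20  # assuming five response alternatives

# Hypothetical (correct responses, total trials) per language condition.
conditions = {
    "Spanish (native)": (620, 1000),
    "English": (480, 1000),
    "German": (455, 1000),
    "Arabic": (430, 1000),
}

for language, (hits, n_trials) in conditions.items():
    result = binomtest(hits, n_trials, CHANCE, alternative="greater")
    print(f"{language}: accuracy = {hits / n_trials:.2f}, p = {result.pvalue:.2g}")
```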

    How Psychological Stress Affects Emotional Prosody

    We explored how experimentally induced psychological stress affects the production and recognition of vocal emotions. In Study 1a, we demonstrate that sentences spoken by stressed speakers are judged by naive listeners as sounding more stressed than sentences uttered by non-stressed speakers. In Study 1b, we show that negative emotions produced by stressed speakers are generally less well recognized than the same emotions produced by non-stressed speakers. Multiple mediation analyses suggest that this poorer recognition of negative stimuli was due to a mismatch between the variation in volume produced by speakers and the range of volume expected by listeners. Together, these findings suggest that the stress level of the speaker affects judgments made by the receiver. In Study 2, we demonstrate that participants in whom a feeling of stress had been induced before carrying out an emotional prosody recognition task performed worse than non-stressed participants. Overall, the findings suggest detrimental effects of induced stress on interpersonal sensitivity.
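    The mediation result mentioned above (stress affecting recognition via a mismatch in volume) is the kind of effect usually expressed as an indirect effect with a bootstrap confidence interval; the sketch below illustrates that logic on synthetic data. The variable names, effect sizes, and regression approach are assumptions, not the authors' analysis.

```python
import numpy as np

# Illustrative sketch of a simple mediation analysis with a bootstrapped
# indirect effect: speaker stress -> volume variability (mediator) -> listener
# recognition accuracy. All data are synthetic; variable names and effect sizes
# are assumptions, not values from the study.
rng = np.random.default_rng(1)
n = 200
stress = rng.integers(0, 2, n).astype(float)                # 0 = non-stressed, 1 = stressed
volume_var = 1.0 - 0.5 * stress + rng.normal(0, 0.3, n)     # mediator
accuracy = 0.4 + 0.3 * volume_var + rng.normal(0, 0.1, n)   # outcome

def indirect_effect(x, m, y):
    a = np.polyfit(x, m, 1)[0]                        # path a: predictor -> mediator
    design = np.column_stack([np.ones_like(x), m, x])
    b = np.linalg.lstsq(design, y, rcond=None)[0][1]  # path b: mediator -> outcome, controlling for x
    return a * b

# Percentile bootstrap of the indirect effect (a * b).
boot = np.array([
    indirect_effect(stress[idx], volume_var[idx], accuracy[idx])
    for idx in (rng.integers(0, n, n) for _ in range(2000))
])
low, high = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {indirect_effect(stress, volume_var, accuracy):.3f}, "
      f"95% bootstrap CI [{low:.3f}, {high:.3f}]")
```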

    Recruitment of Language-, Emotion- and Speech-Timing Associated Brain Regions for Expressing Emotional Prosody: Investigation of Functional Neuroanatomy with fMRI

    We aimed to advance understanding of prosodic emotion expression by establishing brain regions active when expressing specific emotions, those activated irrespective of the target emotion, and those whose activation intensity varied depending on individual performance. BOLD contrast data were acquired whilst participants spoke nonsense words in happy, angry or neutral tones, or performed jaw movements. Emotion-specific analyses demonstrated that when expressing angry prosody, activated brain regions included the inferior frontal and superior temporal gyri, the insula, and the basal ganglia. When expressing happy prosody, the activated brain regions also included the superior temporal gyrus, insula, and basal ganglia, with additional activation in the anterior cingulate. Conjunction analysis confirmed that the superior temporal gyrus and basal ganglia were activated regardless of the specific emotion concerned. Nevertheless, disjunctive comparisons between the expression of angry and happy prosody established that anterior cingulate activity was significantly higher for angry prosody than for happy prosody production. Degree of inferior frontal gyrus activity correlated with the ability to express the target emotion through prosody. We conclude that expressing prosodic emotions (vs. neutral intonation) requires generic brain regions involved in comprehending numerous aspects of language, in emotion-related processes such as experiencing emotions, and in the time-critical integration of speech information.
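    A conjunction analysis of the kind mentioned above asks which voxels survive thresholding in every contrast of interest, while a disjunctive comparison looks at the difference between contrasts; the sketch below illustrates both on toy statistical maps. The maps, threshold, and volume shape are assumptions, not the study's data or software pipeline.

```python
import numpy as np

# Illustrative sketch of conjunction and disjunctive (difference) comparisons
# on statistical maps: a voxel counts as jointly active only if it survives the
# threshold in both contrasts. Maps, threshold and volume shape are toy
# assumptions, not the study's data or pipeline.
rng = np.random.default_rng(2)
shape = (4, 4, 4)                      # toy brain volume
t_angry = rng.normal(0, 1, shape)      # angry vs. baseline t-map (synthetic)
t_happy = rng.normal(0, 1, shape)      # happy vs. baseline t-map (synthetic)
T_THRESHOLD = 1.5                      # assumed significance threshold

conjunction = (t_angry > T_THRESHOLD) & (t_happy > T_THRESHOLD)
disjunction_angry = (t_angry - t_happy) > T_THRESHOLD  # more active for angry than happy

print(f"voxels active for both emotions: {int(conjunction.sum())}")
print(f"voxels more active for angry than happy: {int(disjunction_angry.sum())}")
```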

    Seeing Emotion with Your Ears: Emotional Prosody Implicitly Guides Visual Attention to Faces

    Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality), which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior toward facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality, while listening to an emotionally inflected pseudo-utterance ("Someone migged the pazing") uttered in a congruent or incongruent tone. Participants heard the emotional utterance during the first 1250 milliseconds of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows ([0–1250 ms], [1250–2500 ms], [2500–5000 ms]) were analyzed according to the emotional content of faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (emotion congruency effect), although this effect was often emotion-specific (with greatest effects for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and that these cues play a critical role in how humans respond to related visual cues in the environment, such as facial expressions.
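    The time-window analysis described above amounts to binning gaze events into the three windows and comparing looking time to prosody-congruent versus incongruent faces; the sketch below illustrates this on an invented trial. Only the window boundaries come from the abstract; the fixation records and function names are hypothetical.

```python
from dataclasses import dataclass

# Illustrative sketch: binning fixations into the three time windows named in
# the abstract and summing looking time to prosody-congruent vs. incongruent
# faces. The Fixation records and the trial below are invented for illustration.
WINDOWS = [(0, 1250), (1250, 2500), (2500, 5000)]  # ms, from the abstract

@dataclass
class Fixation:
    onset_ms: float
    duration_ms: float
    face_emotion: str

def looking_time_by_window(fixations, prosody_emotion):
    """Return (congruent_ms, incongruent_ms) totals for each time window."""
    totals = []
    for start, end in WINDOWS:
        congruent = incongruent = 0.0
        for f in fixations:
            if start <= f.onset_ms < end:
                if f.face_emotion == prosody_emotion:
                    congruent += f.duration_ms
                else:
                    incongruent += f.duration_ms
        totals.append((congruent, incongruent))
    return totals

# Hypothetical trial: fearful prosody while scanning four faces.
trial = [Fixation(200, 400, "fear"), Fixation(900, 300, "anger"),
         Fixation(1600, 500, "fear"), Fixation(3000, 700, "neutral")]
print(looking_time_by_window(trial, "fear"))
```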