
    Bifocal emotion regulation through acupoint tapping in fear of flying

    Very few studies have investigated the neural underpinnings of bifocal-multisensory interventions such as acupoint tapping (tapping), despite their well-documented efficacy. The present study aims to investigate the neural and behavioral responses to tapping during the perception of phobic and generally fear-inducing stimulation in a group of participants with fear of flying. We studied 29 flight-phobic participants who were exposed to phobia-related, fear-inducing, and neutral stimulation while undergoing fMRI and a bifocal-multisensory intervention session consisting of tapping plus cognitive restructuring in a within-subject design. During tapping we found an up-regulation of neural activation in the amygdala and a down-regulation in the hippocampus and temporal pole. These effects differed from automatic emotion-regulatory processes, which entailed down-regulation in the amygdala, hippocampus, and temporal pole. Mean scores (±SD) on the Fear of Flying scale dropped from 2.51 (±0.65) before the intervention to 1.27 (±0.68) after the intervention (p < .001). The proportion of participants meeting the criteria for fear of flying also dropped from 89.7 percent before the intervention to 24.0 percent after the intervention (p < .001). Taken together, our results lend support to the effectiveness of tapping as a means of emotion regulation across multiple contexts and add to previous findings of increased amygdala activation during tapping, as opposed to the amygdala down-regulation found with other emotion regulation techniques. They expand on previous knowledge by suggesting that tapping might modulate the processing of complex visual scene representations and their binding with visceral emotional responses, reflected in the down-regulation of activation in the hippocampus and temporal pole. Bifocal emotion regulation was useful in ameliorating aversive reactions to phobic stimuli in people with fear of flying.
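
The pre/post comparison reported above lends itself to a simple paired analysis. Below is a minimal Python sketch of how such a comparison could be run: a paired t-test on the scale scores and an exact McNemar-style test on diagnostic status. The simulated arrays, random seed, and test choices are illustrative assumptions, not the authors' actual analysis pipeline.

```python
# Illustrative sketch of a pre/post comparison like the one reported in the abstract.
# The data below are placeholders drawn to roughly match the reported summary statistics;
# the study's raw data are not available here.
import numpy as np
from scipy.stats import ttest_rel, binomtest

rng = np.random.default_rng(0)
n = 29  # flight-phobic participants, as in the study

# Placeholder per-participant Fear of Flying scale scores
# (the paper reports means of 2.51 +/- 0.65 pre and 1.27 +/- 0.68 post).
fof_pre = rng.normal(2.51, 0.65, n)
fof_post = rng.normal(1.27, 0.68, n)
t_stat, p_val = ttest_rel(fof_pre, fof_post)
print(f"Paired t-test on scale scores: t({n - 1}) = {t_stat:.2f}, p = {p_val:.3g}")

# Placeholder diagnostic status before/after (True = meets fear-of-flying criteria).
meets_pre = rng.random(n) < 0.897
meets_post = rng.random(n) < 0.240
# Exact McNemar-style test: only participants whose status changed are informative.
improved = int(np.sum(meets_pre & ~meets_post))
worsened = int(np.sum(~meets_pre & meets_post))
result = binomtest(improved, improved + worsened, 0.5)
print(f"Status change: {improved} improved vs {worsened} worsened, p = {result.pvalue:.3g}")
```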

    Music perception in cochlear implant users: An event-related potential study

    Objective: To compare the processing of music-syntactic irregularities and physical oddballs between cochlear implant (CI) users and matched controls. Methods: Musical chord sequences were presented, some of which contained functionally irregular chords, or a chord with an instrumental timbre that deviated from the standard timbre. Results: In both controls and CI users, functionally irregular chords elicited early (around 200 ms) and late (around 500 ms) negative electric brain responses (early right anterior negativity, ERAN, and N5). Amplitudes of the effects depended on the degree of music-syntactic irregularity in both groups; effects elicited in CI users were distinctly smaller than in controls. Physically deviant chords elicited a timbre-mismatch negativity (MMN) and a P3 in both groups, again with smaller amplitudes in CI users. Conclusions: ERAN and N5 (as well as timbre-MMN and P3) can be elicited in CI users. Although the amplitudes of the effects were considerably smaller in the CI group, the presence of the MMN and ERAN indicates that the neural mechanisms of both physical and music-syntactic irregularity detection were active in this group. © 2004 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
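
For readers unfamiliar with how such ERP effects are quantified, the sketch below shows one common approach: averaging a difference wave (irregular minus regular chords) within early (ERAN-like) and late (N5-like) latency windows. The epoch array, sampling rate, channel, and window bounds are assumptions for illustration, not the parameters used in this study.

```python
# Minimal sketch: quantify ERAN- and N5-like effects as mean difference-wave amplitudes
# in early and late latency windows. All numbers here are illustrative assumptions.
import numpy as np

fs = 500                           # sampling rate in Hz (assumed)
t = np.arange(-0.2, 0.8, 1 / fs)   # epoch from -200 ms to 800 ms

# Hypothetical epoched data: trials x samples, for one fronto-central channel.
rng = np.random.default_rng(1)
regular = rng.normal(0.0, 2.0, (120, t.size))
irregular = rng.normal(0.0, 2.0, (120, t.size))
irregular[:, (t > 0.15) & (t < 0.25)] -= 1.5   # simulated early negativity (ERAN-like)
irregular[:, (t > 0.45) & (t < 0.55)] -= 1.0   # simulated late negativity (N5-like)

# Difference wave: irregular minus regular condition averages.
diff = irregular.mean(axis=0) - regular.mean(axis=0)

def mean_amp(wave, times, lo, hi):
    """Mean amplitude of `wave` within the [lo, hi] second window."""
    return wave[(times >= lo) & (times <= hi)].mean()

print("ERAN window (150-250 ms):", round(mean_amp(diff, t, 0.15, 0.25), 2), "µV")
print("N5 window   (450-550 ms):", round(mean_amp(diff, t, 0.45, 0.55), 2), "µV")
```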

    Eläkkeellä ja työssä: tilasto eläkeläisten työnteosta vuosina 2007–2018 (Retired and working: statistics on pensioners' employment in 2007–2018)

    This statistical publication examines pensioners' employment among the working-age population in 2007–2018. In addition to the extent of working, the publication also contains information on the average pensions and earnings of pensioners who worked.

    Neural Circuitry of Emotional and Cognitive Conflict Revealed through Facial Expressions

    Neural systems underlying conflict processing have been well studied in the cognitive realm, but the extent to which these overlap with those underlying emotional conflict processing remains unclear. A novel adaptation of the AX Continuous Performance Task (AX-CPT), a stimulus-response incompatibility paradigm, was examined that permits close comparison of emotional and cognitive conflict conditions through the use of affectively valenced facial expressions as the response modality. Brain activity was monitored with functional magnetic resonance imaging (fMRI) during performance of the emotional AX-CPT. Emotional conflict was manipulated on a trial-by-trial basis by requiring contextually pre-cued facial expressions in response to emotional probe stimuli (IAPS images) that were either affectively compatible (low-conflict) or incompatible (high-conflict). The emotion condition was contrasted against a matched cognitive condition that was identical in all respects, except that the probe stimuli were emotionally neutral. Components of the brain's cognitive control network, including dorsal anterior cingulate cortex (ACC) and lateral prefrontal cortex (PFC), showed conflict-related activation increases in both conditions, but with higher activity during the emotion condition. In contrast, emotional conflict effects were not found in regions associated with affective processing, such as rostral ACC. These activation patterns provide evidence for a domain-general neural system that is active for both emotional and cognitive conflict processing. In line with previous behavioural evidence, the greatest activity in these brain regions occurred when emotional and cognitive influences combined additively to produce increased interference.
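
To make the conflict manipulation concrete, here is a small, hypothetical sketch of how an emotion-condition trial list could be laid out: each trial pairs a pre-cued facial expression with an affectively compatible (low-conflict) or incompatible (high-conflict) probe. The expression labels, trial counts, and pairing probabilities are assumptions for illustration, not the study's actual design parameters; the matched cognitive condition with neutral probes is not modeled here.

```python
# Illustrative trial-list generator for an emotional AX-CPT-like conflict manipulation.
# Labels and proportions are assumptions, not the study's actual parameters.
import random

random.seed(0)

EXPRESSIONS = ["happy", "fearful"]              # pre-cued expression to produce (assumed labels)
PROBE_VALENCES = ["pleasant", "unpleasant"]     # valence of the IAPS-like probe (assumed labels)
COMPATIBLE = {("happy", "pleasant"), ("fearful", "unpleasant")}

def make_emotion_trials(n_trials=80):
    """Build a list of trials, each labeled low- or high-conflict by cue/probe compatibility."""
    trials = []
    for _ in range(n_trials):
        cue = random.choice(EXPRESSIONS)
        probe = random.choice(PROBE_VALENCES)
        conflict = "low" if (cue, probe) in COMPATIBLE else "high"
        trials.append({"cue": cue, "probe": probe, "conflict": conflict})
    return trials

trials = make_emotion_trials()
n_high = sum(t["conflict"] == "high" for t in trials)
print(f"{n_high} high-conflict of {len(trials)} emotion-condition trials")
```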

    Seeing Emotion with Your Ears: Emotional Prosody Implicitly Guides Visual Attention to Faces

    Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality), which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior to facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality, while listening to an emotionally inflected pseudo-utterance ("Someone migged the pazing") uttered in a congruent or incongruent tone. Participants heard the emotional utterance during the first 1250 milliseconds of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows ([0–1250 ms], [1250–2500 ms], [2500–5000 ms]) were analyzed according to the emotional content of faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (emotion congruency effect), although this effect was often emotion-specific (with the greatest effects for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and that these cues play a critical role in how humans respond to related visual cues in the environment, such as facial expressions.
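
As an illustration of the time-window analysis described above, the sketch below bins fixations on the four faces into the three windows and tallies look frequency and total look duration per face, flagging the prosody-congruent face. The fixation records, column names, and single-trial format are assumptions made for this example, not the study's actual data structure.

```python
# Minimal sketch of a time-window gaze analysis: frequency and total duration of looks
# per face in each window, with the prosody-congruent face flagged. Data are placeholders.
import pandas as pd

WINDOWS = [(0, 1250), (1250, 2500), (2500, 5000)]   # ms, as in the abstract

# Hypothetical fixation log for one trial; prosody was fearful, so the fearful face is congruent.
fixations = pd.DataFrame({
    "onset_ms":    [120, 600, 1400, 1900, 2700, 3600],
    "duration_ms": [300, 450,  380,  250,  600,  520],
    "face":        ["fear", "anger", "fear", "happy", "fear", "neutral"],
})
prosody = "fear"

def summarize(fix, lo, hi):
    """Frequency and total duration of looks starting within [lo, hi) ms."""
    in_win = fix[(fix["onset_ms"] >= lo) & (fix["onset_ms"] < hi)]
    agg = in_win.groupby("face")["duration_ms"].agg(n_looks="count", total_ms="sum")
    agg["congruent"] = agg.index == prosody
    return agg

for lo, hi in WINDOWS:
    print(f"\nWindow {lo}-{hi} ms")
    print(summarize(fixations, lo, hi))
```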

    Lateral frontal cortex volume reduction in Tourette syndrome revealed by VBM

    Background: Structural changes have been found predominantly in the frontal cortex and in the striatum in children and adolescents with Gilles de la Tourette syndrome (GTS). The influence of comorbid symptomatology is unclear. Here we sought to address the question of gray matter abnormalities in GTS patients with comorbid obsessive-compulsive disorder (OCD) and/or attention deficit hyperactivity disorder (ADHD) using voxel-based morphometry (VBM) in twenty-nine currently unmedicated adult GTS patients and twenty-five healthy control subjects. Results: In GTS we detected a cluster of decreased gray matter volume in the left inferior frontal gyrus (IFG), but no regions demonstrating volume increases. Comparing the GTS subgroup with comorbid ADHD to the subgroup with comorbid OCD, we found a left-sided amygdalar volume increase. Conclusions: Our results suggest that the left IFG may constitute a common underlying structural correlate of GTS with comorbid OCD/ADHD. A volume reduction in this brain region, which has previously been identified as a key region in OCD and associated with the active inhibition of attentional processes, may reflect the failure to control behavior. The amygdala volume increase is discussed against the background of this structure's link with ADHD symptomatology. Correlations with clinical data revealed gray matter volume changes in specific brain areas that have each been described in these conditions.

    Neurocognition of music


    Valence-specific conflict moderation in the dorso-medial PFC and the caudate head in emotional speech

    Emotional speech comprises complex multimodal verbal and non-verbal information that allows us to infer others' emotional states or thoughts in social interactions. While the neural correlates of verbal and non-verbal aspects and their interaction in emotional speech have been identified, there is very little evidence on how we perceive and resolve incongruity in emotional speech, and whether such incongruity extends to current concepts of task-specific prediction errors as a consequence of unexpected action outcomes ('negative surprise'). Here, we explored this possibility while participants listened to congruent and incongruent angry, happy, or neutral utterances and categorized the expressed emotions by their verbal (semantic) content. The results reveal valence-specific incongruity effects: negative verbal content expressed in a happy tone of voice increased activation in the dorso-medial prefrontal cortex (dmPFC), extending its role from conflict moderation to the appraisal of valence-specific conflict in emotional speech. Conversely, the caudate head bilaterally responded selectively to positive verbal content expressed in an angry tone of voice, broadening previous accounts of the caudate head in linguistic control to the moderation of valence-specific control in emotional speech. Together, these results suggest that control structures of the human brain (the dmPFC and subcompartments of the basal ganglia) impact emotional speech differentially when conflict arises.