Shifts in Gamma Phase–Amplitude Coupling Frequency from Theta to Alpha Over Posterior Cortex During Visual Tasks
The phase of ongoing theta (4–8 Hz) and alpha (8–12 Hz) electrophysiological oscillations is coupled to high gamma (80–150 Hz) amplitude, which suggests that low-frequency oscillations modulate local cortical activity. While this phase–amplitude coupling (PAC) has been demonstrated in a variety of tasks and cortical regions, it has not been shown whether task demands differentially affect the regional distribution of the preferred low-frequency coupling to high gamma. To address this issue, we investigated multiple-rhythm theta/alpha to high gamma PAC in two subjects with implanted subdural electrocorticographic grids. We show that high gamma amplitude couples to the theta and alpha troughs and demonstrate that, during visual tasks, alpha/high gamma coupling preferentially increases in visual cortical regions. These results suggest that low-frequency phase to high-frequency amplitude coupling is modulated by behavioral task and may reflect a mechanism for selection between communicating neuronal networks.
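The coupling described above is commonly quantified with a mean-vector-length modulation index: extract low-frequency phase and high-frequency amplitude via bandpass filtering and the Hilbert transform, then measure how strongly amplitude clusters around a preferred phase. The sketch below is a minimal illustration of that general technique on synthetic data, not the authors' actual analysis pipeline; the band edges, sampling rate, and signal parameters are all assumptions chosen for the example.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def modulation_index(signal, fs, phase_band, amp_band):
    """Mean-vector-length PAC estimate: |mean(A_high(t) * exp(i * phi_low(t)))|."""
    def bandpassed(x, lo, hi):
        sos = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band", output="sos")
        return sosfiltfilt(sos, x)
    phi = np.angle(hilbert(bandpassed(signal, *phase_band)))   # low-frequency phase
    amp = np.abs(hilbert(bandpassed(signal, *amp_band)))       # high-frequency envelope
    return np.abs(np.mean(amp * np.exp(1j * phi)))

# Synthetic check: gamma bursts locked to the theta trough vs. unmodulated gamma.
fs = 1000
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
theta = np.sin(2 * np.pi * 6 * t)
gamma = np.sin(2 * np.pi * 100 * t)
noise = 0.1 * rng.standard_normal(t.size)
coupled = theta + 0.3 * gamma * (1 - theta) / 2 + noise   # gamma largest at theta trough
uncoupled = theta + 0.15 * gamma + noise                  # gamma amplitude constant
mi_coupled = modulation_index(coupled, fs, (4, 8), (80, 150))
mi_uncoupled = modulation_index(uncoupled, fs, (4, 8), (80, 150))
```

Because the coupled signal's gamma envelope peaks at a fixed theta phase, its modulation index is substantially larger than that of the phase-independent control.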
Spatiotemporal Dynamics of Word Processing in the Human Brain
We examined the spatiotemporal dynamics of word processing by recording the electrocorticogram (ECoG) from the lateral frontotemporal cortex of neurosurgical patients chronically implanted with subdural electrode grids. Subjects engaged in a target detection task where proper names served as infrequent targets embedded in a stream of task-irrelevant verbs and nonwords. Verbs described actions related to the hand (e.g., throw) or mouth (e.g., blow), while unintelligible nonwords were sounds which matched the verbs in duration, intensity, temporal modulation, and power spectrum. Complex oscillatory dynamics were observed in the delta, theta, alpha, beta, and low- and high-gamma (HG) bands in response to presentation of all stimulus types. HG activity (80–200 Hz) in the ECoG tracked the spatiotemporal dynamics of word processing and identified a network of cortical structures involved in early word processing. HG was used to determine the relative onset, peak, and offset times of local cortical activation during word processing. Listening to verbs compared to nonwords sequentially activates first the posterior superior temporal gyrus (post-STG), then the middle superior temporal gyrus (mid-STG), followed by the superior temporal sulcus (STS). We also observed strong phase-locking between pairs of electrodes in the theta band, with weaker phase-locking occurring in the delta, alpha, and beta frequency ranges. These results provide details on the first few hundred milliseconds of the spatiotemporal evolution of cortical activity during word processing and provide evidence consistent with the hypothesis that an oscillatory hierarchy coordinates the flow of information between distinct cortical regions during goal-directed behavior.
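The between-electrode phase-locking reported above is typically measured with the phase-locking value (PLV): the magnitude of the average phase difference between two channels within a band. The following is a generic sketch of that measure on synthetic channels, not the authors' ECoG code; the 6 Hz rhythm, lag, and noise levels are invented for illustration.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def plv(x, y, fs, band):
    """Phase-locking value between two channels in a band.
    PLV = |mean(exp(i * (phase_x - phase_y)))|: 1 = perfect locking, ~0 = none."""
    sos = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)],
                 btype="band", output="sos")
    px = np.angle(hilbert(sosfiltfilt(sos, x)))
    py = np.angle(hilbert(sosfiltfilt(sos, y)))
    return np.abs(np.mean(np.exp(1j * (px - py))))

# Synthetic check: two channels sharing a 6 Hz rhythm (at a fixed lag)
# versus two independent noise channels.
fs = 1000
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
ch1 = np.sin(2 * np.pi * 6 * t) + 0.2 * rng.standard_normal(t.size)
ch2 = np.sin(2 * np.pi * 6 * t + 0.8) + 0.2 * rng.standard_normal(t.size)
locked = plv(ch1, ch2, fs, (4, 8))
unlocked = plv(rng.standard_normal(t.size), rng.standard_normal(t.size), fs, (4, 8))
```

A consistent phase lag still yields a PLV near 1; the measure is sensitive to phase consistency, not to zero lag.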
Lesion evidence for a critical role of left posterior but not frontal areas in alpha-beta power decreases during context-driven word production
Acknowledgements The authors are grateful to the patients and their families, as well as to the other volunteer participants, for taking part in this study. We would like to thank Donatella Scabini and Brian Curran for patient delineation, Brian Curran, Clay Clayworth and Callum Dewar for lesion reconstruction, Amber Moncrief and Selvi Paulraj for helping design the materials, Paige Mumford and Laura Agee for help with audio recordings, Kristoffer Dahlslätt for invaluable discussions, and the members of the Center for Aphasia and Related Disorders at the VAHCS in Martinez, CA, for neuropsychological testing. Funding This work is supported by grants from the Netherlands Organization for Scientific Research (446-13-009 to V.P., 275-89-032 to J.R.), the National Institutes of Health (NINDS R37 NS21135 to R.T.K), and by the Nielsen Corporation. Data accessibility This article's supporting data and materials can be obtained upon request.
Low attentional engagement makes attention network activity susceptible to emotional interference
The aim of this study was to investigate whether emotion-attention interaction depends on attentional engagement. To investigate emotional modulation of attention network activation, we used a functional MRI paradigm consisting of a visuospatial attention task with either frequent (high-engagement) or infrequent (low-engagement) targets and intermittent emotional or neutral distractors. The attention task recruited a bilateral frontoparietal network, with no emotional interference on network activation when attentional engagement was high. In contrast, when attentional engagement was low, the unpleasant stimuli interfered with activation of the frontoparietal attention network, especially in the right hemisphere. This study provides novel evidence that low attentional engagement makes attention control network activation susceptible to emotional interference. © 2014 Wolters Kluwer Health | Lippincott Williams & Wilkins. Authors: Exposito, Veronica (Fundación para la Lucha contra las Enfermedades Neurológicas de la Infancia, Argentina; Consejo Nacional de Investigaciones Científicas y Técnicas, Argentina; University of Tampere, Finland); Pickard, Natasha (California State University, United States); Solbakk, Anne-Kristin (University of Oslo, Norway); Ogawa, Keith H. (Saint Mary's College of California, United States); Knight, Robert T. (California State University, United States); Hartikainen, Kaisa M. (University of Tampere, Finland)
Amygdala Response to Facial Expressions Reflects Emotional Learning
The functional role of the human amygdala in the evaluation of emotional facial expressions is unclear. Previous animal and human research shows that the amygdala participates in processing positive and negative reinforcement as well as in learning predictive associations between stimuli and subsequent reinforcement. Thus, amygdala response to facial expressions could reflect the processing of primary reinforcement or emotional learning. Here, using functional magnetic resonance imaging, we tested the hypothesis that amygdala response to facial expressions is driven by emotional association learning. We show that the amygdala is more responsive to learning object-emotion associations from happy and fearful facial expressions than it is to the presentation of happy and fearful facial expressions alone. The results provide evidence that the amygdala uses social signals to rapidly and flexibly learn threatening and rewarding associations that ultimately serve to enhance survival.
Working memory replay prioritizes weakly attended events
One view of working memory posits that maintaining a series of events requires their sequential and equal mnemonic replay. Another view is that the content of working memory maintenance is prioritized by attention. We decoded the dynamics of retaining a sequence of items using magnetoencephalography, wherein participants encoded sequences of three stimuli depicting a face, a manufactured object, or a natural item and maintained them in working memory for 5000 ms. Memory for sequence position and stimulus details was probed at the end of the maintenance period. Decoding of brain activity revealed that one of the three stimuli dominated maintenance independent of its sequence position or category, and memory was enhanced for the selectively replayed stimulus. Analysis of event-related responses during encoding of the sequence showed that selective replay was determined by the degree of attention at encoding: the selectively replayed stimuli had the weakest initial encoding, as indexed by weaker visual attention signals at encoding. These findings do not rule out sequential mnemonic replay but reveal that attention influences the content of working memory maintenance by prioritizing replay of weakly encoded events. We propose that the prioritization of weakly encoded stimuli protects them from interference during the maintenance period, whereas the more strongly encoded stimuli can be retrieved from long-term memory at the end of the delay period.
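"Decoding of brain activity" in studies like this one means training a classifier to predict the stimulus category from multichannel sensor patterns. The toy sketch below illustrates that general idea with a leave-one-out nearest-centroid decoder on synthetic "sensor" data; it is not the authors' MEG pipeline, and the channel count, trial count, and noise level are invented for the example.

```python
import numpy as np

# Three stimulus categories (face / manufactured object / natural item), each
# assumed to evoke a distinct multichannel pattern ("topography") plus noise.
rng = np.random.default_rng(1)
n_per_class, n_channels = 30, 20
labels = np.repeat([0, 1, 2], n_per_class)
topographies = rng.standard_normal((3, n_channels))            # class-specific patterns
X = topographies[labels] + 0.8 * rng.standard_normal((labels.size, n_channels))

# Leave-one-out nearest-centroid decoding: hold out each trial, compute class
# centroids from the remaining trials, classify by Euclidean distance.
correct = 0
for i in range(labels.size):
    mask = np.arange(labels.size) != i                         # hold out trial i
    centroids = np.stack([X[mask & (labels == c)].mean(axis=0) for c in range(3)])
    pred = np.argmin(np.linalg.norm(centroids - X[i], axis=1))
    correct += pred == labels[i]
accuracy = correct / labels.size                               # chance level = 1/3
```

With distinct class patterns, decoding accuracy is well above the 1/3 chance level; in real MEG analyses this is typically done time point by time point to track which stimulus dominates the delay period.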
Neural Activity During Social Signal Perception Correlates With Self-reported Empathy
Empathy is an important component of human relationships, yet the neural mechanisms that facilitate empathy are unclear. The broad construct of empathy incorporates both cognitive and affective components. Cognitive empathy includes mentalizing skills such as perspective-taking. Affective empathy consists of the affect produced in response to someone else's emotional state, a process which is facilitated by simulation or "mirroring." Prior evidence shows that mentalizing tasks engage a neural network which includes the temporoparietal junction, superior temporal sulcus, and medial prefrontal cortex. On the other hand, simulation tasks engage the fronto-parietal mirror neuron system (MNS), which includes the inferior frontal gyrus (IFG) and the somatosensory-related cortex (SRC). Here, we tested whether neural activity in these two neural networks was related to self-reports of cognitive and affective empathy in daily life. Participants viewed social scenes in which a shift in a character's direction of attention did or did not change the character's mental and emotional state. As expected, the task robustly activated both mentalizing and MNS networks. We found that when detecting the character's change in mental and emotional state, neural activity in both networks was strongly related to cognitive empathy. Specifically, neural activity in the IFG, SRC, and STS was related to cognitive empathy. Activity in the precentral gyrus was related to affective empathy. The findings suggest that both simulation and mentalizing networks contribute to multiple components of empathy.