    Representation of Time-Varying Stimuli by a Network Exhibiting Oscillations on a Faster Time Scale

    Sensory processing is associated with gamma frequency oscillations (30–80 Hz) in sensory cortices. This raises the question of whether gamma oscillations can be directly involved in the representation of time-varying stimuli, including stimuli whose time scale is longer than a gamma cycle. We are interested in the ability of the system to reliably distinguish different stimuli while being robust to stimulus variations such as uniform time-warp. We address this issue with a dynamical model of spiking neurons and study the response to an asymmetric sawtooth input current over a range of shape parameters. These parameters describe how fast the input current rises and falls in time. Our network consists of inhibitory and excitatory populations that are sufficient for generating oscillations in the gamma range. The oscillation period is about one-third of the stimulus duration. Embedded in this network is a subpopulation of excitatory cells that respond to the sawtooth stimulus and a subpopulation of cells that respond to an onset cue. The intrinsic gamma oscillations generate a temporally sparse code for the external stimuli. In this code, an excitatory cell may fire a single spike during a gamma cycle, depending on its tuning properties and on the temporal structure of the specific input; the identity of the stimulus is coded by the list of excitatory cells that fire during each cycle. We quantify the properties of this representation in a series of simulations and show that the sparseness of the code makes it robust to uniform warping of the time scale. We find that resetting of the oscillation phase at stimulus onset is important for a reliable representation of the stimulus and that there is a tradeoff between the resolution of the neural representation of the stimulus and robustness to time-warp.
Author Summary: Sensory processing of time-varying stimuli, such as speech, is associated with high-frequency oscillatory cortical activity, the functional significance of which is still unknown. One possibility is that the oscillations are part of a stimulus-encoding mechanism. Here, we investigate a computational model of such a mechanism, a spiking neuronal network whose intrinsic oscillations interact with external input (waveforms simulating short speech segments in a single acoustic frequency band) to encode stimuli that extend over a time interval longer than the oscillation's period. The network implements a temporally sparse encoding, whose robustness to time warping and neuronal noise we quantify. To our knowledge, this study is the first to demonstrate that a biophysically plausible model of oscillations occurring in the processing of auditory input may generate a representation of signals that span multiple oscillation cycles.
National Science Foundation (DMS-0211505); Burroughs Wellcome Fund; U.S. Air Force Office of Scientific Research

    Visual Working Memory Load-Related Changes in Neural Activity and Functional Connectivity

    BACKGROUND: Visual working memory (VWM) helps us store visual information to prepare for subsequent behavior. The neuronal mechanisms for sustaining coherent visual information and the mechanisms underlying the limited capacity of VWM have remained uncharacterized. Although numerous studies have utilized behavioral accuracy, neural activity, and connectivity to explore the mechanism of VWM retention, little is known about load-related changes in functional connectivity during hemi-field VWM retention. METHODOLOGY/PRINCIPAL FINDINGS: In this study, we recorded electroencephalography (EEG) from 14 normal young adults while they performed a bilateral visual field memory task. Subjects responded more rapidly and accurately in the left visual field (LVF) memory condition. The difference in mean amplitude between the ipsilateral and contralateral event-related potentials (ERPs) at parieto-occipital electrodes during the retention interval was obtained for six different memory loads. Functional connectivity between 128 scalp regions was measured by EEG phase synchronization in the theta (4-8 Hz), alpha (8-12 Hz), beta (12-32 Hz), and gamma (32-40 Hz) frequency bands. The resulting matrices were converted to graphs, and the mean degree, clustering coefficient, and shortest path length were computed as a function of memory load. The results showed that the brain networks in the theta, alpha, beta, and gamma bands were load-dependent and visual-field dependent. Theta- and alpha-band phase synchrony networks were more pronounced during the retention period for right visual field (RVF) WM than for LVF WM. Furthermore, only in the RVF memory condition was theta-band network density during the retention interval linked to delayed behavioral reaction times, and alpha-band network topology negatively correlated with behavioral accuracy.
CONCLUSIONS/SIGNIFICANCE: We suggest that the differences between LVF and RVF conditions in theta- and alpha-band functional connectivity and topological properties during the retention period may underlie the decline in behavioral performance in the RVF task.
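The pipeline this abstract describes (band-limited phase synchronization between channel pairs, then graph statistics on the thresholded connectivity matrix) can be illustrated in Python. This is a minimal sketch, not the authors' exact analysis; the phase-locking value, the fourth-order Butterworth filter, and the fixed binarization threshold are illustrative choices:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def plv(x, y, fs, band):
    """Phase-locking value of two signals within a frequency band (0..1)."""
    sos = butter(4, band, btype="band", fs=fs, output="sos")
    phase_x = np.angle(hilbert(sosfiltfilt(sos, x)))
    phase_y = np.angle(hilbert(sosfiltfilt(sos, y)))
    # Mean resultant length of the phase difference.
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

def mean_degree(conn, threshold):
    """Binarize a synchronization matrix at a threshold; return mean node degree."""
    adj = (conn > threshold).astype(int)
    np.fill_diagonal(adj, 0)  # ignore self-connections
    return adj.sum(axis=0).mean()
```

In the study's setting, `plv` would be evaluated for every pair of the 128 scalp regions in each band, and graph metrics such as mean degree computed per memory load.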

    Cross-Frequency Integration for Consonant and Vowel Identification in Bimodal Hearing

    Purpose: Improved speech recognition in binaurally combined acoustic–electric stimulation (otherwise known as bimodal hearing) could arise when listeners integrate speech cues from the acoustic and electric hearing. The aims of this study were (a) to identify speech cues extracted in electric hearing and residual acoustic hearing in the low-frequency region and (b) to investigate cochlear implant (CI) users' ability to integrate speech cues across frequencies. Method: Normal-hearing (NH) and CI subjects participated in consonant and vowel identification tasks. Each subject was tested in 3 listening conditions: CI alone (vocoder speech for NH), hearing aid (HA) alone (low-pass filtered speech for NH), and both. Integration ability for each subject was evaluated using a model of optimal integration—the PreLabeling integration model (Braida, 1991). Results: Only a few CI listeners demonstrated bimodal benefit for phoneme identification in quiet. Speech cues extracted from the CI and the HA were highly redundant for consonants but were complementary for vowels. CI listeners also exhibited reduced integration ability for both consonant and vowel identification compared with their NH counterparts. Conclusion: These findings suggest that reduced bimodal benefits in CI listeners are due to insufficient complementary speech cues across ears, a decrease in integration ability, or both.
National Organization for Hearing Research; National Institute on Deafness and Other Communication Disorders (U.S.) (Grant R03 DC009684-01); National Institute on Deafness and Other Communication Disorders (U.S.) (Grant R01 DC007152-02)

    Transfer entropy—a model-free measure of effective connectivity for the neurosciences

    Understanding causal relationships, or effective connectivity, between parts of the brain is of utmost importance because a large part of the brain’s activity is thought to be internally generated and, hence, quantifying stimulus-response relationships alone does not fully describe brain dynamics. Past efforts to determine effective connectivity have mostly relied on model-based approaches such as Granger causality or dynamic causal modeling. Transfer entropy (TE) is an alternative measure of effective connectivity based on information theory. TE does not require a model of the interaction and is inherently non-linear. We investigated the applicability of TE as a test for effective connectivity on electrophysiological data, using simulations and magnetoencephalography (MEG) recordings from a simple motor task. In particular, we demonstrate that TE improved the detectability of effective connectivity for non-linear interactions, and for sensor-level MEG signals, where linear methods are hampered by signal cross-talk due to volume conduction.
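As a rough illustration of the TE idea, a discrete-histogram estimator with history length 1 can be written in a few lines. This is a didactic sketch under simplifying assumptions (quantile binning, single-sample histories), not the estimator applied to the MEG data in the study:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=4):
    """Histogram estimate of transfer entropy from x to y (history length 1), in bits.

    TE = sum over (y_t1, y_t, x_t) of
         p(y_t1, y_t, x_t) * log2[ p(y_t1 | y_t, x_t) / p(y_t1 | y_t) ]
    """
    # Discretize each signal into (roughly) equiprobable bins via quantile edges.
    xe = np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1])
    ye = np.quantile(y, np.linspace(0, 1, bins + 1)[1:-1])
    xd, yd = np.digitize(x, xe), np.digitize(y, ye)
    y_next, y_past, x_past = yd[1:], yd[:-1], xd[:-1]
    n = len(y_next)
    c_nyx = Counter(zip(y_next, y_past, x_past))  # counts of (y_t1, y_t, x_t)
    c_yx = Counter(zip(y_past, x_past))           # counts of (y_t, x_t)
    c_ny = Counter(zip(y_next, y_past))           # counts of (y_t1, y_t)
    c_y = Counter(y_past)                         # counts of y_t
    te = 0.0
    for (yn, yp, xp), c in c_nyx.items():
        p_joint = c / n
        p_cond_full = c / c_yx[(yp, xp)]          # p(y_t1 | y_t, x_t)
        p_cond_past = c_ny[(yn, yp)] / c_y[yp]    # p(y_t1 | y_t)
        te += p_joint * np.log2(p_cond_full / p_cond_past)
    return te
```

Because the estimator is model-free, it detects the directed dependence even when the coupling is non-linear; practical applications require longer histories, embedding-parameter selection, and bias correction.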

    Dissociable Influences of Auditory Object vs. Spatial Attention on Visual System Oscillatory Activity

    Given that both auditory and visual systems have anatomically separate object identification (“what”) and spatial (“where”) pathways, it is of interest whether attention-driven cross-sensory modulations occur separately within these feature domains. Here, we investigated how auditory “what” vs. “where” attention tasks modulate activity in visual pathways using cortically constrained source estimates of magnetoencephalographic (MEG) oscillatory activity. In the absence of visual stimuli or tasks, subjects were presented with a sequence of auditory-stimulus pairs and instructed to selectively attend to phonetic (“what”) vs. spatial (“where”) aspects of these sounds, or to listen passively. To investigate sustained modulatory effects, oscillatory power was estimated from time periods between sound-pair presentations. In comparison to attention to sound locations, phonetic auditory attention was associated with stronger alpha (7–13 Hz) power in several visual areas (primary visual cortex; lingual, fusiform, and inferior temporal gyri; lateral occipital cortex), as well as in higher-order visual/multisensory areas including lateral/medial parietal and retrosplenial cortices. Region-of-interest (ROI) analyses of dynamic changes, from which the sustained effects had been removed, suggested further power increases during Attend Phoneme vs. Location centered in the alpha range 400–600 ms after the onset of the second sound of each stimulus pair. These results suggest distinct modulations of visual system oscillatory activity during auditory attention to sound object identity (“what”) vs. sound location (“where”). The alpha modulations could be interpreted to reflect enhanced crossmodal inhibition of feature-specific visual pathways and adjacent audiovisual association areas during “what” vs. “where” auditory attention.

    The relationship between self-awareness of attentional status, behavioral performance and oscillatory brain rhythms

    High-level cognitive factors, including self-awareness, are believed to play an important role in human visual perception. The principal aim of this study was to determine whether oscillatory brain rhythms play a role in the neural processes involved in self-monitoring attentional status. To do so we measured cortical activity using magnetoencephalography (MEG) and functional magnetic resonance imaging (fMRI) while participants were asked to self-monitor their internal status, only initiating the presentation of a stimulus when they perceived their attentional focus to be maximal. We employed a hierarchical Bayesian method that uses fMRI results as soft-constrained spatial information to solve the MEG inverse problem, allowing us to estimate cortical currents on the order of millimeters and milliseconds. Our results show that, during self-monitoring of internal status, there was a sustained decrease in power within the 7-13 Hz (alpha) range in the rostral cingulate motor area (rCMA) on the human medial wall, beginning approximately 430 msec after the trial start (p < 0.05, FDR corrected). We also show that gamma-band power (41-47 Hz) within this area was positively correlated with task performance from 40-640 msec after the trial start (r = 0.71, p < 0.05). We conclude: (1) the rCMA is involved in processes governing self-monitoring of internal status; and (2) the qualitative differences between alpha and gamma activity reflect their different roles in self-monitoring internal states. We suggest that alpha suppression may reflect a strengthening of top-down interareal connections, while the positive correlation between gamma activity and task performance indicates that gamma may play an important role in guiding visuomotor behavior. © 2013 Yamagishi et al.

    Covert Waking Brain Activity Reveals Instantaneous Sleep Depth

    The neural correlates of the wake-sleep continuum remain incompletely understood, limiting the development of adaptive drug delivery systems for promoting sleep maintenance. The most useful measure for resolving early positions along this continuum is the alpha oscillation, an 8–13 Hz electroencephalographic rhythm prominent over posterior scalp locations. As the brain activation signature of wakefulness, alpha expression discloses immediate levels of alertness and dissipates in concert with fading awareness as sleep begins. This brain activity pattern, however, is largely ignored once sleep begins. Here we show that the intensity of spectral power in the alpha band actually continues to disclose instantaneous responsiveness to noise—a measure of sleep depth—throughout a night of sleep. By systematically challenging sleep with realistic and varied acoustic disruption, we found that sleepers exhibited markedly greater sensitivity to sounds during moments of elevated alpha expression. This result demonstrates that alpha power is not a binary marker of the transition between sleep and wakefulness, but carries rich information about immediate sleep stability. Further, it shows that an empirical and ecologically relevant form of sleep depth is revealed in real time by EEG spectral content in the alpha band, a measure that affords prediction on the order of minutes. This signal, which transcends the boundaries of classical sleep stages, could potentially be used for real-time feedback to novel, adaptive drug delivery systems for inducing sleep.
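A windowed estimate of alpha-band spectral power, the quantity tracked through the night in this study, can be sketched as follows. The window length and Welch parameters below are illustrative assumptions, not the authors' settings:

```python
import numpy as np
from scipy.signal import welch

def alpha_power_timecourse(eeg, fs, win_s=30.0, band=(8, 13)):
    """Alpha-band power in consecutive non-overlapping windows of an EEG channel."""
    win = int(win_s * fs)
    powers = []
    for start in range(0, len(eeg) - win + 1, win):
        # Welch periodogram of one window, then integrate power over the band.
        f, pxx = welch(eeg[start:start + win], fs=fs,
                       nperseg=min(win, 4 * int(fs)))
        mask = (f >= band[0]) & (f <= band[1])
        powers.append(pxx[mask].sum() * (f[1] - f[0]))
    return np.array(powers)
```

A real-time variant would apply the same band-power computation to a sliding buffer, yielding the continuous sleep-depth signal the abstract describes.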

    Cross-frequency coupling of brain oscillations in studying motivation and emotion

    Research has shown that brain functions are realized by simultaneous oscillations in various frequency bands. In addition to examining oscillations in pre-specified bands, the interactions and relations between different frequency bands are another important aspect that needs to be considered in unraveling the workings of the human brain and its functions. In this review we provide evidence that studying interdependencies between brain oscillations may be a valuable approach to studying the electrophysiological processes associated with motivation and emotional states. Studies will be presented showing that amplitude-amplitude coupling between delta-alpha and delta-beta oscillations varies as a function of state anxiety and approach-avoidance-related motivation, and that changes in the association between delta and beta oscillations can be observed following successful psychotherapy. Together these studies suggest that cross-frequency coupling of brain oscillations may contribute to expanding our understanding of the neural processes underlying motivation and emotion.
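Amplitude-amplitude coupling of the kind discussed (e.g., delta-beta) is commonly quantified by correlating the Hilbert amplitude envelopes of the two band-passed signals. A minimal sketch, with illustrative band edges and filter settings rather than those of any particular study:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def band_envelope(sig, fs, lo, hi):
    """Amplitude envelope of a band-passed signal via the Hilbert transform."""
    sos = butter(4, (lo, hi), btype="band", fs=fs, output="sos")
    return np.abs(hilbert(sosfiltfilt(sos, sig)))

def amplitude_coupling(sig, fs, band1=(1, 4), band2=(13, 30)):
    """Amplitude-amplitude coupling: Pearson correlation of two band envelopes."""
    return np.corrcoef(band_envelope(sig, fs, *band1),
                       band_envelope(sig, fs, *band2))[0, 1]
```

A high value indicates that power in the two bands waxes and wanes together, the kind of delta-beta covariation the reviewed studies relate to anxiety and motivational state.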