Attention Drives Synchronization of Alpha and Beta Rhythms between Right Inferior Frontal and Primary Sensory Neocortex
The right inferior frontal cortex (rIFC) is specifically associated with attentional control via the inhibition of behaviorally irrelevant stimuli and motor responses. Similarly, recent evidence has shown that alpha (7–14 Hz) and beta (15–29 Hz) oscillations in primary sensory neocortical areas are enhanced in the representation of non-attended stimuli, leading to the hypothesis that allocation of these rhythms plays an active role in optimal inattention. Here, we tested the hypothesis that selective synchronization between rIFC and primary sensory neocortex occurs in these frequency bands during inattention. We used magnetoencephalography to investigate phase synchrony between primary somatosensory (SI) and rIFC regions during a cued-attention tactile detection task that required suppression of response to uncertain distractor stimuli. Attentional modulation of synchrony between SI and rIFC was found in both the alpha and beta frequency bands. This synchrony manifested as an increase in the alpha-band early after cue between non-attended SI representations and rIFC, and as a subsequent increase in beta-band synchrony closer to stimulus processing. Differences in phase synchrony were not found in several proximal control regions. These results are the first to reveal distinct interactions between primary sensory cortex and rIFC in humans and suggest that synchrony between rIFC and primary sensory representations plays a role in the inhibition of irrelevant sensory stimuli and motor responses.
Funding: National Institutes of Health (U.S.) (Grant P41RR14075); National Institutes of Health (U.S.) (Grant K25MH072941); National Institutes of Health (U.S.) (Grant K01AT003459); National Institutes of Health (U.S.) (Grant K24AT004095); National Institutes of Health (U.S.) (Grant RO1-NS045130-01); National Institutes of Health (U.S.) (Grant T32GM007484); National Science Foundation (U.S.) (Grant 0316933); National Science Foundation (U.S.), Graduate Research Fellowship Program (Grant DGE-1147470).
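The phase synchrony measure discussed here can be made concrete with a standard phase-locking value (PLV) computation. This is a generic sketch using NumPy and SciPy, not the authors' exact analysis pipeline; the signals are synthetic stand-ins for band-filtered MEG source time courses.

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """Phase-locking value between two signals: 0 = no phase synchrony, 1 = perfect."""
    phase_x = np.angle(hilbert(x))          # instantaneous phase via analytic signal
    phase_y = np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

# Two alpha-band (10 Hz) signals with a fixed phase lag are highly phase-locked.
t = np.linspace(0, 2, 1000, endpoint=False)
sync = phase_locking_value(np.sin(2 * np.pi * 10 * t),
                           np.sin(2 * np.pi * 10 * t + 0.5))   # close to 1

# Two independent noise signals show little phase locking.
rng = np.random.default_rng(0)
unsync = phase_locking_value(rng.standard_normal(1000),
                             rng.standard_normal(1000))        # much smaller
```

In practice the signals would first be band-pass filtered into the alpha or beta range, and the PLV computed across trials rather than across time.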
The Connectome Visualization Utility: Software for Visualization of Human Brain Networks
In analysis of the human connectome, the connectivity of the human brain is collected from multiple imaging modalities and analyzed using graph theoretical techniques. The dimensionality of human connectivity data is high, and making sense of the complex networks in connectomics requires sophisticated visualization and analysis software. The current availability of software packages to analyze the human connectome is limited. The Connectome Visualization Utility (CVU) is a new software package designed for the visualization and network analysis of human brain networks. CVU complements existing software packages by offering expanded interactive analysis and advanced visualization features, including automated visualization of networks in three different complementary styles and special visualization of scalar graph theoretical properties and modular structure. By decoupling the process of network creation from network visualization and analysis, we ensure that CVU can visualize networks from any imaging modality. CVU offers a graphical user interface and interactive scripting, and represents data using transparent neuroimaging and matrix-based file formats rather than opaque application-specific file formats.
Effect of ROI order on visualization.
<p>A) Matrix and circle views for a single-subject matrix constructed from correlations between BOLD signals, using an unprincipled ordering with ROIs ordered alphabetically by region name. Stronger connections are shown in orange while weaker connections are shown in blue. Each node's anatomical grouping is given by the colors on the sides of the matrix. B) The same network shown using a principled anatomical ordering, beginning in medial frontal cortex, wrapping around parietal and occipital cortex, and ending at the temporal pole. When the principled ordering is used, gains are especially seen in the circle view, as short-range connections are grouped together to produce less visual clutter. Stronger connections are shown in dark red while weaker connections are shown in yellow. The colors on the circumference represent the same node identities as in the matrices.</p>
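The point of the figure, that reordering ROIs changes only the presentation of the network and not its content, can be sketched with plain NumPy. The time series are synthetic and the ordering is a hypothetical permutation; this is not CVU code.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy "BOLD" time series: 6 ROIs x 200 time points (synthetic, for illustration).
ts = rng.standard_normal((6, 200))
corr = np.corrcoef(ts)                 # ROI-by-ROI correlation matrix

# A "principled" ordering is just a permutation of ROI indices; applying it
# to both rows and columns relabels the display without changing the network.
order = np.array([2, 0, 4, 1, 5, 3])   # hypothetical anatomical ordering
reordered = corr[np.ix_(order, order)]
```

Every edge weight survives the permutation; for example, the connection between ROIs 2 and 0 in the original matrix appears at position (0, 1) in the reordered one.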
Basic workflow to create visualizations in CVU.
<p>The cortical surface is parcellated into ROIs, while connectivity matrices can be constructed from a variety of imaging modalities. The parcellation and matrix are both loaded into CVU via the graphical user interface, shown at the bottom.</p>
Visualizing scalar data in CVU.
<p>Node strength (weighted degree) is shown for the thresholded networks in <a href="http://www.plosone.org/article/info:doi/10.1371/journal.pone.0113838#pone-0113838-g002" target="_blank">Figure 2</a>. Node strength corresponds to the size and color of each node: large dark green spheres mark high-strength nodes, while small light green spheres mark lower-strength nodes.</p>
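Node strength as visualized here is simply the weighted degree: the sum of connection weights incident to each node, i.e. a row sum of the weighted adjacency matrix. A minimal sketch with toy weights (not CVU's API):

```python
import numpy as np

# Symmetric weighted adjacency matrix for a toy 4-node network
# (hypothetical weights, for illustration only).
W = np.array([[0.0, 0.8, 0.2, 0.0],
              [0.8, 0.0, 0.5, 0.1],
              [0.2, 0.5, 0.0, 0.4],
              [0.0, 0.1, 0.4, 0.0]])

# Node strength (weighted degree): sum of connection weights at each node.
strength = W.sum(axis=1)   # [1.0, 1.4, 1.1, 0.5]
```

In a thresholded network, weights below the threshold would be zeroed before the sum, so node strength reflects only the retained connections.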
Visualizing networks in CVU.
<p>A) Clicking on a node in the network isolates it, displaying only the connections to that node. Here, thresholded connectivity to a rostral middle frontal ROI is displayed. B) Networks from a single subject, constructed from three different imaging modalities. Topological differences between the three networks are immediately apparent upon visualization. Stronger connections are shown in red while weaker connections are shown in yellow.</p>
nipy/PySurfer: Version 0.7
- Support for Python 3.3 and up.
- A new alpha keyword to the Brain constructor now controls the opacity of the rendered brain surface.
- The curv keyword to the Brain constructor has been deprecated. To replicate the previous behavior when curv was set to True, simply omit the curv keyword; to replicate the previous behavior when curv was set to False, set the cortex keyword to None. To ease the transition, the curv argument will still be caught and processed, but it will be removed in a future release.
- The cortex keyword to the Brain constructor now also accepts a valid color specification (such as a 3-tuple of RGB values or a color name) to render the cortical surface in that color without rendering binary curvature values. It also accepts a dictionary of keyword arguments that are passed on to the call to mlab.pipeline.surface.
- Brain.save_movie now uses the imageio library, eliminating the need to manually install ffmpeg. imageio has been added as an optional dependency, which can be installed with pip install pysurfer[save_movie].
- Brain.save_image now has the option to save with an alpha channel and antialiasing.
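The constructor and screenshot changes above can be combined in a short usage sketch. This assumes a local FreeSurfer subjects directory containing the standard fsaverage subject, and it needs a working Mayavi display, so it will not run headless:

```python
from surfer import Brain

# Semi-transparent surface via the new alpha keyword, and a plain cortical
# color (instead of binary curvature shading) via the cortex keyword.
brain = Brain("fsaverage", "lh", "inflated",
              alpha=0.5,        # new in 0.7: opacity of the rendered surface
              cortex="ivory")   # new in 0.7: accepts a color specification

# After adding time-varying data (e.g. brain.add_data(..., time=...)),
# brain.save_movie(...) writes the animation via imageio, with no manual
# ffmpeg install required.

# save_image can now keep the alpha channel and antialias the output.
brain.save_image("surface.png", mode="rgba", antialiased=True)
```

Passing cortex=None here would render the surface without any curvature shading, matching the old curv=False behavior described above.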