
    Comparison of the ICA and voxel-wise results.

<p>Voxel-wise results showing significant correlation (p < 0.001) with the auditory (A) and visual (B) stimulus models are shown in green, and ICs in red and blue. Yellow and cyan indicate areas where the voxel-wise results overlap with one of the ICs, and white indicates areas where they overlap with both ICs. The total area of the cortical surface covered by the auditory (A) and visual (B) ICs is outlined in white on a black background.</p>

    The four ICs that were sensitive to visual features in the movie.

<p>From left to right are shown: 1) the normalized weights used to fit the visual feature model to IC activity, 2) the activation patterns of the ICs plotted on the cortical surface, 3) the mean IC time courses (dark gray), the 95% confidence interval of the mean (light gray), and the fitted annotation time courses (black), and 4) the coefficient of determination (R<sup>2</sup>) of the fitted model with the mean IC time course. IC3, located in the occipital pole, received high weights for most visual categories, particularly for contrast edges and for mechanical, global, and body motion. IC4, located in the posterior temporal lobe and lateral occipital lobe and overlapping the motion-sensitive visual area V5, also received high weights for most visual categories, especially for hand motions. IC5 is located in the lateral occipital lobe and posterior parietal areas, and IC6 encompasses a network including the intraparietal sulcus (IPS), frontal eye fields (FEF), ventral premotor cortex (PMC), and inferior temporo-occipital junction (TOJ). IC6 predominantly correlated with the occurrence of hand and mechanical movements, whereas IC5 additionally showed a preference for body motion.</p>
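The fit described in this caption, regressing annotated feature time courses against a component's mean time course and reporting R<sup>2</sup>, can be sketched as below. This is a minimal illustration using ordinary least squares; the function and variable names are illustrative and not taken from the paper's analysis code.

```python
import numpy as np

def fit_feature_model(ic_timecourse, features):
    """Least-squares fit of annotated stimulus features to an IC time course.

    ic_timecourse : (T,) mean IC time course
    features      : (T, K) matrix, one column per annotated feature
    Returns the feature weights, the fitted time course, and R^2.
    """
    # Add an intercept column so the fit is not forced through zero.
    X = np.column_stack([np.ones(len(ic_timecourse)), features])
    beta, *_ = np.linalg.lstsq(X, ic_timecourse, rcond=None)
    fitted = X @ beta
    ss_res = np.sum((ic_timecourse - fitted) ** 2)
    ss_tot = np.sum((ic_timecourse - ic_timecourse.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    return beta[1:], fitted, r2
```

With noiseless synthetic data the recovered weights match the generating coefficients and R<sup>2</sup> approaches 1; on real IC time courses R<sup>2</sup> quantifies how much of the component's temporal behavior the annotation model explains.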

    The two ICs that were found to be sensitive to auditory features in the movie.

<p>IC1 (top) encompassed bilaterally the auditory cortex (AC), superior temporal gyrus (STG), middle temporal gyrus (MTG), and relatively small activation foci in or in the vicinity of the lip representation in the primary motor cortex. The normalized weights used to fit the auditory feature model to IC activity are shown in the upper left corner. The time course of the fitted stimulus model (black), the mean time course (dark gray), and the 95% confidence interval (light gray) of the IC are overlaid below. R<sup>2</sup> indicates the coefficient of determination between the stimulus model and the IC’s temporal behavior. Vertical bars show time intervals when there is speech (red), singing (yellow), and music (blue) in the soundtrack. IC2 (bottom) includes the MTG and inferior frontal gyrus/pars triangularis (ptIFG) in both hemispheres as well as the left temporoparietal junction (TPJ), the left premotor cortex (PMC) anterior to the motor cortex cluster of IC1, and the supplementary motor area (SMA) bilaterally. A–D indicate examples of instances at which activation is not explained by the auditory model, while E–I highlight moments containing speech that show peaks in brain activity. Activity patterns during these instances are shown in <a href="http://www.plosone.org/article/info:doi/10.1371/journal.pone.0035215#pone-0035215-g006" target="_blank">Figure 6</a>.</p>

    Comparison of overlapping <i>vs.</i> non-overlapping ROIs in ICA and voxel-wise GLM results.

<p>A: Comparison of the speech-sensitive IC2 with areas correlated with the auditory model in the voxel-wise analysis. LEFT: Areas where the ICA and voxel-wise results differed are color-coded red–yellow, and overlapping areas are green. The pair-wise correlation matrix of the mean time courses of each ROI is presented on a grey background, where the brightness of the grey shade corresponds to the correlation coefficient. Asterisks indicate significant correlations. RIGHT: The correlation coefficients of the auditory model with each ROI’s time course. Color coding corresponds to the colors on the brain images. All ROIs correlated strongly with each other, but only the ROIs that were present in both the ICA and voxel-wise results are significantly correlated with the auditory model. B: Comparison of the motion-sensitive IC4 with the voxel-wise results. Details as in A; iTO refers to the inferior temporo-occipital ROI. Non-overlapping ROIs are not significantly correlated with the stimulus model, but all ROIs are significantly correlated with the activity of the strongest clusters of IC4, centered on area V5.</p>
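The two quantities shown in this figure, the ROI-by-ROI correlation matrix and each ROI's correlation with the stimulus model, can be computed along these lines. This is a sketch with illustrative names; the paper's significance testing is not reproduced here.

```python
import numpy as np

def roi_model_correlations(timecourses, stimulus_model):
    """Correlation structure of ROI mean time courses and each ROI's
    correlation with a fitted stimulus model time course.

    timecourses    : (n_roi, T) array, one mean time course per ROI
    stimulus_model : (T,) stimulus model time course
    Returns the (n_roi, n_roi) pairwise correlation matrix and an
    (n_roi,) vector of ROI-vs-model correlations.
    """
    roi_corr = np.corrcoef(timecourses)  # pairwise ROI x ROI matrix
    # z-score both sides, then correlation is a mean of products
    z = (stimulus_model - stimulus_model.mean()) / stimulus_model.std()
    tc = timecourses - timecourses.mean(axis=1, keepdims=True)
    tc /= tc.std(axis=1, keepdims=True)
    model_corr = tc @ z / len(stimulus_model)
    return roi_corr, model_corr
```

Plotting `roi_corr` as a grey-scale image and `model_corr` as a bar graph reproduces the layout described in the caption.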

    Areas showing significant correlations with single auditory and visual features.

<p>The color coding is as in <a href="http://www.plosone.org/article/info:doi/10.1371/journal.pone.0035215#pone-0035215-g008" target="_blank">Figure 8</a>. Results are thresholded at p < 0.001. A: Speech explains activity in the superior temporal sulcus (STS) and middle temporal gyrus (MTG), the lip representation area of the primary motor cortex (M1L) and ptIFG particularly in the left hemisphere, and the dorsomedial PFC. Partially overlapping areas also show activity correlated with music. RMS energy explains activity in the superior temporal areas, particularly in a part of the posterior bank of Heschl’s gyrus (in or in the vicinity of the primary auditory cortex) and/or the planum temporale. Other sound categories are not significantly correlated with activity in this region. The black outline indicates the area encompassing the Heschl’s gyrus (HG) of all subjects, visually identified from the standardized structural images. Bar graphs show the correlation coefficients for each stimulus feature in the non-overlapping areas. Colors of the bars refer to the brain areas best activated by one of the three stimulus features. B: Hand motion strongly activates the IPS and TOJ. The areas are very similar to those included in IC6 (see <a href="http://www.plosone.org/article/info:doi/10.1371/journal.pone.0035215#pone-0035215-g007" target="_blank">Figure 7</a>). The superior occipital cortex (sOC), occipital pole (OP), and parts of the lateral occipital cortex (lOC) show activity specific to body motion. Activity in the occipital pole also correlated with the contrast edges of the image.</p>

    Annotation of visual and auditory features of a film.

<p>A: The presence of speech, lead singing, background singing, and music was annotated manually from the soundtrack. The zero-crossing rate, spectral spread, entropy, and RMS energy sound features were extracted automatically. B: Spatial high-pass filtering was used to extract high spatial frequencies from the image in order to quantify the overall complexity of the image. For printing, the contrast of the high-pass-filtered image was increased to make the features visible. (Still images courtesy of Aki Kaurismäki and Sputnik Oy.) C: Scoring of the size of body parts/objects followed the shot-size convention used in cinema (long shots = 0, medium/medium close-up shots = 1, and close-up shots = 2). D: The extent of motion was scored on a three-step scale (no motion = 0, intermediate motion = 1, large motion = 2). The overall motion score was calculated as the sum of the shot-size and motion-strength scores for those time points where motion was present.</p>
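Two of the automatically extracted sound features (RMS energy and zero-crossing rate) and the overall motion score defined above can be sketched as follows. This is a minimal frame-wise implementation with illustrative names, not the extraction pipeline actually used in the study.

```python
import numpy as np

def frame_features(signal, frame_len):
    """Frame-wise RMS energy and zero-crossing rate of an audio signal.

    signal    : (N,) mono audio samples
    frame_len : samples per analysis frame
    """
    n_frames = len(signal) // frame_len
    frames = signal[:n_frames * frame_len].reshape(n_frames, frame_len)
    rms = np.sqrt(np.mean(frames ** 2, axis=1))
    # Fraction of consecutive-sample pairs whose sign changes.
    zcr = np.mean(np.abs(np.diff(np.sign(frames), axis=1)) > 0, axis=1)
    return rms, zcr

def overall_motion_score(shot_size, motion_strength):
    """Sum of shot-size and motion-strength scores wherever motion occurs;
    zero at time points with no motion, as described in panel D."""
    shot_size = np.asarray(shot_size)
    motion_strength = np.asarray(motion_strength)
    return np.where(motion_strength > 0, shot_size + motion_strength, 0)
```

For example, a close-up shot (2) with intermediate motion (1) yields an overall score of 3, while any shot with no motion scores 0.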

    Beta weights of IC1 and IC2 in single-feature GLM analysis.

<p>Asterisks indicate when the weights differ significantly from zero or from each other. The tuning of IC1 (positive weights) is clearly shallower than that of IC2.</p>

    Pair-wise correlations of annotated stimulus features.

<p>A: Correlation matrix of all auditory and visual features. The vertical bar on the left shows the grey-scale code of the correlation coefficients. B: Histogram showing the distribution of the coefficients.</p>
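The matrix in panel A and the histogram in panel B summarize the same pairwise coefficients, and can be produced as sketched below. The function name and binning are illustrative assumptions, not details from the paper.

```python
import numpy as np

def feature_correlation_summary(features, bins=10):
    """Correlation matrix of annotated stimulus features and a histogram
    of the unique pairwise coefficients (upper triangle only, so each
    feature pair is counted once).

    features : (n_features, T) array, one annotated time course per row
    """
    r = np.corrcoef(features)
    iu = np.triu_indices_from(r, k=1)          # unique off-diagonal pairs
    counts, edges = np.histogram(r[iu], bins=bins, range=(-1, 1))
    return r, counts, edges
```

Restricting the histogram to the upper triangle avoids double-counting each pair and excludes the trivial diagonal of ones.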

    Performance in CPT and the episodic memory tasks.

EPELI (Executive Performance in Everyday LIving) is a recently developed gaming tool for the objective assessment of goal-directed behavior and prospective memory (PM) in everyday contexts. This pre-registered study examined the psychometric features of a new EPELI adult online version, modified from the original child version and further developed for self-administered web-based testing at home. A sample of 255 healthy adults completed EPELI, in which their task was to perform household chores instructed by a virtual character. The participants also filled out PM-related questionnaires and a diary, and performed two conventional PM tasks and an intelligence test. We expected that the more “life-like” EPELI task would show stronger associations with conventional PM questionnaires and diary-based everyday PM reports than traditional PM tasks would. This hypothesis did not receive support. Although EPELI was rated as more similar to everyday tasks, performance in it was not associated with the questionnaires or the diary. However, there were associations between time-monitoring behavior in EPELI and the traditional PM tasks. Taken together, the online adult EPELI was found to be a reliable method with high ecological face validity, but its convergent validity requires further research.