
    Behavioral results: Experiments 1 and 2.

    <p>Behavioral accuracies averaged across 14 participants in the 3°, 9°, and 15° conditions for the blocked (A) and event-related (B) designs in Experiment 1, and across 10 participants in the 3°, 9°, 15°, and 80° conditions for the blocked (C) and event-related (D) designs in Experiment 2. Error bars denote ±1 SEM here and in all subsequent figures.</p>

    The 42 Talairach coordinates for ROI analysis in the higher-order cortex.

    <p>The 42 Talairach coordinates for ROI analysis in the higher-order cortex.</p>

    Effect of task difficulty on blood-oxygen-level-dependent signal: A functional magnetic resonance imaging study in a motion discrimination task

    <div><p>There is considerable evidence that neural activity in the human brain is modulated by task difficulty, particularly in visual, frontal, and parietal cortices. However, some basic psychophysical tasks in visual perception do not give rise to this expected effect, at least not in the visual cortex. In the current study, we used functional magnetic resonance imaging (fMRI) to record brain activity while systematically manipulating task difficulty in a motion discrimination task by varying the angular difference between the motion direction of random dots and a reference direction. We used both a blocked and an event-related design, and presented stimuli in both central and peripheral vision. The behavioral psychometric function, across angular differences of 3°, 9°, 15°, and 80°, spanned the full response range, as expected. The mean blood-oxygen-level-dependent (BOLD) signals were also correlated within participants between the blocked and event-related designs, across all brain areas tested. Within the visual cortex, voxel response patterns correlated more strongly within conditions (e.g., 3° and 3°) than between conditions (e.g., 3° and 9°) in both designs, further attesting to the quality of the BOLD data. Nevertheless, the BOLD-o-metric functions (i.e., BOLD activity as a function of task difficulty) were flat in both the whole-brain and region-of-interest (ROI) analyses, including in the visual and parietal cortices, in both designs, and in foveal and peripheral visual fields alike. Indeed, there was little difference between BOLD activity in the 3° and 80° conditions. Suggestive evidence of difficulty modulation was found only in the superior and inferior frontal gyri in the blocked design. We conclude that, in motion discrimination, no systematic BOLD modulation accompanies the standard psychometric function across the cortical hierarchy, except in the frontal lobe.</p></div>
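The within- versus between-condition pattern-correlation check described in the abstract can be illustrated with a minimal sketch. This is not the authors' analysis pipeline; it uses simulated voxel patterns (the array sizes, noise level, and variable names are assumptions for illustration) to show why condition-specific signal makes within-condition correlations exceed between-condition correlations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulated data: each condition (e.g., 3 deg vs. 9 deg motion)
# evokes a fixed underlying voxel pattern; each run adds independent noise.
n_voxels = 200
signal_3deg = rng.normal(size=n_voxels)
signal_9deg = rng.normal(size=n_voxels)

def noisy_measurement(signal):
    """One run's noisy estimate of an underlying voxel pattern."""
    return signal + rng.normal(scale=1.0, size=n_voxels)

run1_3deg = noisy_measurement(signal_3deg)
run2_3deg = noisy_measurement(signal_3deg)
run1_9deg = noisy_measurement(signal_9deg)

# Pearson correlation between pattern pairs.
within = np.corrcoef(run1_3deg, run2_3deg)[0, 1]   # same condition
between = np.corrcoef(run1_3deg, run1_9deg)[0, 1]  # different conditions

# Shared signal should push the within-condition correlation above
# the between-condition one, which is the signature of reliable,
# condition-specific patterns reported in the abstract.
print(within > between)
```

With signal and noise of comparable variance, the within-condition correlation sits near 0.5 while the between-condition correlation hovers near zero, which is the pattern the MVPA result reflects.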

    MVPA results.

    <p>The correlation between activity patterns of motion stimuli within the same condition was larger than that between different conditions in Experiment 1 (A) and in Experiment 2 (B).</p>

    MRI results of ROI analysis: Experiment 2.

    <p>BOLD amplitudes averaged across 10 participants for the 3°, 9°, 15°, and 80° conditions in V1, V2, V3, V3a, V4, MT+, and IPS for the blocked design (A) and the event-related design (B).</p>

    Data_Sheet_1_Effect of facial emotion recognition learning transfers across emotions.docx

    <p>Introduction: Perceptual learning of facial expressions has been shown to be specific to the trained expression, indicating separate encoding of the emotional content of different expressions. However, little is known about the specificity of emotion recognition training with the visual search paradigm, or about the sensitivity of learning to near-threshold stimuli. Methods: In the present study, we adopted a visual search paradigm to measure the recognition of facial expressions. In Experiments 1, 2, and 3 (Exp1, Exp2, Exp3), subjects were trained for 8 days to search for a target expression in an array of faces presented for 950 ms, 350 ms, and 50 ms, respectively. In Experiment 4 (Exp4), we trained subjects to search for a triangle target and then tested them on the facial expression search task. Before and after training, subjects were tested on the trained and untrained facial expressions, presented for 950 ms, 650 ms, 350 ms, or 50 ms. Results: Training led to large improvements in the recognition of facial emotions only when the faces were presented long enough (Exp1: 85.89%; Exp2: 46.05%). Furthermore, the training effect transferred to the untrained expression. However, when the faces were presented briefly (Exp3), the training effect was small (6.38%). In Exp4, the training effect did not transfer across categories. Discussion: Our findings reveal cross-emotion transfer of facial expression recognition training in a visual search task. In addition, learning hardly affects the recognition of near-threshold expressions.</p>