
    Brain regions significantly activated in the slow tone counting – fast tone counting contrast.

    All reported peaks significant at p<0.05, whole-brain corrected (FDR) threshold. † = part of a cluster of 8779 voxels.

    hard SART – easy SART contrast, thresholded at pFDR<.05, whole-brain corrected.


    slow tone counting – fast tone counting, thresholded at pFDR<.05, whole-brain corrected.


    Areas in which both the hard – easy SART contrast and the slow – fast tone counting contrast were significantly active at pFDR<.05, whole-brain corrected.


    Crossed-uncrossed reaction time differences of visual and auditory modalities in musicians and non-musicians.

    CUDs in milliseconds (ms) were calculated by subtracting the mean reaction times of the uncrossed condition from the mean reaction times of the crossed condition. The interaction between Musicianship and Modality is not significant (p = 0.22).
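    As a worked illustration of that subtraction, here is a minimal Python sketch; the trial values and variable names are invented for illustration and are not the study's data:

        import numpy as np

        # Hypothetical per-trial reaction times in ms (values are invented).
        crossed = np.array([312.0, 298.5, 305.2, 321.7, 309.9])    # responding hand opposite the stimulated side
        uncrossed = np.array([301.1, 290.4, 299.8, 310.3, 296.6])  # responding hand on the stimulated side

        # CUD = mean crossed RT minus mean uncrossed RT, as in the caption.
        cud = crossed.mean() - uncrossed.mean()
        print(f"CUD = {cud:.1f} ms")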

    What can we learn about beat perception by comparing brain signals and stimulus envelopes? - Fig 2

    For each of the 5 stimulus patterns used in [44] (A), frequency-domain amplitudes at beat-related frequencies varied as a function of (B) tone duration (onset/offset ramp duration was fixed at 10 ms) and (C) onset/offset ramp duration (tone duration was fixed at 200 ms), but the functions were different for each frequency and each rhythm. See Supporting Information S1 Fig for all tested combinations of tone duration and onset/offset ramp duration for all stimulus patterns.

    Comparison of envelopes (top) and stimulus spectra (bottom) obtained by using Matlab’s Hilbert function (left) or the MIR-implemented Hilbert transform, shown here for “Pattern 3” (from [44]).

    Since the MIR toolbox makes use of time-domain filtering, envelopes are smooth and frequency spectra differ from those obtained from the Matlab Hilbert transform. The most obvious discrepancy is at 2.5 Hz, where there is no energy in the spectrum obtained using the Matlab Hilbert function.
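    To make the distinction concrete, the Python sketch below computes both kinds of envelope for a synthetic gated tone and compares their spectra at 2.5 Hz. The stimulus, sampling rate, and 20-Hz low-pass cutoff are illustrative assumptions; the rectify-and-filter envelope only stands in for the MIR toolbox's time-domain filtering and is not its actual implementation, nor is the tone the paper's "Pattern 3":

        import numpy as np
        from scipy.signal import hilbert, butter, filtfilt

        fs = 1000                      # sampling rate (Hz); illustrative value
        t = np.arange(0, 8.0, 1 / fs)  # 8 s of signal

        # Hypothetical stimulus: 100-ms tone bursts repeating at 2.5 Hz on a 440-Hz carrier.
        gate = ((t % 0.4) < 0.1).astype(float)
        x = gate * np.sin(2 * np.pi * 440 * t)

        # Envelope 1: magnitude of the analytic signal (scipy's counterpart to a
        # Matlab hilbert-based envelope).
        env_hilbert = np.abs(hilbert(x))

        # Envelope 2: rectify, then low-pass filter in the time domain (a rough
        # stand-in for a filtering-based envelope follower).
        b, a = butter(4, 20 / (fs / 2))  # 4th-order, 20-Hz cutoff; both illustrative
        env_filtered = filtfilt(b, a, np.abs(x))

        # Compare amplitude spectra of the two envelopes at the 2.5-Hz beat rate.
        freqs = np.fft.rfftfreq(len(t), 1 / fs)
        for name, env in (("hilbert", env_hilbert), ("filtered", env_filtered)):
            spec = np.abs(np.fft.rfft(env - env.mean())) / len(t)
            idx = np.argmin(np.abs(freqs - 2.5))
            print(f"{name}: amplitude at 2.5 Hz = {spec[idx]:.4f}")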

    Beat perception differs for rhythms with identical frequency-domain representations.

    A. By rotating simple rhythms (i.e., playing them starting from a different point in the sequence), we created two versions of each rhythm with numerically identical frequency-domain representations. B. Beat strength ratings differed significantly between original and rotated rhythms. Individual participant data are shown in gray, and mean data are overlaid in black.
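    The claim of identical frequency-domain representations follows from the DFT shift theorem: rotating a sequence (a circular shift in time) changes only the phase spectrum, leaving the amplitude spectrum untouched. A minimal Python check, using an invented binary rhythm rather than the study's stimuli:

        import numpy as np

        # Hypothetical binary rhythm (1 = tone onset, 0 = silence); the pattern is invented.
        rhythm = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 0, 1, 0], dtype=float)

        # "Rotating" the rhythm = starting playback from a different point,
        # i.e., a circular shift in time.
        rotated = np.roll(rhythm, 5)

        # A circular shift multiplies each DFT coefficient by a unit-magnitude
        # phase factor, so the amplitude spectra are numerically identical.
        print(np.allclose(np.abs(np.fft.fft(rhythm)),
                          np.abs(np.fft.fft(rotated))))  # True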

    Examples of the acoustic manipulations applied to one representative rhythm analyzed in Experiment 1.

    The onset structure of the original rhythm (left column) was preserved. Tone duration (middle column) and onset/offset ramp duration (right column) were parametrically varied. After obtaining the amplitude envelopes (middle row) of the stimulus waveforms (top row) via a Hilbert transform, the envelopes were transformed to stimulus spectra in the frequency domain using an FFT (bottom row). Arrows mark the beat-related frequencies 0.416 Hz (1:12), 1.25 Hz (1:4), 2.5 Hz (1:2), and 5 Hz (1:1).
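    The envelope-to-spectrum pipeline in this caption (waveform to Hilbert envelope to FFT to amplitudes at beat-related frequencies) can be sketched in a few lines of Python. The synthetic rhythm, sampling rate, and durations below are illustrative assumptions, not the paper's stimuli:

        import numpy as np
        from scipy.signal import hilbert

        fs = 1000                      # sampling rate (Hz); illustrative
        t = np.arange(0, 9.6, 1 / fs)  # 9.6 s, a multiple of 2.4 s, so 0.416 Hz falls on an FFT bin

        # Hypothetical rhythm: 200-ms tones on a 400-ms grid, with the last tone
        # of every 2.4-s cycle silenced to give the pattern rhythmic structure.
        grid = (t % 0.4) < 0.2
        drop = (t % 2.4) >= 2.0
        x = (grid & ~drop).astype(float) * np.sin(2 * np.pi * 440 * t)

        # Step 1: amplitude envelope = magnitude of the analytic (Hilbert) signal.
        env = np.abs(hilbert(x))

        # Step 2: FFT of the envelope yields the stimulus spectrum.
        freqs = np.fft.rfftfreq(len(t), 1 / fs)
        spec = np.abs(np.fft.rfft(env - env.mean())) / len(t)

        # Step 3: read out the beat-related frequencies named in the caption.
        for f_beat in (1 / 2.4, 1.25, 2.5, 5.0):  # 0.416, 1.25, 2.5, 5 Hz
            idx = np.argmin(np.abs(freqs - f_beat))
            print(f"{freqs[idx]:.3f} Hz: amplitude = {spec[idx]:.4f}")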

    Supplementary Material for the paper "Tempo Estimation from the EEG signal during Perception and Imagination of Music"

    Supplementary material for the paper: Avital Sternin, Sebastian Stober, Jessica A. Grahn & Adrian M. Owen, "Tempo Estimation from the EEG Signal during Perception and Imagination of Music." In: 1st International Workshop on Brain-Computer Music Interfacing / 11th International Symposium on Computer Music Multidisciplinary Research (BCMI/CMMR'15), 2015.