18 research outputs found

    Implicit Association Effects between Sound and Food Images: Supplementary Material

    No full text
    A growing body of empirical research documents several interesting crossmodal correspondences between auditory and gustatory/flavor stimuli, demonstrating that people can match specific acoustic and musical parameters with different tastes and flavors. In this context, a number of researchers and musicians have arranged their own soundtracks to match specific tastes and used them for research purposes, revealing explicit crossmodal effects on judgments of comparative taste intensity or of taste/sound accordance. However, only a few studies have examined implicit associations related to taste–sound correspondences. The present study was therefore conducted to assess possible implicit effects associated with the crossmodal congruency/incongruency between auditory cues and food images during the classification of food tastes. To test our hypothesis, we used ‘salty’ and ‘sweet’ soundtracks with salty and sweet food images, and asked 88 participants to classify the taste of each food image while listening to the soundtracks. We found that sweet food images were classified faster than salty food images, regardless of which soundtrack was presented. Moreover, we found a congruency effect, demonstrating that such soundtracks are effective in eliciting facilitating effects on taste-quality classification with congruent food images.

    Means and standard errors in the two AP tests (standard and variant).

    No full text
    <p>Dependent variable is the Type of error (SAME, DIFFERENT) measure for the intervals of interest (see <a href="http://www.plosone.org/article/info:doi/10.1371/journal.pone.0006327#pone-0006327-t001" target="_blank">Table 1</a>).</p>

    Percent errors for the left and right ear in the intervals of interest (see Tab. 1) observed in the two groups in the dichotic test.

    No full text
    <p>Percent errors for the left and right ear in the intervals of interest (see <a href="http://www.plosone.org/article/info:doi/10.1371/journal.pone.0006327#pone-0006327-t001" target="_blank">Tab. 1</a>) observed in the two groups in the dichotic test.</p>

    Means and standard errors in the dichotic test for the left and right ears.

    No full text
    <p>Dependent variable is the Type of error (SAME, DIFFERENT) considering only the intervals of interest (see <a href="http://www.plosone.org/article/info:doi/10.1371/journal.pone.0006327#pone-0006327-t001" target="_blank">Table 1</a>).</p>

    Percent errors in the intervals of interest (see Tab. 1) observed in the two groups in the standard and variant tests.

    No full text
    <p>Percent errors in the intervals of interest (see <a href="http://www.plosone.org/article/info:doi/10.1371/journal.pone.0006327#pone-0006327-t001" target="_blank">Tab. 1</a>) observed in the two groups in the standard and variant tests.</p>

    Subjects gave their response by clicking with the mouse on the corresponding note presented as shown here in the standard AP test (top) and in the variant AP test (bottom).

    No full text
    <p>Subjects gave their response by clicking with the mouse on the corresponding note presented as shown here in the standard AP test (top) and in the variant AP test (bottom).</p>

    Significant statistical results.

    No full text
    <p>Statistical results on the activity strength in the visual cortex, superior parietal lobe (SPL), inferior parietal lobe (IPL) and middle frontal gyrus (MFG) for selected temporal intervals. The most important statistical findings are reported in bold.</p>

    MEG results.

    No full text
    <p>A: Mean normalized intensity across all conditions and subjects over time for selected ROIs. Vertical bars indicate the temporal intervals with the highest intensity values as established by statistical analysis. The horizontal line indicates the cut-off value (mean+sd). B: Group mean temporal activity was averaged across all conditions for areas showing the strongest activity after the matching stimulus. Coloured bars indicate temporal intervals determined previously from statistical analysis and used in the following statistical analyses to compare source activity across conditions (50–150 ms for visual cortex, 50–400 ms for superior parietal lobe, 50–550 ms for inferior parietal lobe and 350–1000 ms for middle frontal gyrus); C: Spatial maps of activations for each area.</p>

    Involved areas.

    No full text
    <p>Brodmann areas (BA) and Talairach coordinates in mm of the cluster centers.</p>

    Mean points (± <i>SE</i>) allocated to each trait according to participants’ sex in Experiment 2.

    No full text
    <p>Within each “Participant’s Sex” group, means with different letters are significantly different from one another, as determined by Bonferroni-Holm post-hoc comparisons.</p>