    When a photograph can be heard: Vision activates the auditory cortex within 110 ms

    As the makers of silent movies knew well, an actual auditory stimulus is not necessary to evoke the sensation of the sounds typically associated with what we are viewing. Thus, you could almost hear the neigh of Rodolfo Valentino's horse, even though the film was mute. Evidence is provided that the mere sight of a photograph associated with a sound can activate the associative auditory cortex. High-density ERPs were recorded in 15 participants while they viewed hundreds of perceptually matched images that were associated (or not) with a given sound. Sound stimuli were discriminated from non-sound stimuli as early as 110 ms. SwLORETA reconstructions showed common activation of ventral stream areas for both types of stimuli, and activation of the associative temporal cortex, at the earliest stage, only for sound stimuli. The primary auditory cortex (BA41) was also activated by sound images after ∼200 ms.

    How Are ‘Barack Obama’ and ‘President Elect’ Differentially Stored in the Brain? An ERP Investigation on the Processing of Proper and Common Noun Pairs

    BACKGROUND: One of the most debated issues in the cognitive neuroscience of language is whether distinct semantic domains are differentially represented in the brain. Clinical studies have described several anomic dissociations with no clear neuroanatomical correlate. Neuroimaging studies have shown that memory retrieval is more demanding for proper than for common nouns, in that the former are purely arbitrary referential expressions. In this study a semantic relatedness paradigm was devised to investigate the neural processing of proper and common nouns. METHODOLOGY/PRINCIPAL FINDINGS: 780 words (arranged in pairs of Italian nouns/adjectives and the first/last names of well-known persons) were presented. Half the pairs were semantically related ("Woody Allen" or "social security"), while the others were not ("Sigmund Parodi" or "judicial cream"). All items were balanced for length, frequency, familiarity and semantic relatedness. Participants had to decide whether the two items in a pair were semantically related. RT and N400 data suggest that the task was more demanding for common nouns. The LORETA neural generators for the related-unrelated contrast (for proper names) included the left fusiform gyrus, right medial temporal gyrus, limbic and parahippocampal regions, and inferior parietal and inferior frontal areas, which are thought to be involved in the conjoined processing of a familiar face with the relevant episodic information. Access to person names was more emotional and sensorially vivid than semantic access to common nouns. CONCLUSIONS/SIGNIFICANCE: When memory retrieval is not required, proper name access (knowledge of conspecifics) is not more demanding. The neural generators of the N400 to unrelated items (unknown persons and things) did not differ as a function of lexical class, suggesting that proper and common nouns are not treated differently as belonging to different grammatical classes.

    Electrical neuroimaging evidence that spatial frequency-based selective attention affects V1 activity as early as 40-60 ms in humans

    Background: Karns and Knight (2009) [1] demonstrated, using ERP and gamma-band oscillatory responses, that intermodal attention modulates visual processing at the latency of the early phase of the C1 response (62-72 ms), thought to be generated in the primary visual cortex. However, the timing of attentional modulation of the visual cortex during object-based attention remains a controversial issue. Results: In this study, EEG recording and LORETA source reconstruction were performed. A large number of subjects (29) and of trial repetitions (13,312) were used. EEG was recorded from 128 scalp sites at a sampling rate of 512 Hz. Four square-wave gratings (0.75, 1.5, 3, 6 c/deg) were randomly presented in the 4 quadrants of the visual field. Participants were instructed to pay conjoined attention to a given stimulus quadrant and spatial frequency. The C1 and P1 sensory-evoked components of the ERPs were quantified by measuring their mean amplitudes across time within 5 latency ranges: 40-60, 60-80, 80-100, 100-120 and 120-140 ms. Conclusions: Early attention effects were found in the form of an enhanced C1 response (40-80 ms) to frequency-relevant gratings. LORETA, within its spatial resolution limits, identified the neural generators of this effect in the striate cortex (BA17), among other areas.
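    The mean-amplitude quantification described in this abstract can be sketched as follows. This is a minimal illustration, not the study's analysis code: the waveform is synthetic, and only the 512 Hz sampling rate and the five latency windows are taken from the text.

```python
import numpy as np

# Synthetic averaged ERP: one channel, epoch from 0 to 200 ms,
# sampled at 512 Hz as in the study (waveform values are placeholders).
fs = 512                              # sampling rate in Hz
t = np.arange(0.0, 0.200, 1.0 / fs)   # time axis in seconds
erp = np.sin(2 * np.pi * 10 * t)      # placeholder waveform, in microvolts

def mean_amplitude(signal, times, start_ms, end_ms):
    """Mean amplitude of `signal` within the window [start_ms, end_ms)."""
    mask = (times >= start_ms / 1000.0) & (times < end_ms / 1000.0)
    return signal[mask].mean()

# Quantify the component in the five latency windows named in the abstract.
windows = [(40, 60), (60, 80), (80, 100), (100, 120), (120, 140)]
amplitudes = {w: mean_amplitude(erp, t, *w) for w in windows}
```

    In a real analysis the same windowed averaging would be applied per condition and per electrode cluster before statistical comparison of the attended and unattended gratings.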