17 research outputs found

    Sound Categories Are Represented as Distributed Patterns in the Human Auditory Cortex

    Summary: The ability to recognize sounds allows humans and animals to efficiently detect behaviorally relevant events, even in the absence of visual information. Sound recognition in the human brain has been assumed to proceed through several functionally specialized areas, culminating in cortical modules where category-specific processing is carried out [1–5]. In the present high-resolution fMRI experiment, we challenged this model by using well-controlled natural auditory stimuli and by employing an advanced analysis strategy based on an iterative machine-learning algorithm [6] that allows modeling of spatially distributed, as well as localized, response patterns. Sounds of cats, female singers, acoustic guitars, and tones were controlled for their time-varying spectral characteristics and presented to subjects at three different pitch levels. Sound category information—not detectable with conventional contrast-based analysis methods—could be detected with multivoxel pattern analyses and attributed to spatially distributed areas over the supratemporal cortices. A more localized pattern, lateral to the primary auditory areas, was observed for pitch processing. Our findings indicate that distributed neuronal populations within the human auditory cortices, including areas conventionally associated with lower-level auditory processing, entail categorical representations of sounds beyond their physical properties.
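    The multivoxel pattern analysis described above can be illustrated with a minimal sketch: a linear classifier is trained to decode sound category from distributed voxel activation patterns and evaluated with cross-validation. The data below are synthetic and the effect sizes arbitrary; this is not the iterative algorithm of reference [6], just a generic MVPA decoding example.

    ```python
    # Hedged sketch of MVPA decoding: classify sound category from
    # spatially distributed voxel patterns. All data are synthetic.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_trials, n_voxels, n_categories = 120, 200, 4  # e.g. cats, singers, guitars, tones
    labels = np.repeat(np.arange(n_categories), n_trials // n_categories)

    # Simulate weak, spatially distributed category information: each
    # category shifts the mean response of a random subset of voxels.
    patterns = rng.normal(size=(n_trials, n_voxels))
    for c in range(n_categories):
        informative = rng.choice(n_voxels, size=40, replace=False)
        patterns[np.ix_(labels == c, informative)] += 0.5

    # Linear classifier + cross-validation: above-chance accuracy (>0.25
    # for 4 classes) indicates decodable category information.
    scores = cross_val_score(SVC(kernel="linear"), patterns, labels, cv=5)
    print(scores.mean())
    ```

    The point of the sketch is that no single voxel is strongly selective, yet the distributed pattern across voxels carries category information that a multivariate classifier can read out while univariate contrasts may not.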

    Processing of natural sounds and scenes in the human brain


    Dynamic premotor-to-parietal interactions during spatial imagery

    The neurobiological processes underlying mental imagery are a matter of debate and controversy among neuroscientists, cognitive psychologists, philosophers, and biologists. Recent neuroimaging studies demonstrated that the execution of mental imagery activates large frontoparietal and occipitotemporal networks in the human brain. These previous imaging studies, however, neglected the crucial interplay within and across the widely distributed cortical networks of activated brain regions. Here, we combined time-resolved event-related functional magnetic resonance imaging with analyses of interactions between brain regions (functional and effective brain connectivity) to unravel the premotor–parietal dynamics underlying spatial imagery. Participants had to sequentially construct and spatially transform a mental visual object based on either verbal or visual instructions. By concurrently accounting for the full spatiotemporal pattern of brain activity and network connectivity, we functionally segregated an early from a late premotor–parietal imagery network. Moreover, we revealed that the modality-specific information arriving from sensory brain regions is first sent to the premotor cortex and then to the medial-dorsal parietal cortex, i.e., top-down from the motor to the perceptual pole during spatial imagery. Importantly, we demonstrate that the premotor cortex serves as the central relay station, projecting to the parietal cortex at two functionally distinct stages during spatial imagery. Our approach enabled us to disentangle the multicomponential cognitive construct of mental imagery into its different cognitive subelements. We discuss and explicitly assign these mental subprocesses to each of the revealed effective brain connectivity networks and present an integrative neurobiological model of spatial imagery.
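    The connectivity analyses mentioned above can be sketched in a minimal form: functional connectivity is estimated by correlating regional time courses, and a stronger lagged than zero-lag correlation is consistent with (though it does not prove) directed coupling. The region names, lag, and coupling strength below are illustrative assumptions on synthetic data, not the study's actual effective-connectivity model.

    ```python
    # Hedged sketch of a connectivity analysis on synthetic BOLD time
    # courses: a "parietal" signal partially follows a lagged copy of a
    # "premotor" signal, mimicking premotor-to-parietal information flow.
    import numpy as np

    rng = np.random.default_rng(1)
    n_timepoints = 300
    premotor = rng.normal(size=n_timepoints)
    parietal = 0.6 * np.roll(premotor, 2) + 0.8 * rng.normal(size=n_timepoints)

    def corr(x, y):
        """Pearson correlation between two time courses."""
        return float(np.corrcoef(x, y)[0, 1])

    # Zero-lag correlation (functional connectivity) vs. correlation with
    # the premotor signal shifted by the simulated 2-sample lag.
    zero_lag = corr(premotor, parietal)
    lagged = corr(np.roll(premotor, 2), parietal)
    print(zero_lag, lagged)
    ```

    In this toy setup the lagged correlation exceeds the zero-lag one, which is the kind of temporal asymmetry that effective-connectivity methods formalize far more rigorously (e.g., with explicit generative models of directed influence).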