10 research outputs found

    The emergence of semantic categorization in early visual processing: ERP indices of animal vs. artifact recognition

    BACKGROUND: The neuroimaging and neuropsychological literature shows functional dissociations in brain activity during processing of stimuli belonging to different semantic categories (e.g., animals, tools, faces, places), but little information is available about the time course of object perceptual categorization. The aim of the study was to provide information about the timing of processing stimuli from different semantic domains, without using verbal or naming paradigms, in order to observe the emergence of non-linguistic conceptual knowledge in the ventral stream visual pathway. Event-related potentials (ERPs) were recorded in 18 healthy right-handed individuals as they performed a perceptual categorization task on 672 pairs of images of animals and man-made objects (i.e., artifacts). RESULTS: Behavioral responses to animal stimuli were ~50 ms faster and more accurate than those to artifacts. At early processing stages (120–180 ms) the right occipital-temporal cortex was more activated in response to animals than to artifacts, as indexed by the posterior N1 response, while the frontal/central N1 (130–160 ms) showed the opposite pattern. In the next processing stage (200–260 ms) the response was stronger to artifacts and usable items at anterior temporal sites. The P300 component was smaller, and the central/parietal N400 component was larger, to artifacts than to animals. CONCLUSION: The effect of animal and artifact categorization emerged at ~150 ms over the right occipital-temporal area as a stronger response of the ventral stream to animate, homomorphic entities with faces and legs. The larger frontal/central N1 and the subsequent temporal activation for inanimate objects might reflect the prevalence of a functional rather than perceptual representation of manipulable tools compared to animals. Late ERP effects might reflect semantic integration and cognitive updating processes. Overall, the data are compatible with a modality-specific semantic memory account, in which sensory and action-related semantic features are represented in modality-specific brain areas.

    Category specificity in mind and brain?

    We summarise and respond to the main points made by the commentators on our target article, which concern: (1) whether structural similarity can play a causal role in normal object identification and in neuropsychological deficits for living things, (2) the nature of our structural knowledge of the world, (3) the relations between sensory and functional knowledge of objects, and the nature of our functional knowledge about living things, (4) whether we need to posit a "core" semantic system, (5) arguments that can be marshalled from evidence on functional imaging, (6) the causal mechanisms by which category differences can emerge in object representations, and (7) the nature of our knowledge about categories other than living and nonliving things. We also highlight points raised in our article that seem to be accepted.

    The organization of sequential actions


    Executive functions in name retrieval: Evidence from neuropsychology

    "In this chapter, we will consider the links between executive function and the particular language functions involved in categorization and name retrieval, and we will argue that, specifically, impaired executive processes (in patients with frontal lobe lesions) can have a significant impact on language performance ...

    Visual Crowding and Category Specific Deficits: A Neural Network Model

    This paper describes a series of modular neural network simulations of visual object processing. In a departure from much previous work in this domain, the model described here comprises both supervised and unsupervised modules and processes real pictorial representations of items from different object categories. The unsupervised module carries out bottom-up encoding of visual stimuli, thereby developing a "perceptual" representation of each presented picture. The supervised component then classifies each perceptual representation according to a target semantic category. Model performance was assessed (1) during learning, (2) under generalisation to novel instances, and (3) after lesion damage at different stages of processing. Strong category effects were observed throughout the different experiments, with living things and musical instruments eliciting greater recognition failures relative to other categories. This pattern derives from within-category similarity effects at the level of perceptual representation, and our data support the view that visual crowding can be a potentially important factor in the emergence of some category-specific impairments. The data also accord with the cascade model of object recognition, since increased competition between perceptual representations resulted in category-specific impairments even when the locus of damage was within the semantic component of the model.
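    The crowding mechanism the abstract appeals to can be illustrated with a minimal sketch. This is a hypothetical toy demonstration, not the authors' model: exemplars are binary feature vectors, "living things" are generated with high within-category similarity, "artifacts" with more distinctive features, and lesioning zeroes a fraction of the perceptual representation before nearest-exemplar identification. All names and parameters here are illustrative assumptions.

    ```python
    import random

    random.seed(1)
    DIM = 20  # size of the (toy) perceptual feature vector

    def exemplars(prototype, n, noise):
        """Make n exemplars by flipping `noise` random bits of a prototype."""
        out = []
        for _ in range(n):
            v = prototype[:]
            for i in random.sample(range(DIM), noise):
                v[i] ^= 1
            out.append(v)
        return out

    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))

    def lesion(v, severity=0.5):
        """Simulate damage by zeroing a random fraction of the representation."""
        w = v[:]
        for i in random.sample(range(DIM), int(severity * DIM)):
            w[i] = 0
        return w

    def error_rate(items, trials=50):
        """Identify each lesioned item by its nearest intact exemplar."""
        errors = 0
        for _ in range(trials):
            for idx, item in enumerate(items):
                damaged = lesion(item)
                nearest = min(range(len(items)),
                              key=lambda j: hamming(damaged, items[j]))
                errors += (nearest != idx)
        return errors / (trials * len(items))

    # Living things: tightly clustered exemplars ("visual crowding")
    living = exemplars([1] * 10 + [0] * 10, n=8, noise=1)
    # Artifacts: more widely separated feature vectors
    artifacts = exemplars([0] * 10 + [1] * 10, n=8, noise=6)

    print("living error rate:", error_rate(living))
    print("artifact error rate:", error_rate(artifacts))
    ```

    Under identical lesion severity, the crowded (living) category yields more misidentifications, because damaged representations fall closer to neighbouring exemplars, which is the within-category similarity effect the paper attributes category-specific deficits to.
    
    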

    Sensing Aliveness

    This study examines whether the categories animate/inanimate might be formed on the basis of information available to the cognitive system. We suggest that the discrimination of percepts according to these categories relies on proprioceptive information, which allows the perceiving subject to know that he is 'animate'. Since other 'objects' in the world exhibit movements, reactions, etc. similar to those that the subject experiences himself, he can 'project' his knowledge onto these objects and recognize them as 'animate' like himself. On this basis we try to corroborate the empiricist position in the debate concerning the organization of knowledge, as opposed to the nativist view. Furthermore, we argue that the categorical dichotomy animate/inanimate is more basic than other analogous ones such as living/non-living and biological/non-biological, and we sketch a 'categorical stratification' following the line 'humans-animals-plants', based on the hypothesis that humans detect different degrees of 'vitality' according to the degree of similarity they recognise between the considered instance and themselves.

    What Neuropsychology Tells us About Human Tool Use? The Four Constraints Theory (4CT): Mechanics, Space, Time, and Effort
