
    What does semantic tiling of the cortex tell us about semantics?

    Recent use of voxel-wise modeling in cognitive neuroscience suggests that semantic maps tile the cortex. Although this impressive research establishes distributed cortical areas active during the conceptual processing that underlies semantics, it tells us little about the nature of this processing. While mapping concepts between Marr's computational and implementation levels to support neural encoding and decoding, this approach ignores Marr's algorithmic level, central for understanding the mechanisms that implement cognition, in general, and conceptual processing, in particular. Following decades of research in cognitive science and neuroscience, what do we know so far about the representation and processing mechanisms that implement conceptual abilities? Most basically, much is known about the mechanisms associated with: (1) features and frame representations, (2) grounded, abstract, and linguistic representations, (3) knowledge-based inference, (4) concept composition, and (5) conceptual flexibility. Rather than explaining these fundamental representation and processing mechanisms, semantic tiles simply provide a trace of their activity over a relatively short time period within a specific learning context. Establishing the mechanisms that implement conceptual processing in the brain will require more than mapping it to cortical (and sub-cortical) activity, with process models from cognitive science likely to play central roles in specifying the intervening mechanisms. More generally, neuroscience will not achieve its basic goals until it establishes algorithmic-level mechanisms that contribute essential explanations to how the brain works, going beyond simply establishing the brain areas that respond to various task conditions

    Dissociation and interpersonal autonomic physiology in psychotherapy research: an integrative view encompassing psychodynamic and neuroscience theoretical frameworks

    Interpersonal autonomic physiology is an interdisciplinary research field assessing the relational interdependence of two (or more) interacting individuals at both the behavioral and psychophysiological levels. Despite its quite long tradition, only eight studies since 1955 have focused on the interaction of psychotherapy dyads, and none of them has focused on the shared processual level, assessing dynamic phenomena such as dissociation. We longitudinally observed two brief psychodynamic psychotherapies, entirely audio- and video-recorded (16 sessions, weekly frequency, 45 min.). Autonomic nervous system measures were continuously collected during each session. Personality, empathy, dissociative features, and clinical progress measures were collected before and after therapy, and after each clinical session. Two independent judges, trained psychotherapists, coded the interactions’ micro-processes. Time-series-based analyses were performed to assess interpersonal synchronization and de-synchronization in the patient’s and therapist’s physiological activity. Psychophysiological synchrony revealed a clear association with empathic attunement, while desynchronization phases (length range 30–150 sec.) showed a linkage with dissociative processes, usually associated with the patient’s narrative of core relational trauma. Our findings are discussed from the perspective of the psychodynamic models of Stern (“present moment”), Sander, Beebe and Lachmann (dyadic system model of interaction), and Lanius (trauma model), and the neuroscientific frameworks proposed by Thayer (neurovisceral integration model) and Porges (polyvagal theory). The collected data allow us to attempt an integration of these theoretical approaches in the light of complex dynamic systems. The rich theoretical work and the encouraging clinical results might represent a new and fascinating frontier of research in psychotherapy

    Lesion mapping the four-factor structure of emotional intelligence

    Frontiers in Human Neuroscience 9 (2015): 649. This document is protected by copyright and was first published by Frontiers; all rights reserved; it is reproduced with permission.
    Emotional intelligence (EI) refers to an individual’s ability to process and respond to emotions, including recognizing the expression of emotions in others, using emotions to enhance thought and decision making, and regulating emotions to drive effective behaviors. Despite their importance for goal-directed social behavior, little is known about the neural mechanisms underlying specific facets of EI. Here, we report findings from a study investigating the neural bases of these specific components of EI in a sample of 130 combat veterans with penetrating traumatic brain injury. We examined the neural mechanisms underlying experiential (perceiving and using emotional information) and strategic (understanding and managing emotions) facets of EI. Factor scores were submitted to voxel-based lesion symptom mapping to elucidate their neural substrates. The results indicate that two facets of EI (perceiving and managing emotions) engage common and distinctive neural systems, with shared dependence on the social knowledge network, and selective engagement of the orbitofrontal and parietal cortex for strategic aspects of emotional information processing. The observed pattern of findings suggests that sub-facets of experiential and strategic EI can be characterized as separable but related processes that depend upon a core network of brain structures within frontal, temporal, and parietal cortex.
    This work was supported by funding from the US National Institute of Neurological Disorders and Stroke intramural research program and a project grant from the US Army Medical Research and Materiel Command administered by the Henry M. Jackson Foundation (Vietnam Head Injury Study Phase III: a 30-year post-injury follow-up study, grant number DAMD17-01-1-0675). R. Colom was supported by grant PSI2010-20364 from the Ministerio de Ciencia e Innovación [Ministry of Science and Innovation, Spain] and CEMU-2012-004 [Universidad Autónoma de Madrid].

    Physical mechanisms may be as important as brain mechanisms in evolution of speech [Commentary on Ackermann, Hage, & Ziegler, Brain mechanisms of acoustic communication in humans and nonhuman primates: An evolutionary perspective]

    We present two arguments why physical adaptations for vocalization may be as important as neural adaptations. First, fine control over vocalization is not easy for physical reasons, and modern humans may be exceptional. Second, we present an example of a gorilla that shows rudimentary voluntary control over vocalization, indicating that some neural control is already shared with great apes

    Brain mechanisms of acoustic communication in humans and nonhuman primates: An evolutionary perspective

    Any account of “what is special about the human brain” (Passingham 2008) must specify the neural basis of our unique ability to produce speech and delineate how these remarkable motor capabilities could have emerged in our hominin ancestors. Clinical data suggest that the basal ganglia provide a platform for the integration of primate-general mechanisms of acoustic communication with the faculty of articulate speech in humans. Furthermore, neurobiological and paleoanthropological data point to a two-stage model of the phylogenetic evolution of this crucial prerequisite of spoken language: (i) monosynaptic refinement of the projections of motor cortex to the brainstem nuclei that steer laryngeal muscles, presumably as part of a “phylogenetic trend” associated with increasing brain size during hominin evolution; (ii) subsequent vocal-laryngeal elaboration of cortico-basal ganglia circuitries, driven by human-specific FOXP2 mutations. This concept implies vocal continuity of spoken language evolution at the motor level, elucidating the deep entrenchment of articulate speech into a “nonverbal matrix” (Ingold 1994), which is not accounted for by gestural-origin theories. Moreover, it provides a solution to the question of the adaptive value of the “first word” (Bickerton 2009), since even the earliest and most simple verbal utterances must have increased the versatility of vocal displays afforded by the preceding elaboration of monosynaptic corticobulbar tracts, giving rise to enhanced social cooperation and prestige. At the ontogenetic level, the proposed model assumes age-dependent interactions between the basal ganglia and their cortical targets, similar to vocal learning in some songbirds. In this view, the emergence of articulate speech builds on the “renaissance” of an ancient organizational principle and, hence, may represent an example of “evolutionary tinkering” (Jacob 1977)

    “It's Not What You Say, But How You Say it”: A Reciprocal Temporo-frontal Network for Affective Prosody

    Humans communicate emotion vocally by modulating acoustic cues such as pitch, intensity, and voice quality. Research has documented how the relative presence or absence of such cues alters the likelihood of perceiving an emotion, but the neural underpinnings of acoustic cue-dependent emotion perception remain obscure. Using functional magnetic resonance imaging in 20 subjects, we examined a reciprocal circuit consisting of superior temporal cortex, amygdala, and inferior frontal gyrus that may underlie affective prosodic comprehension. Results showed that increased saliency of emotion-specific acoustic cues was associated with increased activation in superior temporal cortex [planum temporale (PT), posterior superior temporal gyrus (pSTG), and posterior middle temporal gyrus (pMTG)] and amygdala, whereas decreased saliency of acoustic cues was associated with increased inferior frontal activity and temporo-frontal connectivity. These results suggest that sensory-integrative processing is facilitated when the acoustic signal is rich in affective information, yielding increased activation in temporal cortex and amygdala. Conversely, when the acoustic signal is ambiguous, greater evaluative processes are recruited, increasing activation in inferior frontal gyrus (IFG) and IFG–STG connectivity. Auditory regions may thus integrate acoustic information with amygdala input to form emotion-specific representations, which are evaluated within inferior frontal regions

    Brain–computer interface game applications for combined neurofeedback and biofeedback treatment for children on the autism spectrum

    Individuals with Autism Spectrum Disorder (ASD) show deficits in social and communicative skills, including imitation, empathy, and shared attention, as well as restricted interests and repetitive patterns of behaviors. Evidence for and against the idea that dysfunctions in the mirror neuron system are involved in imitation and could be one underlying cause for ASD is discussed in this review. Neurofeedback interventions have reduced symptoms in children with ASD by self-regulation of brain rhythms. However, cortical deficiencies are not the only cause of these symptoms. Peripheral physiological activity, such as the heart rate, is closely linked to neurophysiological signals and associated with social engagement. Therefore, a combined approach targeting the interplay between brain, body and behavior could be more effective. Brain-computer interface applications for combined neurofeedback and biofeedback treatment for children with ASD are currently nonexistent. To facilitate their use, we have designed an innovative game that includes social interactions and provides neural- and body-based feedback that corresponds directly to the underlying significance of the trained signals as well as to the behavior that is reinforced

    Folk Explanations of Behavior: A Specialized Use of a Domain-General Mechanism

    People typically explain others’ behaviors by attributing them to the beliefs and motives of an unobservable mind. Although such attributional inferences are critical for understanding the social world, it is unclear whether they rely on processes distinct from those used to understand the nonsocial world. In the present study, we used functional MRI to identify brain regions associated with making attributions about social and nonsocial situations. Attributions in both domains activated a common set of brain regions, and individual differences in the domain-specific recruitment of one of these regions—the dorsomedial prefrontal cortex (DMPFC)—correlated with attributional accuracy in each domain. Overall, however, the DMPFC showed greater activation for attributions about social than about nonsocial situations, and this selective response to the social domain was greatest in participants who reported the highest levels of social expertise. We conclude that folk explanations of behavior are an expert use of a domain-general cognitive ability

    Beyond ‘Interaction’: How to Understand Social Effects on Social Cognition

    In recent years, a number of philosophers and cognitive scientists have advocated for an ‘interactive turn’ in the methodology of social-cognition research: to become more ecologically valid, we must design experiments that are interactive, rather than merely observational. While the practical aim of improving ecological validity in the study of social cognition is laudable, we think that the notion of ‘interaction’ is not suitable for this task: as it is currently deployed in the social cognition literature, this notion leads to serious conceptual and methodological confusion. In this paper, we tackle this confusion on three fronts: 1) we revise the ‘interactionist’ definition of interaction; 2) we demonstrate a number of potential methodological confounds that arise in interactive experimental designs; and 3) we show that ersatz interactivity works just as well as the real thing. We conclude that the notion of ‘interaction’, as it is currently being deployed in this literature, obscures an accurate understanding of human social cognition