
    Atypical neural responses to vocal anger in attention-deficit/hyperactivity disorder

    Background: Deficits in facial emotion processing, reported in attention-deficit/hyperactivity disorder (ADHD), have been linked to both early perceptual and later attentional components of event-related potentials (ERPs). However, the neural underpinnings of vocal emotion processing deficits in ADHD have yet to be characterised. Here, we report the first ERP study of vocal affective prosody processing in ADHD. Methods: Event-related potentials of 6–11-year-old children with ADHD (n = 25) and typically developing controls (n = 25) were recorded as they completed a task measuring recognition of vocal prosodic stimuli (angry, happy and neutral). Audiometric assessments were conducted to screen for hearing impairments. Results: Children with ADHD were less accurate than controls at recognising vocal anger. Relative to controls, they displayed enhanced N100 and attenuated P300 components to vocal anger. The P300 effect was reduced, but remained significant, after controlling for N100 effects by rebaselining. Only the N100 effect was significant when children with ADHD and comorbid conduct disorder (n = 10) were excluded. Conclusion: This study provides the first evidence linking ADHD to atypical neural activity during the early perceptual stages of vocal anger processing. These effects may reflect preattentive hyper-vigilance to vocal anger in ADHD.
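    As a concrete illustration of the kind of measurement the Results describe, the sketch below (Python/NumPy) extracts N100 and P300 peak amplitudes from an averaged epoch and then re-baselines to an interval just before the P300 window so that an earlier N100 difference is subtracted out before the P300 is measured. It is a minimal, hypothetical sketch, not the authors' pipeline; the sampling rate, epoch limits, component windows, and the exact rebaselining interval are all assumptions.

    import numpy as np

    SFREQ = 500.0        # sampling rate in Hz (assumed)
    EPOCH_START = -0.2   # epoch onset in seconds relative to voice onset (assumed)

    def t2i(t):
        """Convert a latency in seconds to a sample index within the epoch."""
        return int(round((t - EPOCH_START) * SFREQ))

    def peak_amplitude(erp, tmin, tmax, polarity):
        """Most extreme value of the given polarity within a time window."""
        window = erp[t2i(tmin):t2i(tmax)]
        return window.min() if polarity == "neg" else window.max()

    def component_measures(epochs):
        """epochs: array of shape (n_trials, n_times), single channel, in microvolts."""
        erp = epochs.mean(axis=0)                       # average across trials
        n100 = peak_amplitude(erp, 0.08, 0.15, "neg")   # assumed N100 window
        p300 = peak_amplitude(erp, 0.30, 0.60, "pos")   # assumed P300 window
        # Re-baseline to the 50 ms directly preceding the P300 window, so any
        # difference already present at earlier components is subtracted out
        # before the P300 is measured (one plausible reading of "rebaselining").
        rebaselined = erp - erp[t2i(0.25):t2i(0.30)].mean()
        p300_controlled = peak_amplitude(rebaselined, 0.30, 0.60, "pos")
        return n100, p300, p300_controlled

    # Example use: compute component_measures(anger_epochs) per child, then compare groups.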

    Emotional Cues during Simultaneous Face and Voice Processing: Electrophysiological Insights

    Both facial expression and tone of voice represent key signals of emotional communication, but their brain processing correlates remain unclear. Accordingly, we constructed a novel implicit emotion recognition task consisting of simultaneously presented human faces and voices with neutral, happy, and angry valence, embedded within a monkey face and voice recognition task. To investigate the temporal unfolding of the processing of affective information from human face-voice pairings, we recorded event-related potentials (ERPs) to these audiovisual test stimuli in 18 normal healthy subjects; N100, P200, N250, and P300 components were observed at electrodes in the frontal-central region, while P100, N170, and P270 were observed at electrodes in the parietal-occipital region. Results indicated a significant audiovisual stimulus effect on the amplitudes and latencies of components in the frontal-central region (P200, P300, and N250) but not the parietal-occipital region (P100, N170, and P270). Specifically, P200 and P300 amplitudes were more positive for emotional relative to neutral audiovisual stimuli, irrespective of valence, whereas N250 amplitude was more negative for neutral relative to emotional stimuli. No differentiation was observed between angry and happy conditions. The results suggest that the general effect of emotion on audiovisual processing can emerge as early as 200 msec (P200 peak latency) post stimulus onset, despite the implicit affective processing task demands, and that this effect is mainly distributed over the frontal-central region.
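    A hedged sketch of the within-subject comparison behind the P200 finding: per-subject component amplitudes for each condition, with happy and angry collapsed into one emotional condition (the study found no happy/angry difference) and compared against neutral with a paired t-test. The data below are simulated placeholders; the array shapes and values are assumptions, not the study's measurements.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_subjects = 18

    # Assumed: mean P200 amplitude (microvolts) per subject at frontal-central
    # electrodes, one column per condition: neutral, happy, angry.
    p200 = rng.normal(loc=[4.0, 5.0, 5.1], scale=1.0, size=(n_subjects, 3))

    neutral = p200[:, 0]
    emotional = p200[:, 1:].mean(axis=1)   # collapse happy and angry

    t, p = stats.ttest_rel(emotional, neutral)
    print(f"emotional vs neutral P200: t({n_subjects - 1}) = {t:.2f}, p = {p:.3f}")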

    The role of attention in the processing of emotional vocalizations: ERP insights

    Integrated master's dissertation in Psychology (specialization in Clinical and Health Psychology). Rapidly and effectively identifying the emotions conveyed by others' faces and voices is fundamental for adequate social functioning. Several behavioral and electrophysiological studies have analyzed the processing of emotional communication signals, but most used faces or spoken words as stimuli, so little is known about the neural correlates and temporal course of the processing of non-verbal vocalizations. The present study used the ERP (event-related potential) methodology to study the processing of non-verbal emotional (angry and happy) versus neutral vocalizations at a later stage of processing. The P300 component was analyzed, and its amplitude was modulated as a function of emotion: angry and happy vocalizations elicited more positive P300 amplitudes than neutral ones. Furthermore, emotional context affected the processing of neutral sounds: neutral vocalizations in an angry context elicited a more positive P300 amplitude than neutral vocalizations in a happy context. A gender difference was also observed, with more positive P300 amplitudes for female relative to male participants. Together, these findings suggest an effect of stimulus (non-verbal vocalization) valence at an attentional and immediate-memory level, as indexed by the P300 component.

    Neurophysiological Assessment of Affective Experience

    In the field of Affective Computing, the affective experience (AX) of the user during interaction with computers is of great interest, and automatic recognition of the user's affective state, or emotion, is one of the big challenges. In this proposal I focus on affect recognition via physiological and neurophysiological signals. Long-standing evidence from psychophysiological research, and more recently from affective neuroscience, suggests that both body and brain physiology can indicate the current affective state of a subject. However, several questions about the classification of AX remain unanswered. The basic feasibility of AX classification has been shown repeatedly, but its generalisation across different task contexts, eliciting stimulus modalities, subjects, or time points is seldom addressed. In this proposal I discuss a possible agenda for further exploration of physiological and neurophysiological correlates of AX across different elicitation modalities and task contexts.
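    One way to make the generalisation question concrete is subject-independent cross-validation: train an affect classifier on some subjects and test it on subjects it has never seen. The sketch below uses scikit-learn with simulated placeholder features (e.g. skin conductance, heart rate, EEG band power per trial); the feature set, labels, and dimensions are assumptions, not data from this proposal.

    import numpy as np
    from sklearn.model_selection import GroupKFold, cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(42)
    n_subjects, trials_per_subject, n_features = 20, 40, 12

    # Placeholder feature matrix: one row per trial, columns standing in for
    # physiological and neurophysiological measures (assumed).
    X = rng.normal(size=(n_subjects * trials_per_subject, n_features))
    y = rng.integers(0, 2, size=X.shape[0])            # binary affective state (assumed)
    groups = np.repeat(np.arange(n_subjects), trials_per_subject)

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    # Leave whole subjects out, so test subjects never appear in the training data.
    scores = cross_val_score(clf, X, y, groups=groups, cv=GroupKFold(n_splits=5))
    print("subject-independent accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))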

    Are Females More Responsive to Emotional Stimuli? A Neurophysiological Study Across Arousal and Valence Dimensions

    Men and women seem to process emotions and react to them differently, yet few neurophysiological studies have systematically investigated gender differences in emotional processing. Here, we studied gender differences using Event Related Potentials (ERPs) and Skin Conductance Responses (SCR) recorded from participants who passively viewed emotional pictures selected from the International Affective Picture System (IAPS). The arousal and valence dimensions of the stimuli were manipulated orthogonally. The peak amplitude and peak latency of ERP components and SCR were analyzed separately, and the scalp topographies of significant ERP differences were documented. Females responded with enhanced negative components (N100 and N200) compared with males, especially to unpleasant visual stimuli, whereas both genders responded faster to highly arousing or unpleasant stimuli. Scalp topographies revealed more pronounced gender differences over central and left-hemisphere areas. Our results suggest a difference in the way the genders process emotional stimuli: unpleasant and highly arousing stimuli evoke greater ERP amplitudes in women relative to men. Unpleasant or highly arousing stimuli also appear to be temporally prioritized during visual processing by both genders.
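    The orthogonal manipulation of arousal and valence can be sketched as a 2 x 2 binning of normative picture ratings, for example by median splits, as below. The ratings here are random placeholders standing in for IAPS norms, and the median-split rule is an illustrative assumption rather than the authors' selection procedure.

    import numpy as np

    rng = np.random.default_rng(1)
    n_pictures = 120
    valence = rng.uniform(1, 9, n_pictures)   # 1 = unpleasant, 9 = pleasant (placeholder)
    arousal = rng.uniform(1, 9, n_pictures)   # 1 = calm, 9 = arousing (placeholder)

    pleasant = valence > np.median(valence)
    high_arousal = arousal > np.median(arousal)

    # Cross the two factors so each picture falls into one of four cells.
    cells = {
        ("pleasant", "high"):   np.flatnonzero(pleasant & high_arousal),
        ("pleasant", "low"):    np.flatnonzero(pleasant & ~high_arousal),
        ("unpleasant", "high"): np.flatnonzero(~pleasant & high_arousal),
        ("unpleasant", "low"):  np.flatnonzero(~pleasant & ~high_arousal),
    }
    for (val, aro), idx in cells.items():
        print(f"{val:10s} / {aro}-arousal: {idx.size} pictures")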

    Neural Dynamics of Autistic Behaviors: Cognitive, Emotional, and Timing Substrates

    What brain mechanisms underlie autism and how do they give rise to autistic behavioral symptoms? This article describes a neural model, called the iSTART model, which proposes how cognitive, emotional, timing, and motor processes may interact to create and perpetuate autistic symptoms. These model processes were originally developed to explain data concerning how the brain controls normal behaviors. The iSTART model shows how autistic behavioral symptoms may arise from prescribed breakdowns in these brain processes. Air Force Office of Scientific Research (F49620-01-1-0397); Office of Naval Research (N00014-01-1-0624).

    Neurophysiological Distinction between Schizophrenia and Schizoaffective Disorder

    Schizoaffective disorder (SA) is distinguished from schizophrenia (SZ) based on the presence of prominent mood symptoms over the illness course. Despite this clinical distinction, SA and SZ patients are often combined in research studies, in part because data supporting a distinct pathophysiological boundary between the disorders are lacking. Indeed, few studies have addressed whether neurobiological abnormalities associated with SZ, such as the widely replicated reduction and delay of the P300 event-related potential (ERP), are also present in SA. Scalp EEG was acquired from patients with DSM-IV SA (n = 15) or SZ (n = 22), as well as healthy controls (HC; n = 22), to assess the P300 elicited by infrequent target (15%) and task-irrelevant distractor (15%) stimuli in separate auditory and visual "oddball" tasks. P300 amplitude was reduced and delayed in SZ relative to HC, consistent with prior studies. These SZ abnormalities did not interact with stimulus type (target vs. task-irrelevant distractor) or modality (auditory vs. visual). Across sensory modality and stimulus type, SA patients exhibited normal P300 amplitudes (significantly larger than those of SZ patients and indistinguishable from HC). However, P300 latency and reaction time were both equivalently delayed in SZ and SA patients relative to HC. P300 differences between SA and SZ patients could not be accounted for by variation in symptom severity, socio-economic status, education, or illness duration. Although both groups show similar deficits in processing speed, SA patients do not exhibit the P300 amplitude deficits evident in SZ, consistent with an underlying pathophysiological boundary between these disorders.
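    For illustration, the sketch below generates the kind of trial schedule an oddball task with 15% targets and 15% task-irrelevant distractors implies, spacing rare stimuli so that no two occur back to back. The trial count and the spacing rule are assumptions for illustration, not the study's actual protocol.

    import random

    def oddball_sequence(n_trials=200, p_target=0.15, p_distractor=0.15, seed=7):
        rng = random.Random(seed)
        n_target = round(n_trials * p_target)
        n_distractor = round(n_trials * p_distractor)
        n_standard = n_trials - n_target - n_distractor
        rares = ["target"] * n_target + ["distractor"] * n_distractor
        rng.shuffle(rares)
        # Drop each rare stimulus into its own randomly chosen gap between
        # standards; with at most one rare per gap, no two rares are adjacent.
        gaps = dict(zip(rng.sample(range(n_standard + 1), len(rares)), rares))
        seq = []
        for gap in range(n_standard + 1):
            if gap in gaps:
                seq.append(gaps[gap])
            if gap < n_standard:
                seq.append("standard")
        return seq

    trials = oddball_sequence()
    print(trials.count("standard"), trials.count("target"), trials.count("distractor"))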

    Psychologie und Gehirn 2007

    The conference "Psychologie und Gehirn" ("Psychology and Brain") is a long-standing meeting in the field of basic psychophysiological research. In 2007 the event, the 33rd annual meeting of the Deutsche Gesellschaft für Psychophysiologie und ihre Anwendungen (DGPA; German Society for Psychophysiology and its Applications), was held in Dortmund under the auspices of the Institut für Arbeitsphysiologie (IfADo). Beyond basic research, translating findings into application is a declared goal of the DGPA; in keeping with that tradition, contributions from many areas of modern neuroscience (electrophysiology, neuroimaging, peripheral physiology, neuroendocrinology, behavioral genetics, among others) were presented and are collected here in abstract form.