
    Sensory quality of turnip greens and turnip tops grown in northwestern Spain

    In Galicia (northwestern Spain), Brassica rapa var. rapa L. includes turnip greens and turnip tops as vegetable products. They are characterized by a particular sulfurous aroma, pungent flavor, and bitter taste. In this work, twelve local varieties grown as turnip greens and turnip tops were evaluated to define their sensory attributes, to relate these attributes to secondary metabolites, and to select the sensory traits that best describe these crops. Results showed differences in the sensory profiles of the B. rapa varieties: turnip greens differed significantly in aroma intensity, leaf color, and salty taste, while turnip tops differed in leaf color and firmness, moistness and fibrosity in the mouth, sharpness, and bitter taste. Secondary metabolites such as glucosinolates in turnip greens and phenolic compounds in turnip tops were highly correlated with texture and flavor. Glucosinolates, especially progoitrin (in turnip greens) and gluconapin (in turnip tops), correlated with bitter taste and aftertaste persistence. The strongest correlations between sensory traits were found between leaf firmness and stalk firmness (R=0.94**), leaf firmness and fibrosity (R=0.92**), aftertaste persistence and bitterness (R=0.91**), and bitterness and moistness (R=-0.89**). Research supported by the Xunta de Galicia (PGIDIT06RAG40302PR) and the Excma. Diputación Provincial de Pontevedra. Peer reviewed.
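    The trait associations above are reported as Pearson correlation coefficients (R). As a minimal sketch of how such a coefficient is computed, assuming hypothetical panel ratings (the rating values below are illustrative, not from the study):

    ```python
    import math
    from statistics import mean

    def pearson_r(x, y):
        # Pearson correlation: covariance of x and y divided by
        # the product of their standard deviations (sample-based).
        mx, my = mean(x), mean(y)
        num = sum((a - mx) * (b - my) for a, b in zip(x, y))
        den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                        sum((b - my) ** 2 for b in y))
        return num / den

    # Hypothetical sensory-panel ratings for two traits (1-5 scale)
    leaf_firmness = [3.1, 4.0, 2.5, 3.8, 4.4, 2.9]
    stalk_firmness = [3.0, 4.2, 2.6, 3.9, 4.5, 3.1]
    print(round(pearson_r(leaf_firmness, stalk_firmness), 2))
    ```

    A coefficient near 1 indicates that panelists who rated leaves as firmer also tended to rate stalks as firmer, matching the strong leaf/stalk firmness correlation reported above.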

    The Neural Correlates of Face-Voice-Integration in Social Anxiety Disorder

    Faces and voices are important sources of threat in social anxiety disorder (SAD), a common psychiatric disorder whose core elements are fears of social exclusion and negative evaluation. Previous research on social anxiety has shown increased cerebral responses to negative facial or vocal expressions, as well as generally increased hemodynamic responses to voices and faces. It remained unclear, however, whether the cerebral process of face-voice integration is also altered in SAD. Applying functional magnetic resonance imaging, we investigated the correlates of the audiovisual integration of dynamic faces and voices in SAD as compared to healthy individuals. In the bilateral midsections of the superior temporal sulcus (STS), increased integration effects were observed in SAD, driven by greater activation increases during audiovisual as compared to auditory stimulation. This effect was accompanied by increased functional connectivity with the visual association cortex and a more anterior position of the individual integration maxima along the STS in SAD. These findings demonstrate that the audiovisual integration of facial and vocal cues in SAD is systematically altered not only with regard to intensity and connectivity but also with regard to the individual location of the integration areas within the STS. These combined findings offer a novel perspective on the neuronal representation of social signal processing in individuals suffering from SAD.

    Impairments in recognition of emotional facial expressions, affective prosody, and multisensory facilitation of response time in high-functioning autism

    Introduction: Deficits in emotional perception are common in autistic people, but it remains unclear to what extent these perceptual impairments are linked to specific sensory modalities, specific emotions, or multisensory facilitation. Methods: This study investigated uni- and bimodal perception of emotional cues as well as multisensory facilitation in autistic (n = 18, mean age: 36.72 years, SD: 11.36) compared to non-autistic (n = 18, mean age: 36.41 years, SD: 12.18) people using auditory, visual, and audiovisual stimuli. Results: Lower identification accuracy and longer response times were found in high-functioning autistic people. These differences were independent of modality and emotion and showed large effect sizes (Cohen’s d 0.8–1.2). Furthermore, multisensory facilitation of response time was observed in non-autistic people but was absent in autistic people, whereas no group differences were found in multisensory facilitation of accuracy. Discussion: These findings suggest that the auditory and visual components of audiovisual stimuli are processed more separately in autistic individuals (with temporal demands equivalent to those required for processing the respective unimodal cues), but still with a similar relative improvement in accuracy, whereas earlier integrative multimodal merging of stimulus properties seems to occur in non-autistic individuals.

    "Inner voices": the cerebral representation of emotional voice cues described in literary texts

    While non-verbal affective voice cues are generally recognized as a crucial behavioral guide in day-to-day conversation, their role as a powerful source of information may extend well beyond close-up personal interactions to other modes of communication, such as written discourse or literature. Building on the assumption that similarities between the different ‘modes’ of voice cues may not be limited to their functional role but may also include the cerebral mechanisms engaged in the decoding process, the present functional magnetic resonance imaging study explored brain responses associated with processing emotional voice signals described in literary texts. Emphasis was placed on evaluating ‘voice’-sensitive as well as task- and emotion-related modulations of brain activation frequently associated with the decoding of acoustic vocal cues. The findings suggest several similarities to the perception of acoustic voice signals: the superior temporal, lateral and medial frontal cortex as well as the posterior cingulate cortex and cerebellum contribute to the decoding process, with similarities to acoustic voice perception reflected in a ‘voice’-cue preference of the temporal voice areas, an emotion-related modulation of the medial frontal cortex, and a task-modulated response of the lateral frontal cortex.

    Fear of Being Laughed at in Borderline Personality Disorder

    Building on the assumption of a possible link between the biases in social information processing frequently associated with borderline personality disorder (BPD) and the occurrence of gelotophobia (i.e., a fear of being laughed at), the present study aimed at evaluating the prevalence rate of gelotophobia among BPD patients. Using the Geloph<15>, a questionnaire that allows a standardized assessment of the presence and severity of gelotophobia symptoms, rates of gelotophobia were assessed in a group of 30 female BPD patients and compared to data gathered in clinical and non-clinical reference groups. Results indicate a high prevalence of gelotophobia among BPD patients, with 87% meeting the Geloph<15> criterion for being classified as gelotophobic. Compared to other clinical and non-clinical reference groups, the rate of gelotophobia among BPD patients is remarkably high, far exceeding the numbers reported for other groups in the literature to date, with 30% of BPD patients reaching extreme levels, 37% pronounced levels, and 20% slight levels of gelotophobia.

    Cerebral Processing of Prosodic Emotional Signals: Evaluation of a Network Model Using rTMS

    A great number of functional imaging studies have contributed to the development of a cerebral network model describing how prosody is processed in the brain. According to this model, the processing of prosodic emotional signals proceeds in three main steps, each related to different brain areas. The present study sought to evaluate parts of this model by applying low-frequency repetitive transcranial magnetic stimulation (rTMS) over two important brain regions identified by the model: the superior temporal cortex (Experiment 1) and the inferior frontal cortex (Experiment 2). The aim of both experiments was to reduce cortical activity in the respective brain areas and to evaluate whether these reductions lead to measurable behavioral effects during prosody processing. However, the results revealed no rTMS effects on the acquired behavioral data. Possible explanations for these findings are discussed in the paper.

    Cerebral integration of verbal and nonverbal emotional cues: Impact of individual nonverbal dominance

    Emotional communication is essential for successful social interactions. Emotional information can be expressed at verbal and nonverbal levels. If the verbal message contradicts the nonverbal expression, the nonverbal information is usually perceived as more authentic, revealing the “true feelings” of the speaker. The present fMRI study investigated the cerebral integration of verbal (sentences expressing the emotional state of the speaker) and nonverbal (facial expressions and tone of voice) emotional signals using ecologically valid audiovisual stimulus material. More specifically, cerebral activation associated with the relative impact of nonverbal information on judging the affective state of a speaker (individual nonverbal dominance index, INDI) was investigated. Perception of nonverbally expressed emotions was associated with bilateral activation within the amygdala, fusiform face area (FFA), temporal voice area (TVA), and posterior temporal cortex, as well as in the midbrain and the left inferior orbitofrontal cortex (OFC)/left insula. Verbally conveyed emotions were linked to increased responses bilaterally in the TVA. Furthermore, the INDI correlated with responses in the left amygdala elicited by both nonverbal and verbal emotional stimuli, and a correlation of the INDI with activation within the medial OFC was observed during the processing of communicative signals. These results suggest that individuals with a higher degree of nonverbal dominance have an increased sensitivity not only to nonverbal cues but to emotional stimuli in general.