
    Substrate-specific function of the translocon-associated protein complex during translocation across the ER membrane

    Although the transport of model proteins across the mammalian ER can be reconstituted with purified Sec61p complex, TRAM, and signal recognition particle receptor, some substrates, such as the prion protein (PrP), are inefficiently or improperly translocated using only these components. Here, we purify a factor needed for proper translocation of PrP and identify it as the translocon-associated protein (TRAP) complex. Surprisingly, TRAP also stimulates vectorial transport of many, but not all, other substrates in a manner influenced by their signal sequences. Comparative analyses of several natural signal sequences suggest that a dependence on TRAP for translocation is not due to any single physical parameter, such as hydrophobicity of the signal sequence. Instead, a functional property of the signal, the efficiency of its post-targeting role in initiating substrate translocation, correlates inversely with TRAP dependence. Thus, maximal translocation independent of TRAP can only be achieved with a signal sequence, such as the one from prolactin, whose strong interaction with the translocon mediates translocon gating shortly after targeting. These results identify the TRAP complex as a functional component of the translocon and demonstrate that it acts in a substrate-specific manner to facilitate the initiation of protein translocation.

    A Functional MRI Study of Happy and Sad Emotions in Music with and without Lyrics

    Musical emotions, such as happiness and sadness, have traditionally been investigated using instrumental music devoid of linguistic content. However, pop and rock, the most common musical genres, utilize lyrics to convey emotions. Using participants’ self-selected musical excerpts, we studied their behavior and brain responses to elucidate how lyrics interact with musical emotion processing, as reflected by emotion recognition and activation of limbic areas involved in affective experience. We extracted samples from subjects’ selections of sad and happy pieces and sorted them according to the presence of lyrics. Acoustic feature analysis showed that music with lyrics differed from music without lyrics in spectral centroid, a feature related to perceptual brightness, whereas sad music with lyrics did not diverge from happy music without lyrics, indicating the role of other factors in emotion classification. Behavioral ratings revealed that happy music without lyrics induced stronger positive emotions than happy music with lyrics. We also acquired functional magnetic resonance imaging data while subjects performed affective tasks regarding the music. First, using ecological and acoustically variable stimuli, we broadened previous findings about the brain processing of musical emotions and of songs versus instrumental music. Additionally, contrasts between sad music with versus without lyrics recruited the parahippocampal gyrus, the amygdala, the claustrum, the putamen, the precentral gyrus, the medial and inferior frontal gyri (including Broca’s area), and the auditory cortex, while the reverse contrast produced no activations. Happy music without lyrics activated structures of the limbic system and the right pars opercularis of the inferior frontal gyrus, whereas auditory regions alone responded to happy music with lyrics. These findings point to the role of acoustic cues in the experience of happiness in music and to the importance of lyrics for sad musical emotions.

    Fractionating auditory priors: A neural dissociation between active and passive experience of musical sounds

    Learning, attention and action play a crucial role in determining how stimulus predictions are formed, stored, and updated. Professional musicians are characterized by years-long experience with the specific repertoires of sounds of one or more musical styles. Here we contrasted active experience with sounds, namely long-lasting motor practice, theoretical study, and engaged listening to the acoustic features characterizing a musical style of choice in professional musicians, with the mainly passive experience of sounds in laypersons. We hypothesized that long-term active experience of sounds would influence the neural predictions of stylistic features in professional musicians in a distinct way from the mainly passive experience of sounds in laypersons. Participants with different musical backgrounds were recruited: professional jazz and classical musicians, amateur musicians, and non-musicians. They were presented with a musical multi-feature paradigm eliciting mismatch negativity (MMN), a prediction error signal to changes in six sound features, during only 12 minutes of electroencephalography (EEG) and magnetoencephalography (MEG) recordings. We observed generally larger MMN amplitudes, indicative of stronger automatic neural signals to violated priors, in jazz musicians (but not in classical musicians) compared with non-musicians and amateurs. The specific MMN enhancements were found for spectral features (timbre, pitch, slide) and sound intensity. In participants who were not musicians, a higher preference for jazz music was associated with reduced MMN to pitch slide (a feature common in the jazz music style). Our results suggest that long-lasting, active experience of a musical style is associated with accurate neural priors for the sound features of the preferred style, in contrast to passive listening.

    The organization of engaged and quiescent translocons in the endoplasmic reticulum of mammalian cells

    Protein translocons of the mammalian endoplasmic reticulum are composed of numerous functional components whose organization during different stages of the transport cycle in vivo remains poorly understood. We have developed generally applicable methods based on fluorescence resonance energy transfer (FRET) to probe the relative proximities of endogenously expressed translocon components in cells. Examination of substrate-engaged translocons revealed oligomeric assemblies of the Sec61 complex that were associated to varying degrees with other essential components, including the signal recognition particle receptor, TRAM, and the TRAP complex. Remarkably, these components not only remained assembled but also had a similar, yet distinguishable, organization both during and after nascent chain translocation. The persistence of preassembled and complete translocons between successive rounds of transport may facilitate highly efficient translocation in vivo despite the temporal constraints imposed by ongoing translation and a crowded cellular environment.

    Hidden sources of joy, fear, and sadness: Explicit versus implicit neural processing of musical emotions

    Music is often used to regulate emotions and mood. Typically, music conveys and induces emotions even when one does not attend to them. Studies on the neural substrates of musical emotions have, however, only examined brain activity when subjects focused on the emotional content of the music. Here we used functional magnetic resonance imaging (fMRI) to address the neural processing of happy, sad, and fearful music with a paradigm in which 56 subjects were instructed either to classify the emotions (explicit condition) or to pay attention to the number of instruments playing (implicit condition) in 4-s music clips. In the implicit vs. explicit condition, stimuli activated bilaterally the inferior parietal lobule, premotor cortex, caudate, and ventromedial frontal areas. The cortical dorsomedial prefrontal and occipital areas activated during explicit processing were those previously shown to be associated with the cognitive processing of music and with emotion recognition and regulation. Moreover, happiness in music was associated with activity in the bilateral auditory cortex, left parahippocampal gyrus, and supplementary motor area, whereas the negative emotions of sadness and fear corresponded with activation of the left anterior cingulate and middle frontal gyrus and down-regulation of the orbitofrontal cortex. Our study demonstrates for the first time in healthy subjects the neural underpinnings of the implicit processing of brief musical emotions, particularly in frontoparietal, dorsolateral prefrontal, and striatal areas of the brain.

    Maladaptive and adaptive emotion regulation through music: a behavioral and neuroimaging study of males and females

    Music therapists use guided affect regulation in the treatment of mood disorders. However, self-directed uses of music in affect regulation are not fully understood. Some uses of music may have negative effects on mental health, as can non-music regulation strategies, such as rumination. Psychological testing and functional magnetic resonance imaging (fMRI) were used to explore music listening strategies in relation to mental health. Participants (n = 123) were assessed for depression, anxiety, Neuroticism, and uses of Music in Mood Regulation (MMR). Neural responses to music were measured in the medial prefrontal cortex (mPFC) in a subset of participants (n = 56). Discharge, using music to express negative emotions, was related to increased anxiety and Neuroticism in all participants, and particularly in males. Males high in Discharge showed decreased activity of the mPFC during music listening compared with those using less Discharge. Females high in Diversion, using music to distract from negative emotions, showed more mPFC activity than females using less Diversion. These results suggest that use of the Discharge strategy can be associated with maladaptive patterns of emotion regulation, and may even have long-term negative effects on mental health. This finding has real-world applications in psychotherapy and particularly in clinical music therapy.

    Toward a neural chronometry for the aesthetic experience of music

    Music is often studied as a cognitive domain alongside language. The emotional aspects of music have also been shown to be important, but views on their nature diverge. For instance, the specific emotions that music induces, and how they relate to emotional expression, are still under debate. Here we propose a mental and neural chronometry of the aesthetic experience of music, initiated and mediated by external and internal contexts such as intentionality, background mood, attention, and expertise. The initial stages necessary for an aesthetic experience of music are feature analysis, integration across modalities, and cognitive processing on the basis of long-term knowledge. These stages are common to individuals belonging to the same musical culture. The initial emotional reactions to music include the startle reflex, core "liking," and arousal. Subsequently, discrete emotions are perceived and induced. Presumably, somatomotor processes synchronizing the body with the music also come into play here. The subsequent stages, in which cognitive, affective, and decisional processes intermingle, require controlled cross-modal neural processes to result in aesthetic emotions, aesthetic judgments, and conscious liking. These latter aesthetic stages often require attention, intentionality, and expertise for their full actualization.