
    Assessing the impact of emotion in dual pathway models of sensory processing.

    In our daily environment, we constantly encounter an endless stream of information which we must be able to sort and prioritize. Among the features that influence this are the emotional nature of stimuli and the emotional context of events. Emotional information is often given preferential access to neurocognitive resources, including within sensory processing systems. Interestingly, both the auditory and visual systems are divided into dual processing streams: a ventral object identity/perception stream and a dorsal object location/action stream. While the effects of emotion on the ventral streams are relatively well defined, its effect on dorsal stream processes remains unclear. The present thesis aimed to investigate the impact of emotion on sensory systems within a dual pathway framework of sensory processing. Study I investigated the role of emotion during auditory localization. While undergoing fMRI, participants indicated the location of an emotional or non-emotional sound within an auditory virtual environment. This revealed that the neurocognitive structures displaying activation modulated by emotion were not the same as those modulated by sound location. Emotion was represented in regions associated with the putative auditory ‘what’ but not ‘where’ stream. Study II examined the impact of emotion on ostensibly similar localization behaviours mediated differentially by the dorsal versus ventral visual processing stream. Ventrally-mediated behaviours were demonstrated to be impacted by the emotional context of a trial, while dorsally-mediated behaviours were not. For Study III, a motion-aftereffect paradigm was used to investigate the impact of emotion on visual area V5/MT+. This area, traditionally believed to be involved in dorsal stream processing, has a number of characteristics similar to a ventral stream structure. It was discovered that V5/MT+ activity was modulated both by the presence of perceptual motion and by the emotional content of an image. In addition, this region displayed patterns of functional connectivity with the amygdala that were significantly modulated by emotion. Together, these results suggest that emotional information modulates neural processing within ventral sensory processing streams, but not dorsal processing streams. These findings are discussed with respect to current models of emotional and sensory processing, including amygdala connections to sensory cortices and emotional effects on cognition and behaviour.

    The Dissociable Impact of Auditory vs. Visual Emotional Cues on Visual Processing

    Background: Emotional information has privileged access to processing resources, which can cause it to have a distracting or facilitating effect on task performance for reasons that are poorly understood. The sensory modality through which it is presented may be one determining factor. Some findings suggest that auditory stimuli facilitate visual task performance while visual stimuli interfere with it, but there are conflicting findings. Hypothesis: We hypothesize that emotional content in a different sensory modality from the task improves task-related performance via a general alerting and arousing effect for all stimuli, while emotional content in the same modality disrupts performance when task-relevant neutral stimuli compete with emotional stimuli for processing resources. Methods: Participants will attempt to identify the location of a Gabor patch (a sinusoidal grating of horizontal lines), presented on either the left or right side of the computer screen, while a negative or neutral image or sound is presented. Their reaction times will be compared across conditions. Expected Results: We expect that emotional content presented through the auditory modality will result in faster responses on the visual perception task, compared to neutral content. Conversely, compared to neutral stimuli, emotional content presented visually will lead to slower responses. Discussion: This research will lead to a better understanding of how the manner in which emotional information is presented can determine its effect on task performance. This is a key step in determining how emotional content perceived through multiple modalities interacts to affect a person’s perceptual abilities in complex emotional situations.

    The Control of Foreign Funds by the United States Treasury

    The general aim of this thesis was to test the effects of paralinguistic (emotional) and prior contextual (topical) cues on perception of poorly specified visual, auditory, and audiovisual speech. The specific purposes were (1) to examine whether facially displayed emotions can facilitate speechreading performance; (2) to study the mechanism for such facilitation; (3) to map the information-processing factors that are involved in processing of poorly specified speech; and (4) to present a comprehensive conceptual framework for speech perception, with the specification of the signal being considered. Experimental and correlational designs were used, and 399 normal-hearing adults participated in seven experiments. The main conclusions are summarised as follows. (a) Speechreading can be facilitated by paralinguistic information as constituted by facially displayed emotions. (b) The facilitatory effect of emitted emotional cues is mediated by their degree of specification in transmission and ambiguity as percepts, and by how distinct the perceived emotions combined with topical cues are as cues for lexical access. (c) Facially displayed emotions affect speech perception by conveying semantic cues; no effect via enhanced articulatory distinctiveness, nor via an emotion-related state in the perceiver, is needed for facilitation. (d) The combined findings suggest that emotional and topical cues provide constraints for activation spreading in the lexicon. (e) Both bottom-up and top-down factors are associated with perception of poorly specified speech, indicating that variation in information-processing abilities is a crucial factor for perception when sensory input is impoverished. A conceptual framework for speech perception, comprising specification of the linguistic and paralinguistic information, as well as distinctiveness of primes, is presented. Generalisations of the findings to other forms of paralanguage and language processing are discussed.

    Transmedia Narratives in Education: The Potentials of Multisensory Emotional Arousal in Teaching and Learning Contexts

    The role of the teacher has changed radically with the introduction of digital media into the different stages of the educational process. The teacher is no longer the repository of knowledge but acts as a dramatist who creates transmedia narratives to engage students in their access to knowledge. Teaching and learning experiences are a complex formed by at least visual, auditory, and verbal stimuli, combined in specific modes that stimulate multilayered, sensitive emotional experiences. These experiences should be conceptualized as one interconnected complex, because students need to develop tools for interpretation, negotiation, and meaning-making of the information they are constantly exposed to. This contribution presents an interdisciplinary pedagogical project that used transmedia narratives in the field of art education, stressing the potential of multisensory emotional arousal to increase the likelihood of memory consolidation, the process of creating a permanent record of the encoded information.

    Audiovisual integration of emotional signals from others' social interactions

    Audiovisual perception of emotions has typically been examined using displays of a solitary character (e.g., the face-voice and/or body-sound of one actor). However, in real life humans often face more complex multisensory social situations, involving more than one person. Here we ask whether the audiovisual facilitation in emotion recognition previously found in simpler social situations extends to more complex and ecological situations. Stimuli consisting of the biological motion and voices of two interacting agents were used in two experiments. In Experiment 1, participants were presented with visual, auditory, auditory filtered/noisy, and audiovisual congruent and incongruent clips. We asked participants to judge whether the two agents were interacting happily or angrily. In Experiment 2, another group of participants repeated the same task as in Experiment 1, while trying to ignore either the visual or the auditory information. The findings from both experiments indicate that when the reliability of the auditory cue was decreased, participants gave more weight to the visual cue in their emotional judgments. This in turn translated into increased emotion recognition accuracy for the multisensory condition. Our findings thus point to a common mechanism of multisensory integration of emotional signals irrespective of social stimulus complexity.

    Sound Perception: Encapsulating Intangible Voice Memories in a Physical Memento

    We live in a very busy world with a variety of sensory stimulation, including the olfactory, visual, tactile, and auditory. The five senses are triggered by our surroundings and help us to form meaning about the world. Based on where someone grows up, she or he is introduced to various sights and sounds, affecting how they interpret the world. Sounds relate meaning through the association between hearing, memory, and an event. Hearing is one of the learning processes in which individuals give, receive, and store information. We typically rely on our five senses, which contribute to the process of understanding, communicating, and comprehending information. Moving beyond visual perception requires systematic attention to individual learning modalities. Sound is one of the developing areas in the field of perception that moves beyond vision to help people understand nature, objects, narratives, and varieties of perception. In order to comprehend how people hear, it is important to understand the role of perception. Sound functions as a signal, but also varies according to the capacity to hear. An individual’s physical ability to hear, and their unique experiences with sound, differ from one person to the next, and can result in a range of emotions and reactions. Certain sounds, like the voice of a loved one, also have the power to trigger emotion and convey meaning due to the association between hearing, memory, and specific events from one’s past. In short, the three aspects of sound perception (signal, hearing, and emotional reaction) play an integral role in auditory perception and the subjectivity of sound. However, the value of sound is often taken for granted or viewed as secondary to visual perception. This thesis will explore the value of sound perception by investigating two of its primary aspects, hearing and emotional response, in application to memory. Through a series of experiential objects that trigger the senses, the aim is to utilize design to memorialize precious sounds in order to raise awareness about the emotional value of sound to the human experience.

    Origin of symbol-using systems: speech, but not sign, without the semantic urge

    Natural language, spoken and signed, is a multichannel phenomenon, involving facial and body expression, and vocal and visual intonation, that is often used in the service of a social urge to communicate meaning. Given that iconicity seems easier and less abstract than making arbitrary connections between sound and meaning, iconicity and gesture have often been invoked in the origin of language alongside the urge to convey meaning. To get a fresh perspective, we critically distinguish the origin of a system capable of evolution from the subsequent evolution that system becomes capable of. Human language arose on a substrate of a system already capable of Darwinian evolution; the genetically supported, uniquely human ability to learn a language reflects a key contact point between Darwinian evolution and language. Though implemented in brains generated by DNA symbols coding for protein meaning, the second, higher-level symbol-using system of language now operates in a world mostly decoupled from Darwinian evolutionary constraints. Examination of the Darwinian evolution of vocal learning in other animals suggests that the initial fixation of a key prerequisite to language into the human genome may actually have required initially side-stepping not only iconicity, but the urge to mean itself. If sign languages came later, they would not have faced this constraint.

    A neural marker for social bias toward in-group accents

    Accents provide information about the speaker's geographical, socio-economic, and ethnic background. Research in applied psychology and sociolinguistics suggests that we generally prefer our own accent to other varieties of our native language and attribute more positive traits to it. Despite the widespread influence of accents on social interactions and on educational and work settings, the neural underpinnings of this social bias toward our own accent, and what may drive this bias, are unexplored. We measured brain activity while participants from two different geographical backgrounds listened passively to three English accent types embedded in an adaptation design. Cerebral activity in several regions, including the bilateral amygdalae, revealed a significant interaction between the participants' own accent and the accent they listened to: while repetition of their own accent elicited an enhanced neural response, repetition of the other group's accent resulted in reduced responses classically associated with adaptation. Our findings suggest that increased social relevance of, or greater emotional sensitivity to, in-group accents may underlie the own-accent bias. Our results provide a neural marker for the bias associated with accents, and show, for the first time, that the neural response to speech is partly shaped by the geographical background of the listener.