
    Auditory Experiences in Game Transfer Phenomena:

    This study investigated gamers’ auditory experiences as after-effects of playing. This was done by classifying, quantifying, and analysing 192 experiences from 155 gamers collected from online videogame forums. The gamers’ experiences were classified as: (i) auditory imagery (e.g., constantly hearing the music from the game), (ii) inner speech (e.g., completing phrases in the mind), (iii) auditory misperceptions (e.g., confusing real-life sounds with videogame sounds), and (iv) multisensorial auditory experiences (e.g., hearing music while involuntarily moving the fingers). Gamers heard auditory cues from the game in their heads and in their ears, but also coming from external sources. Occasionally, the vividness of the sound evoked thoughts and emotions that resulted in behaviours and coping strategies. The psychosocial implications of the gamers’ auditory experiences are discussed. This study contributes to the understanding of the effects of auditory features in videogames, and to the phenomenology of non-volitional experiences (e.g., auditory imagery, auditory hallucinations).

    The role of perceived source location in auditory stream segregation: separation affects sound organization, common fate does not

    The human auditory system is capable of grouping sounds originating from different sound sources into coherent auditory streams, a process termed auditory stream segregation. Several cues can influence auditory stream segregation, but the full set of cues and the way in which they are integrated is still unknown. In the current study, we tested whether auditory motion can serve as a cue for segregating sequences of tones. Our hypothesis was that, following the principle of common fate, sounds emitted by sources moving together in space along similar trajectories would be more likely to be grouped into a single auditory stream, while sounds emitted by independently moving sources would more often be heard as two streams. Stimuli were derived from sound recordings in which the sound source motion was induced by walking humans. Although the results showed a clear effect of spatial separation, auditory motion had a negligible influence on stream segregation. Hence, auditory motion may not be used as a primitive cue in auditory stream segregation.

    Neural Mechanisms of Selective Auditory Attention in Rats (Dissertation)

    How does attention modulate sensory representations? To probe the underlying neural mechanisms, we established a simple rodent model of modality-specific attention. Rats were trained to perform distinct auditory two-tone discrimination and olfactory odor discrimination tasks in a two-alternative choice (2AC) paradigm.
    To determine the auditory cortex’s role in this frequency discrimination task, we used the GABA-A receptor agonist muscimol to transiently and reversibly inactivate the auditory cortex bilaterally in rats performing simple interleaved auditory and olfactory discriminations. With olfactory discrimination performance serving as an internal control for motivation and decision-making capability, we found that only auditory two-tone discrimination was selectively impaired in these rats. This shows that the auditory cortex is involved in this two-tone discrimination task.
    To investigate the neural correlate of modality-specific attention in the auditory cortex, we trained rats to perform interleaved auditory and olfactory blocks (of 50–70 trials each) in a single session. In auditory blocks, pure tones were presented either with or without a neutral odor (caproic acid; n=2 and 3, respectively), and subjects were rewarded for discriminating auditory stimuli. In olfactory blocks, both task odors and pure tones were presented simultaneously, and subjects were rewarded for discriminating olfactory stimuli. We recorded neural responses in the primary auditory cortex (area A1) of freely moving rats while they performed this behavior. Single-unit responses to tones were heterogeneous, and included transient, sustained, and suppressed responses. We found 205 of the 802 recorded units responsive to the stimuli we used. Of these 205 units, 18.5% showed modality-specific attentional modulation of anticipatory activity before tone onset. In addition, we observed in a smaller proportion of units (11.2%) modality-specific attentional modulation of the tone-evoked responses; in most cases, the responses to a particular auditory stimulus were enhanced in the auditory block (or, equivalently, suppressed in the olfactory block). Attention increased the choice probability of the population in the auditory block. We also observed significant behavioral choice probability in a small proportion of units.
    Our results suggest that shifting attention between auditory and olfactory tasks can modulate the activity of single neurons in the primary auditory cortex.
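    The unit counts reported in this abstract can be checked with simple arithmetic; the sketch below reproduces them (the numbers are taken from the abstract, while the variable names are illustrative, not from the study):

    ```python
    # Unit-count arithmetic from the abstract: 205 of 802 recorded units
    # were stimulus-responsive; 18.5% and 11.2% of those 205 showed
    # anticipatory and tone-evoked attentional modulation, respectively.
    recorded = 802
    responsive = 205

    frac_responsive = responsive / recorded       # fraction of units responsive
    anticipatory_mod = round(0.185 * responsive)  # units with pre-tone modulation
    evoked_mod = round(0.112 * responsive)        # units with tone-evoked modulation

    print(f"{frac_responsive:.1%} of recorded units were stimulus-responsive")
    print(f"~{anticipatory_mod} units showed anticipatory attentional modulation")
    print(f"~{evoked_mod} units showed modulation of tone-evoked responses")
    ```

    This works out to roughly a quarter of recorded units being responsive, with attentional effects in a few dozen units each.
    
    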

    Auditory Discrimination and Auditory Sensory Behaviours in Autism Spectrum Disorders

    It has been hypothesised that auditory processing may be enhanced in autism spectrum disorders (ASD). We tested auditory discrimination ability in 72 adolescents with ASD (39 childhood autism; 33 other ASD) and 57 IQ- and age-matched controls, assessing their capacity for successful discrimination of the frequency, intensity and duration differences in pairs of sounds. At the group level, auditory discrimination ability did not differ between the adolescents with and without ASD. However, we found a subgroup of 20% of individuals in the ASD group who showed ‘exceptional’ frequency discrimination skills (defined as 1.65 SDs above the control mean) and who were characterised by average intellectual ability and delayed language onset. Auditory sensory behaviours (i.e. behaviours in response to auditory sensory input) are common in ASD and we hypothesised that these would relate to auditory discrimination ability. For the ASD group, poor performers on the intensity discrimination task reported more auditory sensory behaviours associated with coping with loudness levels. Conversely, those who performed well on the duration discrimination task reported more auditory sensory behaviours across the full range measured. Frequency discrimination ability did not associate with auditory sensory behaviours. We therefore conclude that (i) enhanced frequency discrimination is present in around 1 in 5 individuals with ASD and may represent a specific phenotype; and (ii) individual differences in auditory discrimination ability in ASD may influence the expression of auditory sensory behaviours by modulating the degree to which sounds are detected or missed in the environment.
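    The ‘exceptional’ criterion used here (a score more than 1.65 SDs above the control mean, roughly the top 5% of a normal distribution) can be sketched as follows; the control scores below are made up for illustration and are not the study’s data:

    ```python
    # Hedged sketch of the 'exceptional performer' cutoff described above:
    # exceptional = more than 1.65 sample SDs above the control-group mean.
    from statistics import mean, stdev

    control_scores = [50.0, 48.0, 52.0, 47.0, 53.0, 49.0, 51.0]  # hypothetical
    cutoff = mean(control_scores) + 1.65 * stdev(control_scores)

    def is_exceptional(score: float) -> bool:
        """True if a score exceeds the 1.65-SD cutoff."""
        return score > cutoff

    print(f"cutoff = {cutoff:.2f}")
    print(is_exceptional(55.0))  # well above the control mean -> True
    ```

    With this criterion, any individual scoring above the cutoff would be classed in the ‘exceptional’ subgroup, which in the study captured about 1 in 5 of the ASD participants.
    
    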

    Progressive auditory neuropathy in patients with Leber's hereditary optic neuropathy

    Objective: To investigate auditory neural involvement in patients with Leber's hereditary optic neuropathy (LHON). Methods: Auditory assessment was undertaken in two patients with LHON. One was a 45-year-old woman with Harding disease (multiple-sclerosis-like illness and positive 11778mtDNA mutation) and mild auditory symptoms, whose auditory function was monitored over five years. The other was a 59-year-old man with a positive 11778mtDNA mutation, who presented with a long-standing progressive bilateral hearing loss, moderate on one side and severe to profound on the other. Standard pure tone audiometry, tympanometry, stapedial reflex threshold measurements, stapedial reflex decay, otoacoustic emissions with olivocochlear suppression, auditory brain stem responses, and vestibular function tests were undertaken. Results: Both patients had good cochlear function, as judged by otoacoustic emissions (intact outer hair cells) and normal stapedial reflexes (intact inner hair cells). A brain stem lesion was excluded by negative findings on imaging, recordable stapedial reflex thresholds, and, in one of the patients, olivocochlear suppression of otoacoustic emissions. The deterioration of auditory function implied a progressive course in both cases. Vestibular function was unaffected. Conclusions: The findings are consistent with auditory neuropathy - a lesion of the cochlear nerve presenting with abnormal auditory brain stem responses and with normal inner hair cells and cochlear nucleus (lower brain stem). The association of auditory neuropathy, or any other auditory dysfunction, with LHON has not been recognised previously. Further studies are necessary to establish whether this is a consistent finding.

    Behavioural evidence for separate mechanisms of audiovisual temporal binding as a function of leading sensory modality

    The ability to integrate auditory and visual information is critical for effective perception and interaction with the environment, and is thought to be abnormal in some clinical populations. Several studies have investigated the time window over which audiovisual events are integrated, also called the temporal binding window, and revealed asymmetries depending on the order of audiovisual input (i.e. the leading sense). When judging audiovisual simultaneity, the binding window appears narrower and non-malleable for auditory-leading stimulus pairs and wider and trainable for visual-leading pairs. Here we specifically examined the level of independence of binding mechanisms when auditory-before-visual vs. visual-before-auditory input is bound. Three groups of healthy participants practiced audiovisual simultaneity detection with feedback, selectively training on auditory-leading stimulus pairs (group 1), visual-leading stimulus pairs (group 2) or both (group 3). Subsequently, we tested for learning transfer (crossover) from trained stimulus pairs to non-trained pairs with the opposite order of audiovisual input. Our data confirmed the known asymmetry in size and trainability for auditory–visual vs. visual–auditory binding windows. More importantly, practicing one type of audiovisual integration (e.g. auditory–visual) did not affect the other type (e.g. visual–auditory), even if that type was trainable by within-condition practice. Together, these results provide crucial evidence that the audiovisual temporal binding mechanisms for auditory-leading vs. visual-leading stimulus pairs are independent, possibly tapping into different circuits for audiovisual integration due to engagement of different multisensory sampling mechanisms depending on the leading sense. Our results have implications for informing the study of multisensory interactions in healthy participants and clinical populations with dysfunctional multisensory integration.
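    The asymmetric temporal binding window described here can be sketched as a simple decision rule; the window widths below are illustrative assumptions, not the study’s measured values:

    ```python
    # Minimal sketch of an asymmetric audiovisual temporal binding window:
    # narrower when audio leads, wider when vision leads, as described above.
    AUDIO_LEADING_WINDOW_MS = 80    # assumed width when audio comes first
    VISUAL_LEADING_WINDOW_MS = 200  # assumed width when vision comes first

    def judged_simultaneous(audio_lead_ms: float) -> bool:
        """audio_lead_ms > 0: the auditory stimulus came first;
        audio_lead_ms < 0: the visual stimulus came first."""
        if audio_lead_ms >= 0:
            return audio_lead_ms <= AUDIO_LEADING_WINDOW_MS
        return -audio_lead_ms <= VISUAL_LEADING_WINDOW_MS

    print(judged_simultaneous(60))    # audio leads by 60 ms  -> True
    print(judged_simultaneous(120))   # audio leads by 120 ms -> False
    print(judged_simultaneous(-120))  # vision leads by 120 ms -> True
    ```

    The same 120 ms offset falls inside the window when vision leads but outside it when audio leads, which is the asymmetry the study builds on.
    
    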

    Exploring auditory-motor interactions in normal and disordered speech

    Auditory feedback plays an important role in speech motor learning and in the online correction of speech movements. Speakers can detect and correct auditory feedback errors at the segmental and suprasegmental levels during ongoing speech. The frontal brain regions that contribute to these corrective movements have also been shown to be more active during speech in persons who stutter (PWS) compared to fluent speakers. Further, various types of altered auditory feedback can temporarily improve the fluency of PWS, suggesting that atypical auditory-motor interactions during speech may contribute to stuttering disfluencies. To investigate this possibility, we have developed and improved Audapter, a software package that enables configurable dynamic perturbation of the spatial and temporal content of the speech auditory signal in real time. Using Audapter, we have measured the compensatory responses of PWS to static and dynamic perturbations of the formant content of auditory feedback and compared these responses with those from matched fluent controls. Our findings indicate deficient utilization of auditory feedback by PWS for short-latency online control of the spatial and temporal parameters of articulation during vowel production and during running speech. These findings provide further evidence that stuttering is associated with aberrant auditory-motor integration during speech. Published version.

    Long-range coupling of prefrontal cortex and visual (MT) or polysensory (STP) cortical areas in motion perception

    To investigate how, where and when moving auditory cues interact with the perception of object motion during self-motion, we conducted psychophysical, MEG, and fMRI experiments in which the subjects viewed nine textured objects during simulated forward self-motion. On each trial, one object was randomly assigned its own looming motion within the scene. Subjects reported which of four labeled objects had independent motion within the scene in two conditions: (1) visual information only and (2) with an additional moving auditory cue. In MEG, comparison of the two conditions showed: (i) MT activity is similar across conditions; (ii) late after the stimulus presentation there is additional activity in the auditory-cue condition ventral to MT; (iii) with the auditory cue, the right auditory cortex (AC) shows early activity together with STS; (iv) these two activities have different time courses, and the STS signals occur later in the epoch together with frontal activity in the right hemisphere; (v) in the visual-only condition, activity in PPC (posterior parietal cortex) is stronger than in the auditory-cue condition. fMRI conducted for the visual-only condition reveals activations in a network of parietal and frontal areas and in MT. In addition, dynamic Granger causality analysis showed for auditory cues a strong connection of the AC with STP but not with MT, suggesting binding of visual and auditory information at STP. Also, while in the visual-only condition PFC is connected with MT, in the auditory-cue condition PFC is connected to the STP (superior temporal polysensory) area. These results indicate that PFC allocates attention to the “object” as a whole: in STP to a moving visual-auditory object, and in MT to a moving visual object. Accepted manuscript.