
    Compression of Auditory Space during Forward Self-Motion

    Background: Spatial inputs from the auditory periphery change with movements of the head or whole body relative to a sound source. Nevertheless, humans perceive a stable auditory environment and react appropriately to sound sources. This suggests that the inputs are reinterpreted in the brain while being integrated with information about the movements. Little is known, however, about how these movements modulate auditory perceptual processing. Here, we investigate the effect of linear acceleration on auditory space representation.

    Methodology/Principal Findings: Participants were passively transported forward or backward at constant acceleration in a robotic wheelchair. An array of loudspeakers was aligned parallel to the motion direction along a wall to the right of the listener. A short noise burst was presented from one of the loudspeakers during self-motion when the listener's physical coronal plane reached the location of one of the speakers (the null point). In Experiments 1 and 2, participants indicated whether the sound was presented forward or backward relative to their subjective coronal plane. The sound position aligned with the subjective coronal plane was displaced ahead of the null point only during forward self-motion, and the magnitude of the displacement increased with acceleration. Experiment 3 investigated the structure of auditory space in the traveling direction during forward self-motion. Sounds were presented at various distances from the null point, and participants indicated the perceived sound location by pointing with a rod. All sounds actually located in the traveling direction were perceived as biased towards the null point.

    Conclusions/Significance: These results suggest a distortion of auditory space in the direction of movement during forward self-motion. The underlying mechanism might involve anticipatory shifts in auditory receptive field locations driven by afferent signals from the vestibular system.

    Audio-Visual Speech Timing Sensitivity Is Enhanced in Cluttered Conditions

    Events encoded in separate sensory modalities, such as audition and vision, can seem to be synchronous across a relatively broad range of physical timing differences. This may suggest that the precision of audio-visual timing judgments is inherently poor. Here we show that this is not necessarily true. We contrast timing sensitivity for isolated streams of audio and visual speech, and for streams of audio and visual speech accompanied by additional, temporally offset, visual speech streams. We find that the precision with which synchronous streams of audio and visual speech are identified is enhanced by the presence of additional streams of asynchronous visual speech. Our data suggest that timing perception is shaped by selective grouping processes, which can result in enhanced precision in temporally cluttered environments. The imprecision suggested by previous studies might therefore be a consequence of examining isolated pairs of audio and visual events. We argue that when an isolated pair of cross-modal events is presented, they tend to group perceptually and to seem synchronous as a consequence. We have revealed greater precision by providing multiple visual signals, possibly allowing a single auditory speech stream to group selectively with the most synchronous visual candidate. The grouping processes we have identified might be important in daily life, such as when we attempt to follow a conversation in a crowded room.

    The Nature of Working Memory for Braille

    Blind individuals have been shown on multiple occasions to compensate for their loss of sight by developing exceptional abilities in their remaining senses. While most research has focused on perceptual abilities per se in the auditory and tactile modalities, recent work has also investigated higher-order processes involving memory and language functions. Here we examined tactile working memory for Braille in two groups of visually challenged individuals: completely blind subjects (CBS) and blind subjects with residual vision (BRV). In a first experimental procedure, both groups were given a Braille tactile memory span task with and without articulatory suppression, while the BRV group and a sighted group performed a visual version of the task. The Braille tactile working memory (BrWM) of CBS individuals under articulatory suppression proved as efficient as sighted individuals' visual working memory in the same condition. Moreover, the results suggest that BrWM may be more robust in the CBS than in the BRV subjects, pointing to the potential role of visual experience in shaping tactile working memory. A second experiment, designed to assess the nature (spatial vs. verbal) of this working memory, was then carried out with two new CBS and BRV groups performing the Braille task concurrently with either a mental arithmetic task or a mental displacement-of-blocks task. We show that the disruption of memory was greatest when concurrently carrying out the mental displacement of blocks, indicating that the Braille tactile subsystem of working memory is likely spatial in nature in CBS. The results also point to the multimodal nature of working memory and show how experience can shape the development of its subcomponents.

    Population genomics of Drosophila suzukii reveal longitudinal population structure and signals of migrations in and out of the continental United States

    Drosophila suzukii, or spotted-wing drosophila, is now an established pest in many parts of the world, causing significant damage to numerous fruit crop industries. Native to East Asia, D. suzukii arrived in the United States (U.S.) a decade ago, and infestations now occupy a wide range of climates. To better understand the invasion ecology of this pest, knowledge of past migration events, population structure, and genetic diversity is needed. In this study, we sequenced whole genomes of 237 individual flies collected across the continental U.S., as well as at several sites in Europe, Brazil, and Asia, to identify and analyze hundreds of thousands of genetic markers. We observed strong population structure between Western and Eastern U.S. populations, but no evidence of population structure between different latitudes within the continental U.S., suggesting that no broad-scale adaptation is occurring in response to differences in winter climates. We detect admixture from Hawaii to the Western U.S. and from the Eastern U.S. to Europe, in agreement with previously identified introduction routes inferred from microsatellite analysis. We also detect potential signals of admixture from the Western U.S. back to Asia, which could have important implications for shipping and quarantine policies for exported agriculture. We anticipate that this large genomic dataset will spur future research into the genomic adaptations underlying D. suzukii pest activity and the development of novel control methods for this agricultural pest.
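
    Population structure of the kind reported here is commonly given a first-pass summary by principal component analysis of a genotype matrix. The sketch below illustrates that generic approach only; it is not the study's pipeline, and the genotype data and "west"/"east" labels are synthetic stand-ins:

    ```python
    import numpy as np

    def genotype_pca(genotypes, n_components=2):
        """PCA of a (samples x SNPs) genotype matrix coded 0/1/2."""
        g = np.asarray(genotypes, dtype=float)
        g = g - g.mean(axis=0)               # centre each SNP column
        std = g.std(axis=0)
        g[:, std > 0] /= std[std > 0]        # scale polymorphic SNPs
        # SVD of the standardized matrix yields the principal components.
        u, s, _ = np.linalg.svd(g, full_matrices=False)
        return u[:, :n_components] * s[:n_components]

    # Synthetic example: two populations with different allele frequencies.
    rng = np.random.default_rng(0)
    west = rng.binomial(2, 0.2, size=(20, 500))   # hypothetical "Western" samples
    east = rng.binomial(2, 0.8, size=(20, 500))   # hypothetical "Eastern" samples
    coords = genotype_pca(np.vstack([west, east]))
    # The first principal component cleanly separates the two groups.
    ```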

    Neural Correlates of Visual Motion Prediction

    Predicting the trajectories of moving objects in our surroundings is important for many life scenarios, such as driving, walking, reaching, hunting, and combat. We determined human subjects' performance and task-related brain activity in a motion trajectory prediction task. The task required spatial and motion working memory as well as the ability to extrapolate motion information in time to predict future object locations. We showed that the neural circuits associated with motion prediction included frontal, parietal, and insular cortex, as well as the thalamus and the visual cortex. Interestingly, deactivation of many of these regions seemed to be more closely related to task performance. The differential activity during motion prediction vs. direct observation was also correlated with task performance. The neural networks involved in our visual motion prediction task are significantly different from those that underlie visual motion memory and imagery. Our results set the stage for examining the effects of deficiencies in these networks, such as those caused by aging and mental disorders, on visual motion prediction and its consequences for mobility-related daily activities.
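
    The extrapolation component of such a task can be illustrated with a simple constant-velocity predictor. The abstract does not specify a computational model, so the sketch below is purely illustrative, with made-up trajectory samples:

    ```python
    import numpy as np

    def predict_position(times, positions, t_future):
        """Extrapolate a future location from an observed trajectory.

        Fits a constant-velocity (linear) model, position = x0 + v * t,
        to the visible samples and evaluates it at a later time.
        """
        v, x0 = np.polyfit(np.asarray(times, float),
                           np.asarray(positions, float), deg=1)
        return x0 + v * t_future

    # Object observed for 0.4 s moving at roughly 10 units/s; predict
    # where it will be 1.0 s after observation began.
    times = [0.0, 0.1, 0.2, 0.3, 0.4]
    positions = [0.0, 1.1, 1.9, 3.0, 4.1]
    print(predict_position(times, positions, 1.0))  # ~10.1
    ```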

    Bayesian Cue Integration as a Developmental Outcome of Reward Mediated Learning

    Average human behavior in cue combination tasks is well predicted by Bayesian inference models. As this capability is acquired over developmental timescales, the question arises of how it is learned. Here we investigated whether reward-dependent learning, which is well established at the computational, behavioral, and neuronal levels, could contribute to this development. We show that a model-free reinforcement learning algorithm can indeed learn to do cue integration, i.e. to weight uncertain cues according to their respective reliabilities, and can do so even when the reliabilities are changing. We also consider the case of causal inference, where multimodal signals can originate from one or from multiple separate objects and should not always be integrated. In this case, the learner is shown to develop a behavior closest to Bayesian model averaging. We conclude that reward-mediated learning could be a driving force for the development of cue integration and causal inference.
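
    The Bayesian cue-combination model referenced here predicts that cues are weighted by their reliabilities, i.e. their inverse variances. A minimal sketch of that reliability-weighted combination, with illustrative cue values rather than anything from the paper:

    ```python
    import numpy as np

    def integrate_cues(means, variances):
        """Combine independent Gaussian cue estimates by reliability.

        Each cue is weighted by its inverse variance, the optimal
        weighting under the standard Bayesian cue-combination model.
        """
        means = np.asarray(means, dtype=float)
        reliabilities = 1.0 / np.asarray(variances, dtype=float)
        weights = reliabilities / reliabilities.sum()
        fused_mean = float(np.sum(weights * means))
        fused_variance = 1.0 / reliabilities.sum()
        return fused_mean, fused_variance

    # Example: a reliable visual cue at 0.0 and a noisy auditory cue at 2.0.
    # The fused estimate (0.4) sits closer to the more reliable cue, and its
    # variance (0.8) is lower than either cue's alone.
    print(integrate_cues([0.0, 2.0], [1.0, 4.0]))
    ```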

    Incidental sounds of locomotion in animal cognition

    The highly synchronized formations that characterize schooling in fish and the flight of certain bird groups have frequently been explained as reducing energy expenditure. I present an alternative, or complementary, hypothesis: that synchronization of group movements may improve hearing perception. Although incidental sounds produced as a by-product of locomotion (ISOL) are an almost constant presence for most animals, their impact on perception and cognition has been little discussed. One consequence of ISOL may be the masking of critical sound signals in the surroundings. Birds in flight may generate significant noise; some produce wing beats that are readily heard on the ground at some distance from the source. Synchronization of group movements might reduce auditory masking through periods of relative silence and facilitate auditory grouping processes. Respiratory-locomotor coupling and intermittent flight may be other means of reducing masking and improving hearing perception. A distinct border between ISOL and communicative signals is difficult to delineate. ISOL seems to be used by schooling fish as an aid to staying in formation and avoiding collisions; bird and bat flocks may use ISOL in an analogous way. The interaction of ISOL with animal perception, cognition, and synchronized behavior provides an interesting area for future study.

    Tactile localization biases are modulated by gaze direction

    Identifying the spatial location of touch on the skin surface is a fundamental function of our somatosensory system. Despite the fact that stimulation of even single mechanoreceptive afferent fibres is sufficient to produce clearly localised percepts, tactile localisation can also be modulated by higher-level processes such as body posture. This suggests that tactile events are coded using multiple representations based on different coordinate systems. Recent reports provide evidence for systematic biases in tactile localisation tasks, which are thought to result from a supramodal representation of the skin surface. While the influence of non-informative vision of the body and of gaze direction on tactile discrimination tasks has been extensively studied, their effects on tactile localisation tasks remain largely unexplored. To address this question, participants performed a tactile localisation task on their left hand under different visual conditions by means of a mirror box: in the mirror condition, a single stimulus was delivered to the participants' left hand while the reflection of the right hand was seen through the mirror; in the object condition, participants looked at a box through the mirror; and in the right-hand condition, participants looked directly at their right hand. Participants reported the location of the tactile stimuli using a silhouette of a hand. Results showed a shift in the localisation of the touches towards the tips of the fingers (distal bias) and towards the thumb (radial bias) across conditions. Critically, distal biases were reduced when participants looked towards the mirror compared to when they looked at their right hand, suggesting that gaze direction reduces the typical proximo-distal biases in tactile localisation. Moreover, vision of the hand modulates the internal configuration of perceived stimulus locations, elongating it along the radio-ulnar axis.