
    Multisensory self-motion processing in humans

    Humans obtain and process sensory information from various modalities to ensure successful navigation through the environment. While visual, vestibular, and auditory self-motion perception have been extensively investigated, studies on tactile self-motion perception are comparably rare. In my thesis, I have investigated tactile self-motion perception and its interaction with the visual modality. In one of two behavioral studies, I analyzed the influence of a tactile heading stimulus introduced as a distractor on visual heading perception. In the second behavioral study, I analyzed visuo-tactile perception of self-motion direction (heading). In both studies, visual self-motion was simulated as forward motion over a 2D ground plane. Tactile self-motion was simulated by airflow towards the subjects' forehead, mimicking the experience of travel wind, e.g., during a bike ride. In the analysis of the subjects' perceptual reports, I focused on possible visuo-tactile interactions and applied different models to describe the integration of visuo-tactile heading stimuli. Lastly, in a functional magnetic resonance imaging (fMRI) study, I investigated neural correlates of visual and tactile perception of traveled distance (path integration) and its modulation by prediction and cognitive task demands.
    In my first behavioral study, subjects indicated perceived heading from unimodal visual (optic flow), unimodal tactile (tactile flow), or combined stimuli from both modalities, simulating either congruent or incongruent heading (bimodal condition). In the bimodal condition, the subjects' task was to indicate visually perceived heading; hence, the tactile stimuli were behaviorally irrelevant. In bimodal trials, I found a significant interaction of stimuli from both modalities: visually perceived heading was biased towards the tactile heading direction for offsets of up to 10° between the two heading directions.
    The relative weighting of stimuli from both modalities in the visuo-tactile interaction was examined in my second behavioral study. Subjects indicated perceived heading in unimodal visual, unimodal tactile, and bimodal trials. Here, in bimodal trials, stimuli from both modalities were presented as behaviorally relevant. By varying eye position relative to head position during stimulus presentation, possible influences of the different reference frames of the visual and tactile modalities were investigated. In each sensory modality, incoming information is encoded relative to the reference system of the receiving sensory organ (e.g., relative to the retina in vision or relative to the skin in somatosensation). In unimodal tactile trials, heading perception was shifted towards eye position. In bimodal trials, varying head and eye position had no significant effect on perceived heading: subjects indicated perceived heading based on both the visual and the tactile stimulus, independently of the behavioral relevance of the tactile stimulus. In sum, the results of both studies suggest that the tactile modality plays a greater role in self-motion perception than previously thought.
    Besides the perception of travel direction (heading), information about traveled speed and duration is integrated to achieve a measure of the distance traveled (path integration). One previous behavioral study has shown that tactile flow can be used for the reproduction of travel distance (Churan et al., 2017). However, studies on the neural correlates of tactile distance encoding in humans are lacking entirely.
    In my third study, subjects solved two path integration tasks from unimodal visual and unimodal tactile self-motion stimuli while brain activity was measured by means of fMRI. The two tasks differed in their cognitive task demands. In the first task, subjects replicated (Active trial) a previously observed traveled distance (Passive trial) (= Reproduction task). In the second task, subjects traveled a self-chosen distance (Active trial), which was then recorded and played back to them (Passive trial) (= Self task). Predictive coding theory postulates an internal model which creates predictions about sensory outcomes; mismatches between predictions and sensory input enable the system to sharpen future predictions (Teufel et al., 2018). Recent studies suggested a synergistic interaction between prediction and cognitive demands, thereby reversing the attenuating effect of prediction. In my study, this hypothesis was tested by manipulating cognitive demands between the two tasks. For both tasks, Active trials compared to Passive trials showed BOLD enhancement in early sensory cortices and suppression in higher-order areas (e.g., the inferior parietal lobule (IPL)). For both modalities, enhancement of early sensory areas might facilitate the task-solving processes at hand, thereby reversing the hypothesized attenuating effect of prediction. Suppression of the IPL indicates this area as an amodal comparator of predictions and incoming self-motion signals.
    In conclusion, I was able to show that tactile self-motion information, i.e., tactile flow, provides significant information for the processing of two key features of self-motion perception: heading and path integration. Neural correlates of tactile path integration were investigated by means of fMRI, showing similarities between visual and tactile path integration at early processing stages as well as shared neural substrates in higher-order areas located in the IPL. Future studies should further investigate the perception of different self-motion parameters in the tactile modality to extend the understanding of this less researched but important modality.
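
    To make the candidate integration models mentioned above concrete, the sketch below implements reliability-weighted (maximum-likelihood) cue combination, one standard model for bimodal heading judgments: each cue is weighted by its inverse variance, so the fused estimate is pulled toward the more reliable cue. All numbers and variable names are illustrative assumptions, not values from the thesis.

```python
import numpy as np

def ml_heading_estimate(h_vis, sigma_vis, h_tac, sigma_tac):
    """Maximum-likelihood fusion of two heading cues under independent
    Gaussian noise: inverse-variance weighting yields a fused estimate
    that is more precise than either unimodal estimate alone."""
    w_vis = (1 / sigma_vis**2) / (1 / sigma_vis**2 + 1 / sigma_tac**2)
    h_hat = w_vis * h_vis + (1 - w_vis) * h_tac          # fused heading
    sigma_hat = np.sqrt((sigma_vis**2 * sigma_tac**2) /
                        (sigma_vis**2 + sigma_tac**2))   # fused SD
    return h_hat, sigma_hat

# Hypothetical values: visual heading 0 deg (SD 3 deg), tactile 10 deg (SD 6 deg).
print(ml_heading_estimate(0.0, 3.0, 10.0, 6.0))
# -> (2.0, ~2.68): the estimate is biased toward the more reliable visual cue.
```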

    Neural Mechanisms of Sensory Integration: Frequency Domain Analysis of Spike and Field Potential Activity During Arm Position Maintenance with and Without Visual Feedback

    Understanding where our bodies are in space is imperative for motor control, particularly for actions such as goal-directed reaching. Multisensory integration is crucial for reducing uncertainty in arm position estimates. This dissertation examines time- and frequency-domain correlates of visual-proprioceptive integration during an arm-position maintenance task. Neural recordings were obtained from two cortical areas, the superior parietal lobule (SPL) and the inferior parietal lobule (IPL), as non-human primates performed a center-out reaching task in a virtual reality environment. Following a reach, animals maintained the end-point position of their arm under unimodal (proprioception only) and bimodal (proprioception and vision) conditions. In both areas, time-domain and multi-taper spectral analysis methods were used to quantify changes in spiking, local field potential (LFP), and spike-field coherence during arm-position maintenance, and individual neurons were classified based on the spectrum of their spiking patterns. A large proportion of cells in the SPL exhibited sensory-condition-specific oscillatory spiking in the beta (13-30 Hz) frequency band, whereas cells in the IPL typically showed a more diverse mix of oscillatory and refractory spiking patterns in response to changing sensory conditions. Contrary to the assumptions made in many modelling studies, none of the cells in the SPL or IPL exhibited Poisson spiking statistics. Evoked LFPs in both areas exhibited greater effects of target location than of visual condition, though evoked responses in the preferred reach direction were generally suppressed in the bimodal condition relative to the unimodal condition. Significant effects of target location on evoked responses were also observed during the movement period of the task. In the frequency domain, LFP power in both cortical areas was enhanced in the beta band during the position-estimation epoch of the task, indicating that LFP beta oscillations may be important for maintaining the ongoing state. This was particularly evident at the population level, with a clear increase in alpha and beta power. Differences in spectral power between conditions also became apparent at the population level, with power during bimodal trials suppressed relative to unimodal trials. The spike-field coherence yielded inconclusive results in both the SPL and the IPL, with no clear correlation between the incidence of beta oscillations and significant beta coherence.
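
    As an illustration of the spectral methods described above, the sketch below estimates spike-field coherence between a binned spike train and an LFP trace. It substitutes SciPy's Welch-based coherence estimator for the multi-taper estimator used in the dissertation, and all signals and parameters are simulated assumptions for demonstration only.

```python
import numpy as np
from scipy.signal import coherence

fs = 1000.0                       # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)      # 10 s of simulated data
rng = np.random.default_rng(0)

# Toy LFP with a 20 Hz (beta-band) component plus noise.
lfp = np.sin(2 * np.pi * 20 * t) + rng.normal(0, 1, t.size)

# Toy spike train whose firing probability is modulated by the beta cycle.
rate = 20 * (1 + 0.8 * np.sin(2 * np.pi * 20 * t))       # instantaneous rate, spikes/s
spikes = (rng.random(t.size) < rate / fs).astype(float)  # binary spike bins

# Welch-based coherence between the binned spike train and the LFP.
f, cxy = coherence(spikes, lfp, fs=fs, nperseg=1024)
beta = (f >= 13) & (f <= 30)
print("peak beta-band spike-field coherence:", cxy[beta].max())
```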

    Perceptual Strategies and Neuronal Underpinnings underlying Pattern Recognition through Visual and Tactile Sensory Modalities in Rats

    The aim of my PhD project was to investigate multisensory perception and multimodal recognition abilities in the rat, to better understand the underlying perceptual strategies and neuronal mechanisms. I chose to carry out this project on the laboratory rat for two reasons. First, the rat is a flexible and highly accessible experimental model, in which it is possible to combine state-of-the-art neurophysiological approaches (such as multi-electrode neuronal recordings) with behavioral investigation of perception and, more generally, cognition. Second, extensive research concerning multimodal integration has already been conducted in this species, at both the neurophysiological and behavioral level. My thesis work was organized in two projects: a psychophysical assessment of object categorization abilities in rats, and a neurophysiological study of neuronal tuning in the primary visual cortex of anaesthetized rats. In both experiments, unisensory (visual and tactile) and multisensory (visuo-tactile) stimulation was used for training and testing, depending on the task. The first project required the development of a new experimental rig for the study of object categorization in the rat, using solid objects, so as to assess recognition abilities under different modalities: vision, touch, and both together. The second project involved an electrophysiological study of rat primary visual cortex during visual, tactile, and visuo-tactile stimulation, with the aim of understanding whether any interaction between these modalities exists in an area that is mainly devoted to one of them. The results of both studies are still preliminary, but they already offer some interesting insights into the defining features of these abilities.

    Presence 2005: the eighth annual international workshop on presence, 21-23 September, 2005 University College London (Conference proceedings)

    OVERVIEW (taken from the CALL FOR PAPERS) Academics and practitioners with an interest in the concept of (tele)presence are invited to submit their work for presentation at PRESENCE 2005 at University College London in London, England, September 21-23, 2005. The eighth in a series of highly successful international workshops, PRESENCE 2005 will provide an open discussion forum to share ideas regarding concepts and theories, measurement techniques, technology, and applications related to presence: the psychological state or subjective perception in which a person fails to accurately and completely acknowledge the role of technology in an experience, including the sense of 'being there' experienced by users of advanced media such as virtual reality. The concept of presence in virtual environments has been around for at least 15 years, and the earlier idea of telepresence at least since Minsky's seminal paper in 1980. Recently, the European FET Presence Research initiative has for the first time funded a burst of research activity in this area. What do we really know about presence and its determinants? How can presence be successfully delivered with today's technology? This conference invites papers that are based on empirical results from studies of presence and related issues and/or that contribute to the technology for the delivery of presence. Papers that make substantial advances in the theoretical understanding of presence are also welcome. The interest is not solely in virtual environments but also in mixed reality environments. Submissions will be reviewed more rigorously than in previous conferences, and high-quality papers are therefore sought which make substantial contributions to the field. Approximately 20 papers will be selected for two successive special issues of the journal Presence: Teleoperators and Virtual Environments. PRESENCE 2005 takes place in London and is hosted by University College London. The conference is organized by ISPR, the International Society for Presence Research, and is supported by the European Commission's FET Presence Research Initiative through the Presencia and IST OMNIPRES projects, and by University College London.

    Perception and processing of self-motion cues

    The capacity of animals to navigate through familiar or novel environments depends crucially on the integration of a disparate set of self-motion cues. The study begins with one of the simplest cues, planar visual motion, and investigates the cortical organisation of motion-sensitive areas, finding evidence of columnar organisation in hMT+ and a large-scale map in V1. Chapter 3 extends this by using stimuli designed to emulate visual and auditory forward motion. It finds that participants are able to determine their direction of motion with a precision close to that predicted by Bayesian integration. Predictions regarding neural processing were made with a modified divisive normalisation model, which was also used to fit the behavioural adaptation results. The integration of different modalities requires visual and auditory streams to combine at some stage within the sensory processing hierarchy, and previous research suggests the ventral intraparietal region (VIP) may be the seat of such integration. Chapter 4 tests whether VIP does combine these cues and whether the correlation between VIP and the unimodal regions changes depending on the coherence of the unimodal stimuli; the presence of such modulation is predicted by some models, including the divisive normalisation model. The processing of such egocentric self-motion cues leads to the updating of allocentric representations, which are believed to be encoded by head direction cells and place cells. The experiment in chapter 5 uses a virtual reality stimulus during fMRI scanning to give participants the sense of moving and navigating. Their location in the virtual environment was decoded above chance from voxels in the hippocampus, but no head direction signal was classified above chance in any of the three cortical regions investigated. We tentatively conclude that head direction is considerably more difficult to classify from the BOLD signal, possibly due to the homogeneous organisation of head direction cells.
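
    For readers unfamiliar with the divisive normalisation model referenced above, the sketch below implements the canonical form of the computation (Carandini & Heeger, 2012), not the modified variant used in the thesis; the inputs and constants are illustrative assumptions.

```python
import numpy as np

def divisive_normalization(drive, sigma=1.0, n=2.0):
    """Canonical divisive normalization: each unit's driven input is
    divided by the summed activity of the population plus a
    semi-saturation constant, so responses encode relative drive."""
    d = np.asarray(drive, dtype=float) ** n
    return d / (sigma ** n + d.sum())

# Two units driven by, e.g., visual and auditory motion cues of unequal strength.
print(divisive_normalization([3.0, 1.0]))  # [~0.82, ~0.09]: the stronger cue dominates
print(divisive_normalization([3.0, 3.0]))  # [~0.47, ~0.47]: matched cues share the gain
```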

    Spatial cell firing during virtual navigation of open arenas by head-restrained mice

    We present a mouse virtual reality (VR) system which restrains head movements to horizontal rotations, compatible with multi-photon imaging. This system allows expression of the spatial navigation and neuronal firing patterns characteristic of real open arenas (R). Comparing VR to R: place and grid, but not head-direction, cell firing had broader spatial tuning; place, but not grid, cell firing was more directional; theta frequency increased less with running speed; whereas increases in firing rates with running speed, and place and grid cells' theta phase precession, were similar. These results suggest that the omni-directional place cell firing in R may require local cues unavailable in VR, and that the scale of grid and place cell firing patterns, and theta frequency, reflect translational motion inferred from both virtual (visual and proprioceptive) and real (vestibular translation and extra-maze) cues. By contrast, firing rates and theta phase precession appear to reflect visual and proprioceptive cues alone.

    Sensor Fusion in the Perception of Self-Motion

    This dissertation has been written at the Max Planck Institute for Biological Cybernetics (Max-Planck-Institut für Biologische Kybernetik) in Tübingen, in the department of Prof. Dr. Heinrich H. Bülthoff. The work received university support from Prof. Dr. Günther Palm (University of Ulm, Abteilung Neuroinformatik). The main evaluators are Prof. Dr. Günther Palm, Prof. Dr. Wolfgang Becker (University of Ulm, Sektion Neurophysiologie), and Prof. Dr. Heinrich Bülthoff.
    The goal of this thesis was to investigate the integration of different sensory modalities in the perception of self-motion, using psychophysical methods. Experiments with healthy human participants were to be designed for and performed in the Motion Lab, which is equipped with a simulator platform and projection screen. Results from the psychophysical experiments were to be used to refine models of the multisensory integration process, with an emphasis on Bayesian (maximum likelihood) integration mechanisms.
    To put the psychophysical experiments into the larger framework of research on multisensory integration in the brain, results of neuroanatomical and neurophysiological experiments on multisensory integration are also reviewed.

    Cortical circuits for integration of self-motion and visual-motion signals.

    The cerebral cortex contains cells which respond to movement of the head, and these cells are thought to be involved in the perception of self-motion. In particular, studies in the primary visual cortex of mice show that both running speed and passive whole-body rotation modulate neuronal activity, and modern genetically targeted viral tracing approaches have begun to identify previously unknown circuits that underlie these responses. Here we review recent experimental findings and provide a road map for future work in mice to elucidate the functional architecture and emergent properties of a cortical network potentially involved in the generation of egocentric-based visual representations for navigation.