
    Crossmodal audio and tactile interaction with mobile touchscreens

    Touchscreen mobile devices often use cut-down versions of desktop user interfaces, placing high demands on the visual sense that may prove awkward in mobile settings. The research in this thesis addresses the problems encountered by situationally impaired mobile users by using crossmodal interaction to exploit the abundant similarities between the audio and tactile modalities. By making information available to both senses, users can receive the information in the most suitable way, without having to abandon their primary task to look at the device. This thesis begins with a literature review of related work, followed by a definition of crossmodal icons. Two icons may be considered crossmodal if and only if they provide a common representation of data that is accessible interchangeably via different modalities. Two experiments investigated possible parameters for use in crossmodal icons, with results showing that rhythm, texture and spatial location are effective. A third experiment focused on learning multi-dimensional crossmodal icons and the extent to which this learning transfers between modalities. The results showed identification rates of 92% for three-dimensional audio crossmodal icons when trained in the tactile equivalents, and identification rates of 89% for tactile crossmodal icons when trained in the audio equivalents. Crossmodal icons were then incorporated into a mobile touchscreen QWERTY keyboard. Experiments showed that keyboards with audio or tactile feedback produce fewer errors and greater text-entry speeds compared to standard touchscreen keyboards. The next study examined how environmental variables affect user performance with the same keyboard. The data showed that each modality performs differently with varying levels of background noise or vibration, and the exact levels at which these performance decreases occur were established.
The final study involved a longitudinal evaluation of a touchscreen application, CrossTrainer, focusing on longitudinal effects on performance with audio and tactile feedback, the impact of context on performance, and personal modality preference. The results show that crossmodal audio and tactile icons are a valid method of presenting information to situationally impaired mobile touchscreen users, with recognition rates of 100% over time. This thesis concludes with a set of guidelines on the design and application of crossmodal audio and tactile feedback to enable application and interface designers to employ such feedback in all systems.
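The abstract's definition of a crossmodal icon, a single data representation rendered interchangeably in audio or touch via the shared parameters rhythm, texture and spatial location, can be illustrated with a minimal sketch. All class, method and parameter names here are hypothetical illustrations, not the thesis's actual implementation:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CrossmodalIcon:
    """One shared data representation, renderable in either modality.

    The three parameters mirror those the thesis found effective:
    rhythm, texture and spatial location (values are illustrative).
    """
    rhythm: tuple          # pulse/note durations in ms
    texture: str           # e.g. "smooth" or "rough"
    spatial_location: str  # e.g. "left", "middle", "right"

    def to_audio(self):
        # Hypothetical mapping: rhythm -> tone onsets, texture -> timbre,
        # spatial location -> stereo pan position.
        return {"onsets_ms": self.rhythm,
                "timbre": self.texture,
                "pan": self.spatial_location}

    def to_tactile(self):
        # The same parameters drive a vibrotactile actuator instead.
        return {"pulse_ms": self.rhythm,
                "waveform": self.texture,
                "actuator": self.spatial_location}

icon = CrossmodalIcon(rhythm=(100, 100, 300), texture="rough",
                      spatial_location="left")
# Both renderings derive from one representation, so the information
# stays equivalent across modalities.
assert icon.to_audio()["onsets_ms"] == icon.to_tactile()["pulse_ms"]
```

The key design point is that both renderings read the same fields, which is what makes training in one modality transferable to the other.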

    Perceptual Strategies and Neuronal Underpinnings underlying Pattern Recognition through Visual and Tactile Sensory Modalities in Rats

    The aim of my PhD project was to investigate multisensory perception and multimodal recognition abilities in the rat, to better understand the underlying perceptual strategies and neuronal mechanisms. I chose to carry out this project on the laboratory rat for two reasons. First, the rat is a flexible and highly accessible experimental model, in which it is possible to combine state-of-the-art neurophysiological approaches (such as multi-electrode neuronal recordings) with behavioral investigation of perception and, more generally, cognition. Second, extensive research concerning multimodal integration has already been conducted in this species, at both the neurophysiological and behavioral levels. My thesis work was organized into two projects: a psychophysical assessment of object categorization abilities in rats, and a neurophysiological study of neuronal tuning in the primary visual cortex of anaesthetized rats. In both experiments, unisensory (visual and tactile) and multisensory (visuo-tactile) stimulation was used for training and testing, depending on the task. The first project required the development of a new experimental rig for the study of object categorization in rats, using solid objects, so as to assess their recognition abilities under different modalities: vision, touch and both together. The second project involved an electrophysiological study of rat primary visual cortex during visual, tactile and visuo-tactile stimulation, with the aim of understanding whether any interaction between these modalities exists in an area that is mainly devoted to one of them. The results of both studies are still preliminary, but they already offer some interesting insights into the defining features of these abilities.

    Proceedings of the 1st Workshop on Multi-Sensorial Approaches to Human-Food Interaction


    Sensor Fusion in the Perception of Self-Motion

    This dissertation was written at the Max Planck Institute for Biological Cybernetics (Max-Planck-Institut für Biologische Kybernetik) in Tübingen, in the department of Prof. Dr. Heinrich H. Bülthoff. The work was academically supported by Prof. Dr. Günther Palm (University of Ulm, Abteilung Neuroinformatik). The main evaluators were Prof. Dr. Günther Palm, Prof. Dr. Wolfgang Becker (University of Ulm, Sektion Neurophysiologie) and Prof. Dr. Heinrich Bülthoff. The goal of this thesis was to investigate the integration of different sensory modalities in the perception of self-motion, using psychophysical methods. Experiments with healthy human participants were designed for and performed in the Motion Lab, which is equipped with a simulator platform and projection screen. Results from the psychophysical experiments were used to refine models of the multisensory integration process, with an emphasis on Bayesian (maximum likelihood) integration mechanisms. To put the psychophysical experiments into the larger framework of research on multisensory integration in the brain, results of neuroanatomical and neurophysiological experiments on multisensory integration are also reviewed.
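The Bayesian (maximum likelihood) integration mechanism the abstract refers to is, in its standard Gaussian form, an inverse-variance weighted average of the single-cue estimates. The following sketch shows that textbook computation; the function name and the example cue values are illustrative assumptions, not data from the dissertation:

```python
def fuse_ml(estimates):
    """Maximum-likelihood (inverse-variance weighted) cue fusion.

    estimates: list of (mean, variance) pairs, one per sensory cue.
    Returns the fused (mean, variance). Under independent Gaussian
    noise, this weighting is the statistically optimal linear
    combination: more reliable cues (lower variance) get more weight.
    """
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    mean = sum(w * m for (m, _), w in zip(estimates, weights)) / total
    return mean, 1.0 / total

# Illustrative self-motion cues: a visual estimate of 10 deg/s with
# variance 4, and a vestibular estimate of 14 deg/s with variance 1.
m, v = fuse_ml([(10.0, 4.0), (14.0, 1.0)])
# The fused estimate lands closer to the more reliable vestibular cue,
# and its variance is lower than either single cue's.
```

A characteristic prediction of this model, often tested psychophysically, is that the fused variance is always smaller than the smallest single-cue variance.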

    Low-level Modality Specific and Higher-order Amodal Processing in the Haptic and Visual Domains

    The aim of the current study is to further investigate cross- and multi-modal object processing, with the intent of increasing our understanding of the differential contributions of modal and amodal object processing in the visual and haptic domains. The project is an identification and information-extraction study. The main factors are modality (vision or haptics), stimulus type (tools or animals) and level (naming and output). Each participant went through four different trial types: visual naming, visual size, haptic naming and haptic size. Naming consisted of verbally naming the item; size (size comparison) consisted of verbally indicating whether the current item is larger or smaller than a reference object. Stimuli consisted of plastic animals and tools. All stimuli are readily recognizable and easily manipulated with one hand. The actual figurines and tools were used for haptic trials, and digital photographs were used for visual trials (appendix 1 and 2). The results suggest a strong effect of modality, with visual object recognition being faster than haptic object recognition, indicating a modality-specific (visual-haptic) effect. It was also observed that tools were processed faster than animals regardless of modality. An interaction between the factors was also reported, supporting the notion that once naming is accomplished, similar reaction times for subsequent size processing, whether in the visual or haptic domain, would indicate non-modality-specific, or amodal, processing. Thus, using animal and tool figurines, we investigated modal and amodal processing in the visual and haptic domains.

    Exploring Perceptual Matters: A Textile-Based Approach

    This research takes a practice-based approach to exploring perceptual matters that often go unnoticed in the context of everyday lived experience. My approach focuses on the experiential possibilities of knowledge emerging through artistic enquiry, and uses a variety of modes (such as textiles, sound, physical computing, programming, video and text) for its conduct and communication. It examines scholarship in line with the ecological theory of perception, and is particularly informed by neurobiological research on sensory integration as well as by cultural theories that examine the role of sensory appreciation in perception. Different processes contributing to our perceptual experience are examined through the development of a touch-sensitive, sound-generating rug and its application in an experimental context. Participants' interaction with the rug and its sonic output allows an insight into how they make sense of multisensory information, via observation of how they physically respond to it. In creating possibilities for observing the two ends of the perceptual process (sensory input and behavioural output), the rug provides a platform for the study of what is intangible to the observer (perceptual activity) through what can actually be observed (physical activity). My analysis focuses on video recordings of the experimental process and data reports obtained from the software used for the sound-generating performance of the rug. Its findings suggest that attentional focus, active exploration and past experience actively affect the ability to integrate multisensory information, and are crucial parameters for the formation of a meaningful percept upon which to act.
Although contingent on the set experimental conditions and the specificities of the experimental group, these findings resonate with current cross-disciplinary discourse on perception, and indicate that art research can be incorporated into the wider arena of neurophysiological and behavioural research to expand its span of resources and methods.

    Central role of somatosensory processes in sexual arousal as identified by neuroimaging techniques

    Research on the neural correlates of sexual arousal is a growing field in affective neuroscience. A new approach studying the correlation between the hemodynamic cerebral response and the autonomic genital response has enabled distinct brain areas to be identified according to their role in inducing penile erection, on the one hand, and in representing penile sensation, on the other.