174 research outputs found

    Haptic perception in virtual reality in sighted and blind individuals

    The incorporation of the sense of touch into virtual reality is an exciting development. However, research into this topic is in its infancy. This experimental programme investigated both the perception of virtual object attributes by touch and the parameters that influence touch perception in virtual reality with a force feedback device called the PHANTOM™ (www.sensable.com). The thesis had three main foci. Firstly, it aimed to provide an experimental account of the perception of the attributes of roughness, size and angular extent by touch via the PHANTOM™ device. Secondly, it aimed to contribute to the resolution of a number of other issues important in developing an understanding of the parameters that exert an influence on touch in virtual reality. Finally, it aimed to compare touch in virtual reality between sighted and blind individuals.
    This thesis comprises six experiments. Experiment one examined the perception of the roughness of virtual textures with the PHANTOM™ device. The effect of the following factors was addressed: the groove width of the textured stimuli; the endpoint used (stylus or thimble) with the PHANTOM™; the specific device used (PHANTOM™ vs. IE3000) and the visual status (sighted or blind) of the participants. Experiment two extended the findings of experiment one by addressing the impact of an exploration-related factor on perceived roughness, that of the contact force an individual applies to a virtual texture. The interaction between this variable and the factors of groove width, endpoint, and visual status was also addressed. Experiment three examined the perception of the size and angular extent of virtual 3-D objects via the PHANTOM™. With respect to the perception of virtual object size, the effect of the following factors was addressed: the size of the object (2.7, 3.6 and 4.5 cm); the type of virtual object (cube vs. sphere); the mode in which the virtual objects were presented; the endpoint used with the PHANTOM™ and the visual status of the participants. With respect to the perception of virtual object angular extent, the effect of the following factors was addressed: the angular extent of the object (18, 41 and 64°); the endpoint used with the PHANTOM™ and the visual status of the participants. Experiment four examined the perception of the size and angular extent of real counterparts to the virtual 3-D objects used in experiment three. Experiment four manipulated the conditions under which participants examined the real objects. Participants were asked to give judgements of object size and angular extent via the deactivated PHANTOM™, a stylus probe, a bare index finger and without any constraints on their exploration. In addition to the above exploration type factor, experiment four examined the impact of the same factors on perceived size and angular extent in the real world as had been examined in virtual reality. Experiments five and six examined the consistency of the perception of linear extent across the 3-D axes in virtual space. Both experiments manipulated the following factors: line extent (2.7, 3.6 and 4.5 cm); line dimension (x, y and z axis); movement type (active vs. passive movement) and visual status. Experiment six additionally manipulated the direction of movement within the 3-D axes.
    Perceived roughness was assessed by the method of magnitude estimation. The perceived size and angular extent of the various virtual stimuli and their real counterparts were assessed by the method of magnitude reproduction. This technique was also used to assess perceived extent across the 3-D axes. Touch perception via the PHANTOM™ was found to be broadly similar for sighted and blind participants. Touch perception in virtual reality was also found to be broadly similar between two different 3-D force feedback devices (the PHANTOM™ and the IE3000). However, the endpoint used with the PHANTOM™ device was found to exert significant, but inconsistent, effects on the perception of virtual object attributes. Touch perception with the PHANTOM™ across the 3-D axes was found to be anisotropic in a similar way to the real world, with the illusion that radial extents were perceived as longer than equivalent tangential extents. The perception of 3-D object size and angular extent was found to be comparable between virtual reality and the real world, particularly under conditions where the participants' exploration of the real objects was constrained to a single point of contact. An intriguing touch illusion, whereby virtual objects explored from the inside were perceived to be larger than the same objects perceived from the outside, was found to occur widely in virtual reality, in addition to the real world. This thesis contributes to knowledge of touch perception in virtual reality. The findings have interesting implications for theories of touch perception, both virtual and real
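    Magnitude estimation data of the kind collected here are conventionally summarised with Stevens' power law, M = k·Wⁿ, fitted as a straight line in log-log coordinates. A minimal sketch of such a fit is shown below; the groove widths and roughness estimates are hypothetical illustrations, not data from the thesis.

```python
import math

def fit_power_law(groove_widths_mm, magnitude_estimates):
    """Fit Stevens' power law M = k * W**n by least squares in log-log space."""
    xs = [math.log(w) for w in groove_widths_mm]
    ys = [math.log(m) for m in magnitude_estimates]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    return math.exp(intercept), slope  # (scaling constant k, exponent n)

# Hypothetical estimates: perceived roughness grows with groove width
widths = [0.25, 0.5, 1.0, 2.0]        # groove width, mm (illustrative)
estimates = [10.0, 14.0, 20.0, 28.0]  # magnitude estimates (arbitrary units)
k, exponent = fit_power_law(widths, estimates)
```

    An exponent below 1 would indicate compressive scaling of roughness with groove width; the fitted parameters can then be compared across endpoint, device and visual-status conditions.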

    Articulatory feature encoding and sensorimotor training for tactually supplemented speech reception by the hearing-impaired

    Thesis (Ph.D.)--Harvard-MIT Division of Health Sciences and Technology, 2011. Cataloged from PDF version of thesis. Includes bibliographical references (p. 150-159). This thesis builds on previous efforts to develop tactile speech-reception aids for the hearing-impaired. Whereas conventional hearing aids mainly amplify acoustic signals, tactile speech aids convert acoustic information into a form perceptible via the sense of touch. By facilitating visual speechreading and providing sensory feedback for vocal control, tactile speech aids may substantially enhance speech communication abilities in the absence of useful hearing. Research for this thesis consisted of several lines of work. First, tactual detection and temporal order discrimination by congenitally deaf adults were examined, in order to assess the practicability of encoding acoustic speech information as temporal relationships among tactual stimuli. Temporal resolution among most congenitally deaf subjects was deemed adequate for reception of tactually-encoded speech cues. Tactual offset-order discrimination thresholds substantially exceeded those measured for onset-order, underscoring fundamental differences between stimulus masking dynamics in the somatosensory and auditory systems. Next, a tactual speech transduction scheme was designed with the aim of extending the amount of articulatory information conveyed by an earlier vocoder-type tactile speech display strategy. The novel transduction scheme derives relative amplitude cues from three frequency-filtered speech bands, preserving the cross-channel timing information required for consonant voicing discriminations, while retaining low-frequency modulations that distinguish voiced and aperiodic signal components. 
    Additionally, a sensorimotor training approach ("directed babbling") was developed with the goal of facilitating tactile speech acquisition through frequent vocal imitation of visuo-tactile speech stimuli and attention to tactual feedback from one's own vocalizations. A final study evaluated the utility of the tactile speech display in resolving ambiguities among visually presented consonants, following either standard or enhanced sensorimotor training. Profoundly deaf and normal-hearing participants trained to exploit tactually-presented acoustic information in conjunction with visual speechreading to facilitate consonant identification in the absence of semantic context. Results indicate that the present transduction scheme can enhance reception of consonant manner and voicing information and facilitate identification of syllable-initial and syllable-final consonants. The sensorimotor training strategy proved selectively advantageous for subjects demonstrating more gradual tactual speech acquisition. Simple, low-cost tactile devices may prove suitable for widespread distribution in developing countries, where hearing aids and cochlear implants remain unaffordable for most severely and profoundly deaf individuals. They have the potential to enhance verbal communication with minimal need for clinical intervention. By Theodore M. Moallem, Ph.D.
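    The core of a vocoder-type transduction scheme of the kind described, deriving amplitude envelopes from frequency-filtered bands while keeping cross-channel timing, can be sketched as follows. This is an illustrative reconstruction, not the thesis implementation: the synthetic band signals, sampling rate and smoothing constant are all assumptions.

```python
import math

def envelope(band, smoothing=0.05):
    """Rectify-and-smooth amplitude envelope (one-pole lowpass on |x|)."""
    env, out = 0.0, []
    for x in band:
        env += smoothing * (abs(x) - env)
        out.append(env)
    return out

# Two hypothetical pre-filtered bands of a consonant-vowel token:
# a low band carrying voiced energy, and a high band whose energy
# (frication) begins only later, preserving the cross-channel timing
# cue that supports voicing discriminations.
fs = 8000
t = [n / fs for n in range(800)]
low = [math.sin(2 * math.pi * 150 * x) for x in t]   # voiced energy throughout
high = [0.0] * 400 + [math.sin(2 * math.pi * 3000 * x) for x in t[400:]]
envs = [envelope(b) for b in (low, high)]
# The relative amplitudes of envs[0] and envs[1], and the later high-band
# onset, would drive the corresponding tactile channels.
```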

    Perceptual Experience

    This book offers an account of perceptual experience—its intrinsic nature, its engagement with the world, its relations to mental states of other kinds, and its role in epistemic norms. One of the book’s main claims is that perceptual experience constitutively involves representations of worldly items. A second claim is that the relevant form of representation can be explained in broadly biological terms. After defending these foundational doctrines, the book proceeds to give an account of perceptual appearances and how they are related to the objective world. Appearances turn out to be relational, viewpoint dependent properties of external objects. There is also a complementary account of how the objects that possess these properties are represented. Another major concern is the phenomenological dimension of perception. The book maintains that perceptual phenomenology can be explained reductively in terms of the representational contents of experiences, and it uses this doctrine to undercut the traditional arguments for dualism. This treatment of perceptual phenomenology is then expanded to encompass cognitive phenomenology, the phenomenology of moods and emotions, and the phenomenology of pain. The next topic is the various forms of consciousness that perceptual experience can possess. A principal aim is to show that phenomenology is metaphysically independent of these forms of consciousness, and another is to de-mystify the form known as phenomenal consciousness. The book concludes by discussing the relations of various kinds that perceptual experiences bear to higher level cognitive states, including relations of format, content, and justification or support

    Exploring Perceptual Matters: A Textile-Based Approach

    This research takes a practice-based approach to exploring perceptual matters that often go unnoticed in the context of everyday lived experience. My approach focuses on the experiential possibilities of knowledge emerging through artistic enquiry, and uses a variety of modes (textiles, sound, physical computing, programming, video and text) through which the enquiry is conducted and communicated. It examines scholarship in line with the ecological theory of perception, and is particularly informed by neurobiological research on sensory integration as well as by cultural theories that examine the role of sensory appreciation in perception. Different processes contributing to our perceptual experience are examined through the development of a touch-sensitive, sound-generating rug and its application in an experimental context. Participants’ interaction with the rug and its sonic output allows an insight into how they make sense of multisensory information via observation of how they physically respond to it. In creating possibilities for observing the two ends of the perceptual process (sensory input and behavioural output), the rug provides a platform for the study of what is intangible to the observer (perceptual activity) through what can actually be observed (physical activity). My analysis focuses on video recordings of the experimental process and data reports obtained from the software used for the sound-generating performance of the rug. Its findings suggest that attentional focus, active exploration, and past experience actively affect the ability to integrate multisensory information and are crucial parameters for the formation of a meaningful percept upon which to act. 
    Although these findings are tied to the set experimental conditions and the specificities of the experimental group, they resonate with current cross-disciplinary discourse on perception, and indicate that art research can be incorporated into the wider arena of neurophysiological and behavioural research to expand its span of resources and methods
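    The rug's sensing-to-sound pathway can be pictured as a mapping from per-zone pressure readings to sound events. The sketch below is purely hypothetical: the zone layout, activation threshold and pitch mapping are assumptions for illustration, not details taken from the study.

```python
def touches_to_events(frame, threshold=0.3):
    """Map one frame of normalized pressure readings (one value per rug zone)
    to sound-trigger events for a hypothetical synthesis engine."""
    events = []
    for zone, pressure in enumerate(frame):
        if pressure >= threshold:
            events.append({
                "zone": zone,
                "amplitude": pressure,               # firmer touch -> louder sound
                "pitch_hz": 220 * (2 ** (zone / 12)),  # pitch varies by zone
            })
    return events

# One frame: zone 0 untouched, zones 1 and 2 pressed with increasing force
events = touches_to_events([0.0, 0.5, 0.9])
```

    Logging such events alongside video, as the analysis above does, lets sensory input (pressure frames) be aligned with behavioural output (participants' movements).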

    Understanding space by moving through it: neural networks of motion- and space processing in humans

    Humans explore the world by moving in it, whether moving their whole body as during walking or driving a car, or moving their arm to explore the immediate environment. During movement, self-motion cues arise from the sensorimotor system comprising vestibular, proprioceptive, visual and motor cues, which provide information about direction and speed of the movement. Such cues allow the body to keep track of its location while it moves through space. Sensorimotor signals providing self-motion information can therefore serve as a source for spatial processing in the brain. This thesis is an inquiry into human brain systems of movement and motion processing in a number of different sensory and motor modalities using functional magnetic resonance imaging (fMRI). By characterizing connections between these systems and the spatial representation system in the brain, this thesis investigated how humans understand space by moving through it. In the first study of this thesis, the recollection networks of whole-body movement were explored. Brain activation was measured during the retrieval of active and passive self-motion and retrieval of observing another person performing these tasks. Primary sensorimotor areas dominated the recollection network of active movement, while higher association areas in parietal and mid-occipital cortex were recruited during the recollection of passive transport. Common to both self-motion conditions were bilateral activations in the posterior medial temporal lobe (MTL). No MTL activations were observed during recollection of movement observation. Considering that on a behavioral level, both active and passive self-motion provide sufficient information for spatial estimations, the common activation in MTL might represent the common physiological substrate for such estimations. The second study investigated processing in the 'parahippocampal place area' (PPA), a region in the posterior MTL, during haptic exploration of spatial layout. 
    The PPA is known to respond strongly to visuo-spatial layout. The study explored whether this region processes visuo-spatial layout specifically or spatial layout in general, independent of the encoding sensory modality. In cohorts of both sighted and blind participants, activation patterns in PPA were measured while participants haptically explored the spatial layout of model scenes or the shape of information-matched objects. In both sighted and blind individuals, PPA activity was greater during layout exploration than during object-shape exploration. While PPA activity in the sighted could also be caused by a transformation of haptic information into a mental visual image of the layout, two points speak against this: firstly, no increase in connectivity between the visual cortex and the PPA was observed, which would be expected if visual imagery took place. Secondly, blind participants, who cannot resort to visual imagery, showed the same pattern of PPA activity. Together, these results suggest that the PPA processes spatial layout information independent of the encoding modality. The third and last study addressed error accumulation in motion processing on different levels of the visual system. Using novel analysis methods of fMRI data, possible links between physiological properties in hMT+ and V1 and inter-individual differences in perceptual performance were explored. A correlation between noise characteristics and performance score was found in hMT+ but not V1. Better performance correlated with greater signal variability in hMT+. Though neurophysiological variability is traditionally seen as detrimental for behavioral accuracy, the results of this thesis contribute to the increasing evidence which suggests the opposite: that more efficient processing under certain circumstances can be related to more noise in neurophysiological signals. 
In summary, the results of this doctoral thesis contribute to our current understanding of motion and movement processing in the brain and its interface with spatial processing networks. The posterior MTL appears to be a key region for both self-motion and spatial processing. The results further indicate that physiological characteristics on the level of category-specific processing but not primary encoding reflect behavioral judgments on motion. This thesis also makes methodological contributions to the field of neuroimaging: it was found that the analysis of signal variability is a good gauge for analysing inter-individual physiological differences, while superior head-movement correction techniques have to be developed before pattern classification can be used to this end
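    At its core, the reported link between hMT+ signal variability and perceptual performance is an inter-individual correlation between a per-subject variability measure and a behavioural score. A minimal sketch of that analysis step with hypothetical per-subject values (the thesis used fMRI-specific variability analyses, not this toy computation):

```python
import math
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - mx) ** 2 for x in xs) *
                    sum((y - my) ** 2 for y in ys))
    return num / den

# Hypothetical per-subject values: BOLD signal variability (SD) in hMT+
# and motion-task performance score for five subjects
signal_sd = [0.8, 1.1, 1.4, 1.7, 2.0]
performance = [55, 62, 70, 74, 83]
r = pearson(signal_sd, performance)
# a positive r here would mirror the finding that greater hMT+ signal
# variability accompanies better behavioural performance
```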

    Head-mounted Sensory Augmentation System for Navigation in Low Visibility Environments

    Sensory augmentation can be used to assist in some tasks where sensory information is limited or sparse. This thesis focuses on the design and investigation of a head-mounted vibrotactile sensory augmentation interface to assist navigation in low visibility environments, for applications such as firefighters’ navigation or travel aids for visually impaired people. A novel head-mounted vibrotactile interface comprising a 1-by-7 vibrotactile display worn on the forehead is developed. A series of psychophysical studies is carried out with this display to (1) determine the vibrotactile absolute threshold, (2) investigate the accuracy of vibrotactile localization, and (3) evaluate the funneling illusion and apparent motion as sensory phenomena that could be used to communicate navigation signals. The results of these studies provide guidelines for the design of head-mounted interfaces. A second-generation head-mounted sensory augmentation interface called the Mark-II Tactile Helmet is developed for the application of firefighters’ navigation. It consists of a ring of ultrasound sensors mounted to the outside of a helmet, a microcontroller, two batteries and a refined vibrotactile display composed of seven vibration motors based on the results of the aforementioned psychophysical studies. A ‘tactile language’, that is, a set of distinguishable vibrotactile patterns, is developed for communicating navigation commands via the Mark-II Tactile Helmet. Four possible combinations of two command presentation modes (continuous, discrete) and two command types (recurring, single) are evaluated for their effectiveness in guiding users along a virtual wall in a structured environment. Continuous and discrete presentation modes use spatiotemporal patterns that induce the experience of apparent movement and discrete movement on the forehead, respectively. 
    The recurring command type presents the tactile command repeatedly with an interval between patterns of 500 ms, while the single command type presents the tactile command just once when there is a change in the command. The effectiveness of this tactile language is evaluated according to the objective measures of the users’ walking speed and the smoothness of their trajectory parallel to the virtual wall, and subjective measures of utility and comfort employing Likert-type rating scales. The Recurring Continuous (RC) commands that exploit the phenomenon of apparent motion are most effective in generating efficient routes and fast travel, and are most preferred. Finally, the optimal tactile language (RC) is compared with audio guidance using verbal instructions to investigate effectiveness in delivering navigation commands. The results show that haptic guidance leads to better performance as well as lower cognitive workload compared to auditory feedback. This research demonstrates that a head-mounted sensory augmentation interface can enhance spatial awareness in low visibility environments and could help firefighters’ navigation by providing them with supplementary sensory information
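    A recurring-continuous (RC) command amounts to a simple scheduling rule: sweep adjacent motors with staggered onsets to evoke apparent motion, then repeat the sweep after a gap. The sketch below illustrates that rule; only the 500 ms inter-pattern interval comes from the text, while the motor indices, step timing and sweep length are assumptions.

```python
def rc_turn_command(direction, sweep_motors=3, step_ms=100, gap_ms=500, repeats=2):
    """Return (time_ms, motor_index) activation onsets for a hypothetical
    recurring-continuous turn cue on a 1-by-7 forehead display (motors 0-6).
    Staggered onsets across adjacent motors evoke apparent motion; the
    pattern repeats with a 500 ms gap, as for the RC command type."""
    if direction == "left":
        motors = list(range(sweep_motors))              # sweep 0 -> 2
    else:
        motors = list(range(6, 6 - sweep_motors, -1))   # sweep 6 -> 4
    schedule, t = [], 0
    for _ in range(repeats):
        for m in motors:
            schedule.append((t, m))
            t += step_ms
        t += gap_ms  # inter-pattern interval
    return schedule

schedule = rc_turn_command("left")
```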

    Requirements for a tactile display of softness

    Developing tactile displays is an important aspect of improving the realism of feeling softness in laparoscopic surgery. One of the major challenges in designing a tactile display is to understand how differences in material properties are perceived through touch. This project addresses this challenge by investigating how the interaction of material properties affects the perception of softness, and by presenting softness through a tactile display. The first aim explores how the interaction of material properties affects the perception of softness through two psychophysical experiments. The experiments used a set of nine stimuli representing three materials of different compliance, with three different patterns of surface roughness or three different coatings of stickiness. The results indicated that compliance affected the perception of softness when pressing with the finger, but not when sliding, and that compliance, friction and thermal conductivity all influenced the perception of softness. To achieve the second aim of reproducing various levels of softness, a tactile display was built at the University of Leeds. The displayed softness was controlled by changing the contact area and tension of a flexible sheet. Psychophysical experiments were conducted to evaluate how well humans perceive softness through the display. The data were analysed using MATLAB to plot psychometric functions. The results indicated that the tactile display might be adequate for applications that require comparing simulated softnesses with each other, but insufficient for applications that require comparing simulated softness with real samples
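    Plotting a psychometric function, as done here in MATLAB, amounts to fitting a sigmoid to the proportion of "softer" judgements at each stimulus level; its midpoint gives the point of subjective equality and its slope the discrimination sensitivity. A pure-Python sketch with hypothetical data (the logistic form, grid ranges and response proportions are assumptions, not the thesis analysis):

```python
import math

def logistic(x, alpha, beta):
    """P('judged softer'): alpha = point of subjective equality, beta = slope."""
    return 1.0 / (1.0 + math.exp(-beta * (x - alpha)))

def fit_psychometric(levels, proportions):
    """Least-squares grid search over (alpha, beta) -- a simple stand-in
    for dedicated psychometric-fitting routines."""
    best = None
    for alpha in (a / 10 for a in range(0, 101)):       # 0.0 .. 10.0
        for beta in (b / 10 for b in range(1, 51)):     # 0.1 .. 5.0
            err = sum((logistic(x, alpha, beta) - p) ** 2
                      for x, p in zip(levels, proportions))
            if best is None or err < best[0]:
                best = (err, alpha, beta)
    return best[1], best[2]

# Hypothetical comparison data: displayed softness level vs. P(judged softer)
levels = [1, 2, 3, 4, 5, 6, 7]
props = [0.02, 0.10, 0.30, 0.50, 0.70, 0.90, 0.98]
pse, slope = fit_psychometric(levels, props)
```

    Comparing the fitted slope for display-vs-display trials against display-vs-real trials is one way to quantify the adequacy gap reported above.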

    Haptics: Science, Technology, Applications

    Get PDF
    This open access book constitutes the proceedings of the 13th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2022, held in Hamburg, Germany, in May 2022. The 36 regular papers included in this book were carefully reviewed and selected from 129 submissions. They were organized in topical sections as follows: haptic science; haptic technology; and haptic applications