483 research outputs found

    A tactile communication system for navigation

    Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 2005. Includes bibliographical references (leaves 42-43). A vibrotactile display for use in navigation has been designed and evaluated. The arm and the torso, which offer relatively large and flat surface areas, were chosen as locations for the displays. The ability of subjects to identify patterns of vibrotactile stimulation on the arm and torso was tested in a series of experiments using these displays. A variety of stimulation patterns was evaluated to determine which was most effective, and the efficacy of two types of motors (pancake and cylindrical) was compared. The arm display was tested with sedentary subjects in the laboratory, and the torso display was tested both in the laboratory with sedentary subjects and outdoors with active subjects. The results indicated that identification of the vibrotactile patterns was superior on the torso compared with the forearm, with subjects achieving 99-100% accuracy on seven of the eight patterns presented. The torso display was equally effective for both sedentary and active subjects. by Erin M. Piateski. S.M.
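    To make the idea of pattern-based tactile cueing concrete, the following minimal sketch shows how navigation cues might be encoded as timed activations of a small tactor array. The grid layout, the pattern definitions, and the set_motor() driver call are illustrative assumptions, not the hardware interface used in the thesis.

```python
# Illustrative sketch only: the tactor layout, the pattern definitions, and the
# set_motor() driver call are assumptions, not the thesis's hardware interface.
import time

# Hypothetical 4x2 grid of torso-mounted tactors, indexed 0-7.
PATTERNS = {
    "forward": [[0, 1], [2, 3]],             # upper row, then the row below it
    "left":    [[0, 2], [4, 6]],             # left-side tactors, top then bottom
    "right":   [[1, 3], [5, 7]],             # right-side tactors, top then bottom
    "stop":    [[0, 1, 2, 3, 4, 5, 6, 7]],   # all tactors pulse together
}

def set_motor(index, on):
    """Placeholder for a real motor-driver call (e.g. over serial or GPIO)."""
    print(f"motor {index} {'ON' if on else 'OFF'}")

def play_pattern(name, pulse_s=0.2, gap_s=0.1):
    """Play one navigation cue as a timed sequence of tactor groups."""
    for group in PATTERNS[name]:
        for m in group:
            set_motor(m, True)
        time.sleep(pulse_s)
        for m in group:
            set_motor(m, False)
        time.sleep(gap_s)

play_pattern("left")
```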

    The Neural Basis of Somatosensory Remapping Develops in Human Infancy

    When we sense a touch, our brains take account of our current limb position to determine the location of that touch in external space [1, 2]. Here we show that changes in the way the brain processes somatosensory information in the first year of life underlie the origins of this ability [3]. In three experiments we recorded somatosensory evoked potentials (SEPs) from 6.5-, 8-, and 10-month-old infants while presenting vibrotactile stimuli to their hands across uncrossed- and crossed-hands postures. At all ages we observed SEPs over central regions contralateral to the stimulated hand. Somatosensory processing was influenced by arm posture from 8 months onward. At 8 months, posture influenced mid-latency SEP components, but by 10 months effects were observed at early components associated with feed-forward stages of somatosensory processing. Furthermore, sight of the hands was a necessary prerequisite for somatosensory remapping at 10 months. Thus, the cortical networks [4] underlying the ability to dynamically update the location of a perceived touch across limb movements become functional during the first year of life. Up until at least 6.5 months of age, it seems that human infants' perceptions of tactile stimuli in the external environment are heavily dependent upon limb position.

    The localisation of pain on the body : an experimental analysis


    Effects of Vibrotactile Display Position and Shape on Extrapersonal Localization

    Vibrotactile displays are capable of conveying extrapersonal spatial information to users navigating or operating within a three-dimensional environment (e.g., aircraft pilots). Although vibrotactile displays can be applied to many parts of the body, recent applications have focused on torso-based displays that egocentrically reference distal targets. However, these displays may be poorly suited to convey elevation because of the generally cylindrical shape of the human torso. The purpose of the present study was to evaluate the relative effectiveness of handheld vibrotactile displays configured in either a cylindrical or a spherical shape as compared to a torso-based display. Because of its shape, the spherical display was predicted to facilitate superior elevation discernment; however, it was anticipated that users would have to employ an object-centered reference point independent of the body when perceiving directionality via a handheld display. Hypothesis testing indicated that participants' perception of extrapersonal elevation was improved by the spherical handheld display. Evidence was not conclusive regarding participants' use of an object-centered egocenter. The use of a handheld vibrotactile display resulted in increased subjective workload scores, regardless of shape. Results from the present study suggest that a spherical handheld display may be advantageous for three-dimensional tasks; however, specific applications should be evaluated on a case-by-case basis.
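    As a rough illustration of how a spherical handheld display could reference a distal target, the sketch below converts a target offset into azimuth and elevation and selects the nearest tactor on the sphere's surface. The tactor placement, the coordinate convention, and the nearest-tactor rule are assumptions for illustration, not the configuration evaluated in the study.

```python
# Sketch: map a 3-D target direction to a tactor on a spherical handheld display.
# Tactor placement and the nearest-tactor rule are illustrative assumptions.
import math

# Hypothetical tactors at (azimuth_deg, elevation_deg) on the sphere's surface.
TACTORS = [(az, el) for el in (-45, 0, 45) for az in range(0, 360, 45)]

def direction_to_angles(dx, dy, dz):
    """Target offset (x east, y north, z up) -> (azimuth, elevation) in degrees."""
    az = math.degrees(math.atan2(dx, dy)) % 360
    el = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return az, el

def nearest_tactor(az, el):
    """Pick the tactor with the smallest angular distance to (az, el)."""
    a, e = math.radians(az), math.radians(el)
    def angular_dist(t):
        taz, tel = map(math.radians, t)
        c = (math.sin(tel) * math.sin(e) +
             math.cos(tel) * math.cos(e) * math.cos(taz - a))
        return math.acos(max(-1.0, min(1.0, c)))
    return min(range(len(TACTORS)), key=lambda i: angular_dist(TACTORS[i]))

az, el = direction_to_angles(dx=10.0, dy=5.0, dz=8.0)
print(f"target at az={az:.1f} deg, el={el:.1f} deg -> tactor {nearest_tactor(az, el)}")
```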

    Somatic ABC's: A Theoretical Framework for Designing, Developing and Evaluating the Building Blocks of Touch-Based Information Delivery

    Situations of sensory overload are steadily becoming more frequent as technology becomes ever more ubiquitous, particularly with the advent of socio-communicative smartphone applications and pervasive, high-speed wireless networks. Although the ease of accessing information has improved our communication effectiveness and efficiency, our visual and auditory modalities--those modalities that today's computerized devices and displays largely engage--have become overloaded, creating possibilities for distraction, delay, and high cognitive load, which in turn can lead to a loss of situational awareness and increase the chances of life-threatening situations such as texting while driving. Surprisingly, alternative modalities for information delivery have seen little exploration. Touch, in particular, is a promising candidate given that the skin is our largest sensory organ, with impressive spatial and temporal acuity. Although some approaches have been proposed for touch-based information delivery, they are not without limitations, including high learning curves, limited applicability, and/or limited expression. This is largely due to the lack of a versatile, comprehensive design theory--specifically, a theory that addresses the design of touch-based building blocks for expandable, efficient, rich, and robust touch languages that are easy to learn and use. Moreover, beyond design, there is a lack of implementation and evaluation theories for such languages. To overcome these limitations, a unified theoretical framework, inspired by natural spoken language, is proposed, called Somatic ABC's, for Articulating (designing), Building (developing) and Confirming (evaluating) touch-based languages. To evaluate the usefulness of Somatic ABC's, its design, implementation and evaluation theories were applied to create communication languages for two very different application areas: audio-described movies and motor learning. These applications were chosen because they presented opportunities for complementing communication by offloading information, typically conveyed visually and/or aurally, to the skin. For both studies, it was found that Somatic ABC's aided the design, development and evaluation of rich somatic languages with distinct and natural communication units. Dissertation/Thesis. Ph.D. Computer Science 201
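    The compositional idea behind touch-based building blocks can be sketched as follows: a small set of primitive vibrotactile units is combined into longer sequences, much as phonemes form words. The primitives, their parameters, and the vibrate() call below are hypothetical; they stand in for, rather than reproduce, the units defined by Somatic ABC's.

```python
# Minimal sketch of the building-block idea: primitive vibrotactile units are
# composed into longer "words". The primitives, their parameters, and the
# vibrate() call are hypothetical illustrations, not Somatic ABC's own units.
import time

# Hypothetical primitives: (duration in seconds, relative intensity 0-1).
PRIMITIVES = {
    "short": (0.1, 1.0),
    "long":  (0.4, 1.0),
    "soft":  (0.2, 0.4),
}

def vibrate(duration_s, intensity):
    """Placeholder for a real actuator command."""
    print(f"vibrate {duration_s:.1f} s at intensity {intensity:.1f}")
    time.sleep(duration_s)

def play_word(units, gap_s=0.05):
    """A 'word' is an ordered sequence of primitive units."""
    for name in units:
        vibrate(*PRIMITIVES[name])
        time.sleep(gap_s)

# A three-unit message that is distinct from any single primitive.
play_word(["short", "short", "long"])
```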

    Engineering data compendium. Human perception and performance. User's guide

    The concept underlying the Engineering Data Compendium was the product of a research and development program (the Integrated Perceptual Information for Designers project) aimed at facilitating the application of basic research findings in human performance to the design of military crew systems. The principal objective was to develop a workable strategy for: (1) identifying and distilling information of potential value to system design from the existing research literature, and (2) presenting this technical information in a way that would aid its accessibility, interpretability, and applicability by system designers. The present four volumes of the Engineering Data Compendium represent the first implementation of this strategy. This is the first volume, the User's Guide, containing a description of the program and instructions for its use.

    Augmenting the Spatial Perception Capabilities of Users Who Are Blind

    People who are blind face a series of challenges and limitations resulting from their inability to see, forcing them either to seek the assistance of a sighted individual or to work around the challenge by way of an inefficient adaptation (e.g., following the walls of a room to reach a door rather than walking in a straight line to it). These challenges are directly related to blind users' lack of the spatial perception capabilities normally provided by the human visual system. In order to overcome these spatial perception challenges, modern technologies can be used to convey spatial perception data through sensory substitution interfaces. This work is the culmination of several projects which address varying spatial perception problems for blind users. First, we consider the development of non-visual natural user interfaces for interacting with large displays. This work explores the haptic interaction space in order to find useful and efficient haptic encodings for the spatial layout of items on large displays. Multiple interaction techniques are presented which build on prior research (Folmer et al. 2012), and the efficiency and usability of the most efficient of these encodings is evaluated with blind children. Next, we evaluate the use of wearable technology in aiding navigation of blind individuals through large open spaces lacking the tactile landmarks used during traditional white cane navigation. We explore the design of a computer vision application with an unobtrusive aural interface to minimize veering of the user while crossing a large open space. Together, these projects represent an exploration into the use of modern technology in augmenting the spatial perception capabilities of blind users.
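    The veering-minimization idea can be illustrated with a simple heading-error rule: compare the user's current heading to the heading of the intended straight-line path and issue a corrective cue when the deviation grows too large. The tolerance value, cue wording, and heading source in the sketch below are assumptions for illustration, not the interface implemented in this work.

```python
# Sketch of a heading-error rule for veering correction. The tolerance value,
# cue wording, and heading source are assumptions, not the thesis's interface.
def heading_error(current_deg, target_deg):
    """Signed smallest angular difference in degrees, in [-180, 180)."""
    return (current_deg - target_deg + 180) % 360 - 180

def veering_cue(current_deg, target_deg, tolerance_deg=10.0):
    """Return a corrective cue when the walker drifts off the intended heading."""
    err = heading_error(current_deg, target_deg)
    if err > tolerance_deg:
        return "cue: steer left"    # drifted to the right of the path
    if err < -tolerance_deg:
        return "cue: steer right"   # drifted to the left of the path
    return "cue: none"

print(veering_cue(current_deg=27.0, target_deg=10.0))  # -> cue: steer left
```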

    A brain-computer interface with vibrotactile biofeedback for haptic information

    Background: It has been suggested that Brain-Computer Interfaces (BCI) may one day be suitable for controlling a neuroprosthesis. For closed-loop operation of a BCI, a tactile feedback channel that is compatible with neuroprosthetic applications is desired. Operation of an EEG-based BCI using only vibrotactile feedback, a commonly used method to convey haptic senses of contact and pressure, is demonstrated with a high level of accuracy. Methods: A Mu-rhythm based BCI using a motor imagery paradigm was used to control the position of a virtual cursor. The cursor position was shown visually as well as transmitted haptically by modulating the intensity of a vibrotactile stimulus to the upper limb. A total of six subjects operated the BCI in a two-stage targeting task, receiving only vibrotactile biofeedback of performance. The location of the vibration was also systematically varied between the left and right arms to investigate location-dependent effects on performance. Results and Conclusion: Subjects were able to control the BCI using only vibrotactile feedback with an average accuracy of 56% and as high as 72%. These accuracies are significantly higher than the 15% predicted by random chance if the subjects had no voluntary control of their Mu-rhythm. The results of this study demonstrate that vibrotactile feedback is an effective biofeedback modality for operating a BCI using motor imagery. In addition, the study shows that placement of the vibrotactile stimulation on the biceps ipsilateral or contralateral to the motor imagery introduces a significant bias in BCI accuracy. This bias is consistent with a drop in performance generated by stimulation of the contralateral limb. Users demonstrated the capability to overcome this bias with training.
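    The feedback mapping described above, cursor state conveyed through vibration intensity, can be sketched with a simple distance-to-intensity function. The linear mapping and the value ranges are assumptions for illustration; the study's actual transfer function is not specified here.

```python
# Sketch of the feedback mapping: the virtual cursor's distance from the target
# is converted to a vibration intensity. The linear mapping and value ranges
# are assumptions for illustration, not the study's actual transfer function.
def cursor_to_intensity(cursor, target, max_dist):
    """Map |cursor - target| in [0, max_dist] to an intensity in [0, 1],
    stronger as the cursor approaches the target."""
    dist = min(abs(cursor - target), max_dist)
    return 1.0 - dist / max_dist

# Example: a cursor halfway to the target yields a mid-range intensity.
print(cursor_to_intensity(cursor=0.25, target=0.75, max_dist=1.0))  # 0.5
```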

    Mechanical and psychophysical studies of surface wave propagation during vibrotactile stimulation

    Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 2012. Cataloged from PDF version of thesis. Includes bibliographical references (p. 50-51). Vibrotactile displays are based on mechanical stimulation delivered using an array of motors to communicate with the user. The way in which the display's motors are spaced and positioned on the body can have a significant impact on the effectiveness of communication, especially for tactile displays used to convey spatial information. The objective of the present research was to determine how the surface waves induced by vibrotactile stimulation of the skin varied as a function of the site on the body where the motors were mounted, and how these waves influenced the ability to localize vibrotactile stimulation. Three locations on the body were selected for study: the palm, the forearm, and the thigh. A flexible printed circuit board containing 3-axis micro-accelerometers was fabricated to measure the amplitude and frequency of surface waves produced by a vibrating motor at each body site. Results of these experiments showed significant differences in the frequency and amplitude of vibration on the glabrous skin of the palm as compared to the hairy skin of the arm and thigh. The palm had the highest-frequency and lowest-amplitude surface waves, while the forearm and thigh were very similar, with lower-frequency, higher-amplitude surface waves. No anisotropies were found in the surface wave measurements. Most wave attenuation occurred within the first 8 mm from the motor, but amplitudes were still detectable at a distance of 24 mm, which suggests that motor spacing should be at least 24 mm for this type of motor when used for precise spatial localization. A series of psychophysical experiments was conducted using a three-by-three array of motors in which the ability of subjects to localize the point of stimulation in the array was determined at each of the three body locations. The results from these experiments indicated that the palm had the highest localization accuracy (81% correct) as compared to the forearm and thigh, which had similar localization accuracies (49% correct on the forearm, 45% correct on the thigh). Accuracy on the palm and forearm improved when the motor spacing increased from 8 mm to 16 mm, but increased spacing did not improve accuracy on the thigh. The results also showed that subjects were better able to identify the column of activation than the row of activation, which suggests a higher spatial acuity along the medial-lateral than the proximal-distal axis. The localization experiments indicate that glabrous skin is better suited for precise spatial localization than hairy skin, and that precise spatial localization requires an inter-motor spacing of more than 16 mm at these sites. by Katherine O. Sofia. S.M.
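    The kind of measurement described above can be sketched as a small signal-processing step: estimate the dominant frequency and RMS amplitude of a vibration trace from one accelerometer axis. The sampling rate and the synthetic test signal below are assumptions; real input would come from the 3-axis micro-accelerometer array described in the thesis.

```python
# Sketch: estimate the dominant frequency and RMS amplitude of one accelerometer
# axis. The sampling rate and the synthetic test signal are assumptions; real
# data would come from the flexible 3-axis micro-accelerometer array.
import numpy as np

def dominant_freq_and_rms(signal, fs):
    """Return (dominant frequency in Hz, RMS amplitude) of a 1-D trace."""
    signal = signal - np.mean(signal)          # remove DC offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs[np.argmax(spectrum)], float(np.sqrt(np.mean(signal ** 2)))

# Synthetic 250 Hz vibration sampled at 2 kHz for 0.5 s.
fs = 2000.0
t = np.arange(0, 0.5, 1.0 / fs)
trace = 0.3 * np.sin(2 * np.pi * 250 * t)

freq, rms = dominant_freq_and_rms(trace, fs)
print(f"dominant frequency: {freq:.0f} Hz, RMS amplitude: {rms:.3f}")
```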