    Tactile-STAR: A Novel Tactile STimulator And Recorder System for Evaluating and Improving Tactile Perception

    Get PDF
    Many neurological diseases impair the motor and somatosensory systems. While several different technologies are used in clinical practice to assess and improve motor functions, somatosensation is evaluated subjectively with qualitative clinical scales. Treatment of somatosensory deficits has received limited attention. To bridge the gap between the assessment and training of motor vs. somatosensory abilities, we designed, developed, and tested a novel, low-cost, two-component (bimanual) mechatronic system targeting tactile somatosensation: the Tactile-STAR, a tactile stimulator and recorder. The stimulator is an actuated pantograph structure driven by two servomotors, with an end-effector covered by a rubber material that can apply two different types of skin stimulation: brush and stretch. The stimulator has a modular design and can be used to test tactile perception on different parts of the body, such as the hand, arm, leg, and big toe. The recorder is a passive pantograph that can measure hand motion using two potentiometers. The recorder can serve multiple purposes: participants can move its handle to match the direction and amplitude of the tactile stimulator, or they can use it as a master manipulator to control the tactile stimulator as a slave. Our ultimate goal is to assess and affect tactile acuity and somatosensory deficits. To demonstrate the feasibility of our novel system, we tested the Tactile-STAR with 16 healthy individuals and with three stroke survivors using skin-brush stimulation. We verified that the system enables the mapping of tactile perception on the hand in both populations. We also tested the extent to which 30 min of training in healthy individuals led to an improvement of tactile perception. The results provide a first demonstration of the ability of this new system to characterize tactile perception in healthy individuals, as well as a quantification of the magnitude and pattern of tactile impairment in a small cohort of stroke survivors. The finding that short-term training with Tactile-STAR can improve the acuity of tactile perception in healthy individuals suggests that Tactile-STAR may have utility as a therapeutic intervention for somatosensory deficits.
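
    As a rough illustration of the master-slave mode described in this abstract, the sketch below mirrors the two recorder joint angles (read from its potentiometers) onto the two stimulator servomotors. This is not the authors' software: the function names, angle ranges, and update rate are assumptions, and the hardware I/O calls are stubs.

    ```python
    # Minimal master-slave sketch: mirror the passive recorder's two potentiometer
    # angles onto the stimulator's two servomotors. Hardware I/O is stubbed out;
    # all names, ranges, and rates are illustrative assumptions.

    import time

    POT_RANGE_DEG = (0.0, 270.0)    # assumed travel of the recorder potentiometers
    SERVO_RANGE_DEG = (0.0, 180.0)  # assumed command range of the stimulator servos

    def read_potentiometer(joint: int) -> float:
        """Stub: return the recorder joint angle in degrees (replace with a real ADC read)."""
        return 135.0

    def write_servo(joint: int, angle_deg: float) -> None:
        """Stub: send an angle command to one stimulator servomotor (replace with real I/O)."""
        pass

    def map_angle(angle_deg: float) -> float:
        """Linearly map a recorder joint angle onto the servo command range, with clamping."""
        lo_in, hi_in = POT_RANGE_DEG
        lo_out, hi_out = SERVO_RANGE_DEG
        t = (angle_deg - lo_in) / (hi_in - lo_in)
        return lo_out + max(0.0, min(1.0, t)) * (hi_out - lo_out)

    def master_slave_loop(duration_s: float = 5.0, rate_hz: float = 100.0) -> None:
        """Mirror the two recorder joints onto the two stimulator servos at a fixed rate."""
        for _ in range(int(duration_s * rate_hz)):
            for joint in (0, 1):
                write_servo(joint, map_angle(read_potentiometer(joint)))
            time.sleep(1.0 / rate_hz)

    master_slave_loop()
    ```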

    Somatic ABC's: A Theoretical Framework for Designing, Developing and Evaluating the Building Blocks of Touch-Based Information Delivery

    Get PDF
    Situations of sensory overload are steadily becoming more frequent as the ubiquity of technology approaches reality, particularly with the advent of socio-communicative smartphone applications and pervasive, high-speed wireless networks. Although the ease of accessing information has improved our communication effectiveness and efficiency, our visual and auditory modalities, the modalities that today's computerized devices and displays largely engage, have become overloaded, creating possibilities for distractions, delays and high cognitive load, which in turn can lead to a loss of situational awareness, increasing the chances of life-threatening situations such as texting while driving. Surprisingly, alternative modalities for information delivery have seen little exploration. Touch, in particular, is a promising candidate given that the skin is our largest sensory organ, with impressive spatial and temporal acuity. Although some approaches have been proposed for touch-based information delivery, they are not without limitations, including high learning curves, limited applicability and/or limited expression. This is largely due to the lack of a versatile, comprehensive design theory, specifically one that addresses the design of touch-based building blocks for expandable, efficient, rich and robust touch languages that are easy to learn and use. Moreover, beyond design, there is a lack of implementation and evaluation theories for such languages. To overcome these limitations, a unified theoretical framework, inspired by natural spoken language, is proposed: Somatic ABC's, for Articulating (designing), Building (developing) and Confirming (evaluating) touch-based languages. To evaluate the usefulness of Somatic ABC's, its design, implementation and evaluation theories were applied to create communication languages for two distinct application areas: audio-described movies and motor learning. These applications were chosen as they presented opportunities for complementing communication by offloading information, typically conveyed visually and/or aurally, to the skin. For both studies, it was found that Somatic ABC's aided the design, development and evaluation of rich somatic languages with distinct and natural communication units.
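
    Somatic ABC's is a design, implementation, and evaluation theory rather than an API, but the toy sketch below illustrates the underlying idea of composing touch-based building blocks into message sequences, much as phonemes compose words. Every symbol, actuator index, intensity, and timing here is invented purely for illustration and is not taken from the dissertation.

    ```python
    # Toy encoder: compose tactile "building blocks" (actuator, intensity, duration)
    # into a stimulus sequence for a short message. All values are hypothetical.

    from typing import List, Tuple

    # One building block: (actuator index, intensity 0-1, duration in seconds)
    Block = Tuple[int, float, float]

    # Hypothetical vocabulary mapping short semantic units to tactile blocks.
    VOCAB = {
        "left":  [(0, 0.8, 0.2)],
        "right": [(3, 0.8, 0.2)],
        "stop":  [(1, 1.0, 0.1), (2, 1.0, 0.1)],  # two quick pulses on middle actuators
    }

    def encode(message: List[str]) -> List[Block]:
        """Concatenate the tactile blocks for each word, inserting a short pause between words."""
        pause: Block = (-1, 0.0, 0.15)  # actuator -1 means "no output"
        sequence: List[Block] = []
        for word in message:
            sequence.extend(VOCAB[word])
            sequence.append(pause)
        return sequence

    print(encode(["left", "stop"]))
    ```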

    A Haptic Study to Inclusively Aid Teaching and Learning in the Discipline of Design

    Get PDF
    Designers are known to use a blend of manual and virtual processes to produce design prototype solutions. For modern designers, computer-aided design (CAD) tools are an essential requirement for beginning to develop design concept solutions. CAD, together with augmented reality (AR) systems, has altered the face of design practice, as witnessed by the way a designer can now change a 3D concept's shape, form, color, pattern, and texture at the click of a button in minutes, rather than laboring over a physical model in the studio for hours. However, CAD can often limit a designer's experience of being 'hands-on' with materials and processes. The rise of machine haptic (MH) tools has afforded great potential for designers to feel more 'hands-on' with virtual modeling processes. Through the use of MH, product designers are able to control, virtually sculpt, and manipulate virtual 3D objects on-screen. Design practitioners are well placed to make use of haptics to augment 3D concept creation, which is traditionally a highly tactile process. For similar reasons, non-sighted and visually impaired (NS, VI) communities could also benefit from using MH tools to increase touch-based interactions, thereby creating better access for NS, VI designers. In spite of this, the use of MH within the design industry (specifically product design), or by the non-sighted community, is still in its infancy. Therefore, the full benefit of haptics in aiding non-sighted designers has not yet been fully realised. This thesis empirically investigates the use of multimodal MH as a step closer to improving the virtual hands-on process, for the benefit of NS, VI and fully sighted (FS) Designer-Makers. The thesis comprises four experiments, embedded within four case studies (CS1-4). Case studies 1 and 2 worked with self-employed NS, VI Art Makers at Henshaws College for the Blind and Visually Impaired, and examined the effects of haptics on NS, VI users' evaluations of experience. Case studies 3 and 4, featuring experiments 3 and 4, were designed to examine the effects of haptics on distance-learning design students at the Open University. The empirical results from all four case studies showed that NS, VI users were able to navigate and perceive virtual objects via the force from the haptically rendered objects on-screen. Moreover, they were assisted by the full multimodal MH assistance, which in CS2 appeared to offer better assistance to NS versus FS participants. In CS3 and CS4, MH and multimodal assistance afforded equal assistance to NS, VI, and FS participants, but haptics were not as successful in bettering the time results recorded in the manual (M) conditions. However, the collision data between the M and MH conditions showed little statistical difference. The thesis showed that multimodal MH systems, specifically when used in kinesthetic mode, enable humans (non-disabled and disabled) to credibly judge objects within the virtual realm. It also showed that multimodal augmented tooling can improve interaction and afford better access to the graphical user interface for a wider body of users.
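
    The thesis relies on participants feeling force from haptically rendered on-screen objects. The snippet below is a generic, minimal sketch of that idea (penalty-based force rendering against a virtual sphere), not the software used in the case studies; the stiffness, geometry, and device interface are assumed placeholders.

    ```python
    # Penalty-based kinesthetic rendering sketch: the device pushes back in
    # proportion to how far the stylus tip has penetrated a virtual sphere.

    import math

    STIFFNESS = 500.0                # N/m, assumed virtual surface stiffness
    SPHERE_CENTER = (0.0, 0.0, 0.0)  # metres
    SPHERE_RADIUS = 0.05             # metres

    def render_force(tip: tuple) -> tuple:
        """Return the (fx, fy, fz) penalty force for a stylus tip position in metres."""
        dx, dy, dz = (tip[i] - SPHERE_CENTER[i] for i in range(3))
        dist = math.sqrt(dx * dx + dy * dy + dz * dz)
        penetration = SPHERE_RADIUS - dist
        if penetration <= 0.0 or dist == 0.0:
            return (0.0, 0.0, 0.0)               # tip is outside the object: no force
        scale = STIFFNESS * penetration / dist
        return (dx * scale, dy * scale, dz * scale)  # push outward along the surface normal

    print(render_force((0.0, 0.0, 0.04)))  # tip 1 cm inside the surface
    ```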

    Tactile Weight Rendering: A Review for Researchers and Developers

    Full text link
    Haptic rendering of weight plays an essential role in naturalistic object interaction in virtual environments. While kinesthetic devices have traditionally been used for this aim by applying forces on the limbs, tactile interfaces acting on the skin have recently offered potential solutions to enhance or substitute kinesthetic ones. Here, we aim to provide an in-depth overview and comparison of existing tactile weight rendering approaches. We categorized these approaches based on their type of stimulation into asymmetric vibration and skin stretch, further divided according to the working mechanism of the devices. Then, we compared these approaches using various criteria, including physical, mechanical, and perceptual characteristics of the reported devices and their potential applications. We found that asymmetric vibration devices have the smallest form factor, while skin stretch devices relying on the motion of flat surfaces, belts, or tactors present numerous mechanical and perceptual advantages for scenarios requiring more accurate weight rendering. Finally, we discussed the selection of the proposed categorization of devices and their application scopes, together with the limitations and opportunities for future research. We hope this study guides the development and use of tactile interfaces to achieve a more naturalistic object interaction and manipulation in virtual environments.
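
    As background for the asymmetric-vibration category mentioned above, the sketch below generates one period of the kind of zero-mean but asymmetric drive signal such devices rely on: a brief, strong pulse in the "pull" direction followed by a longer, weaker return, so the perceived net force is biased in one direction. The frequency, pull fraction, and amplitude are illustrative assumptions, not values from any reviewed device.

    ```python
    # One period of an asymmetric actuator command: strong short phase one way,
    # weaker long phase the other way; zero mean over the period.

    import numpy as np

    def asymmetric_waveform(freq_hz: float = 75.0,
                            sample_rate: int = 10_000,
                            pull_fraction: float = 0.25,
                            amplitude: float = 1.0) -> np.ndarray:
        """Return one period of a zero-mean, asymmetric drive signal."""
        n = int(sample_rate / freq_hz)
        n_pull = int(n * pull_fraction)
        pull = np.full(n_pull, amplitude)                                # brief strong phase
        back = np.full(n - n_pull, -amplitude * n_pull / (n - n_pull))   # longer, weaker phase
        return np.concatenate([pull, back])

    wave = asymmetric_waveform()
    print(wave.mean())  # ~0: the command is zero-mean, but the percept is directional
    ```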

    Engineering data compendium. Human perception and performance. User's guide

    Get PDF
    The concept underlying the Engineering Data Compendium was the product of a research and development program (Integrated Perceptual Information for Designers project) aimed at facilitating the application of basic research findings in human performance to the design of military crew systems. The principal objective was to develop a workable strategy for: (1) identifying and distilling information of potential value to system design from the existing research literature, and (2) presenting this technical information in a way that would aid its accessibility, interpretability, and applicability by systems designers. The present four volumes of the Engineering Data Compendium represent the first implementation of this strategy. This is the first volume, the User's Guide, containing a description of the program and instructions for its use.

    Haptics: Science, Technology, Applications

    Get PDF
    This open access book constitutes the proceedings of the 13th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2022, held in Hamburg, Germany, in May 2022. The 36 regular papers included in this book were carefully reviewed and selected from 129 submissions. They were organized in topical sections as follows: haptic science; haptic technology; and haptic applications

    Principles and Guidelines for Advancement of Touchscreen-Based Non-visual Access to 2D Spatial Information

    Get PDF
    Graphical materials such as graphs and maps are often inaccessible to millions of blind and visually-impaired (BVI) people, which negatively impacts their educational prospects, ability to travel, and vocational opportunities. To address this longstanding issue, a three-phase research program was conducted that builds on and extends previous work establishing touchscreen-based haptic cuing as a viable alternative for conveying digital graphics to BVI users. Although promising, this approach poses unique challenges that can only be addressed by schematizing the underlying graphical information based on perceptual and spatio-cognitive characteristics pertinent to touchscreen-based haptic access. Towards this end, this dissertation empirically identified a set of design parameters and guidelines through a logical progression of seven experiments. Phase I investigated perceptual characteristics related to touchscreen-based graphical access using vibrotactile stimuli, with results establishing three core perceptual guidelines: (1) a minimum line width of 1mm should be maintained for accurate line-detection (Exp-1), (2) a minimum interline gap of 4mm should be used for accurate discrimination of parallel vibrotactile lines (Exp-2), and (3) a minimum angular separation of 4mm should be used for accurate discrimination of oriented vibrotactile lines (Exp-3). Building on these parameters, Phase II studied the core spatio-cognitive characteristics pertinent to touchscreen-based non-visual learning of graphical information, with results leading to the specification of three design guidelines: (1) a minimum width of 4mm should be used for supporting tasks that require tracing of vibrotactile lines and judging their orientation (Exp-4), (2) a minimum width of 4mm should be maintained for accurate line tracing and learning of complex spatial path patterns (Exp-5), and (3) vibrotactile feedback should be used as a guiding cue to support the most accurate line tracing performance (Exp-6). Finally, Phase III demonstrated that schematizing line-based maps based on these design guidelines leads to development of an accurate cognitive map. Results from Experiment-7 provide theoretical evidence in support of learning from vision and touch as leading to the development of functionally equivalent amodal spatial representations in memory. Findings from all seven experiments contribute to new theories of haptic information processing that can guide the development of new touchscreen-based non-visual graphical access solutions
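
    To make the parameters above concrete, here is a small, hypothetical sketch of how they might be encoded as constraints in a touchscreen rendering pipeline. The numeric values come from the guidelines stated in the abstract; the data structure and check are illustrative, not part of the dissertation.

    ```python
    # Encode the empirically derived minimum parameters as rendering constraints
    # and check a candidate vibrotactile line against them.

    from dataclasses import dataclass

    MIN_DETECTION_WIDTH_MM = 1.0   # Exp-1: accurate line detection
    MIN_TRACING_WIDTH_MM = 4.0     # Exp-4/5: line tracing and path learning
    MIN_INTERLINE_GAP_MM = 4.0     # Exp-2: discriminating parallel lines

    @dataclass
    class VibrotactileLine:
        width_mm: float
        gap_to_nearest_mm: float
        requires_tracing: bool     # True if the task asks the user to follow the line

    def meets_guidelines(line: VibrotactileLine) -> bool:
        """Check a rendered line against the minimum width and gap parameters."""
        min_width = MIN_TRACING_WIDTH_MM if line.requires_tracing else MIN_DETECTION_WIDTH_MM
        return line.width_mm >= min_width and line.gap_to_nearest_mm >= MIN_INTERLINE_GAP_MM

    print(meets_guidelines(VibrotactileLine(width_mm=4.0, gap_to_nearest_mm=5.0,
                                            requires_tracing=True)))
    ```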

    Enhancing Situational Awareness Through Haptics Interaction In Virtual Environment Training Systems

    Get PDF
    Virtual environment (VE) technology offers a viable training option for developing knowledge, skills and attitudes (KSA) within domains that have limited live training opportunities due to personnel safety and cost (e.g., live fire exercises). However, to ensure these VE training systems provide effective training and transfer, designers of such systems must ensure that training goals and objectives are clearly defined and VEs are designed to support development of KSAs required. Perhaps the greatest benefit of VE training is its ability to provide a multimodal training experience, where trainees can see, hear and feel their surrounding environment, thus engaging them in training scenarios to further their expertise. This work focused on enhancing situation awareness (SA) within a training VE through appropriate use of multimodal cues. The Multimodal Optimization of Situation Awareness (MOSA) model was developed to identify theoretical benefits of various environmental and individual multimodal cues on SA components. Specific focus was on benefits associated with adding cues that activated the haptic system (i.e., kinesthetic/cutaneous sensory systems) or vestibular system in a VE. An empirical study was completed to evaluate the effectiveness of adding two independent spatialized tactile cues to a Military Operations on Urbanized Terrain (MOUT) VE training system, and how head tracking (i.e., addition of rotational vestibular cues) impacted spatial awareness and performance when tactile cues were added during training. Results showed tactile cues enhanced spatial awareness and performance during both repeated training and within a transfer environment, yet there were costs associated with including two cues together during training, as each cue focused attention on a different aspect of the global task. In addition, the results suggest that spatial awareness benefits from a single point indicator (i.e., spatialized tactile cues) may be impacted by interaction mode, as performance benefits were seen when tactile cues were paired with head tracking. Future research should further examine theoretical benefits outlined in the MOSA model, and further validate that benefits can be realized through appropriate activation of multimodal cues for targeted training objectives during training, near transfer and far transfer (i.e., real world performance)
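
    As a rough illustration of how a spatialized tactile cue might be driven in such a training system (this is not the study's implementation), the sketch below picks which of N torso-belt tactors to activate given a target bearing in world coordinates and the tracked head yaw. The tactor count, layout, and angle conventions are assumptions.

    ```python
    # Select the belt tactor closest to the target's direction relative to the body.

    N_TACTORS = 8  # assumed: evenly spaced around the torso, tactor 0 at the front

    def select_tactor(target_bearing_deg: float, head_yaw_deg: float) -> int:
        """Return the tactor index nearest to the target bearing, relative to head yaw."""
        relative = (target_bearing_deg - head_yaw_deg) % 360.0
        sector = 360.0 / N_TACTORS
        return int((relative + sector / 2.0) // sector) % N_TACTORS

    # Target 90 degrees to the right while facing north -> right-side tactor (index 2).
    print(select_tactor(target_bearing_deg=90.0, head_yaw_deg=0.0))
    ```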