
    Scalability of the Size of Patterns Drawn Using Tactile Hand Guidance

    Haptic feedback for handwriting training has been extensively studied, but with a primary focus on kinematic feedback. We provide vibrotactile feedback through a wrist-worn sleeve to guide the user in recreating unknown patterns, and we study the impact of vibration duration (1, 2, 3 seconds) on pattern scaling. The user traces straight lines at 90° angles, attempting to maintain a constant speed, in the direction of the activated motor until a different motor's activation is perceived. Shape and size are two features of good letter formation. A study performed on three subjects showed that four vibrotactile motors can guide the hand toward correct shape formation with high accuracy (> 95%). The overall size of the letter was observed to scale linearly with the vibration duration. Implications of using vibrational feedback for handwriting correction are discussed.
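
    As an aside, a minimal Arduino-style sketch (C++) of the guidance scheme this abstract describes: one of four wrist-mounted motors is driven for a fixed vibration duration to indicate the next 90° direction. This is an illustrative assumption, not the authors' implementation; the pin numbers, direction sequence, and pause between repetitions are made up.

    // Hypothetical sketch of the four-motor guidance scheme described above.
    // Pin assignments and the direction sequence are illustrative assumptions.
    const int MOTOR_PINS[4] = {3, 5, 6, 9};     // up, right, down, left (assumed)
    const unsigned long VIBRATION_MS = 2000;    // 1, 2, or 3 s in the study

    // Example pattern: each step is a 90-degree turn from the previous one.
    const int DIRECTION_SEQUENCE[4] = {0, 1, 2, 3};

    void setup() {
      for (int i = 0; i < 4; i++) {
        pinMode(MOTOR_PINS[i], OUTPUT);
      }
    }

    void loop() {
      // Drive one motor at a time; the user traces in that direction
      // until they perceive the next motor switching on.
      for (int step = 0; step < 4; step++) {
        int dir = DIRECTION_SEQUENCE[step];
        digitalWrite(MOTOR_PINS[dir], HIGH);
        delay(VIBRATION_MS);                    // duration controls segment length
        digitalWrite(MOTOR_PINS[dir], LOW);
      }
      delay(5000);                              // pause before repeating (assumed)
    }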

    Escaping the prison of singularity

    Few human activities investigate the poverty or richness of human life or describe the mechanisms of ethical formation as fully and particularly as narratives do. In the development of our intellectual views and ethical stances, we cannot do without the guidance and examples of first-hand friends, acquaintances, and loved ones, but neither can we do without the second-hand guidance and examples of narrative friends and loved ones, for these latter supplement our need for sociability and help us fill out the education about the ways and means of being human that we receive from first-hand acquaintances. It would be wise of all of us who spend our lives construing theories about language and literature to remind ourselves, daily at least, about the roots of our deep and inextinguishable need for narrative.

    Quick-Glance and In-Depth exploration of a tabletop map for visually impaired people

    Interactive tactile maps provide visually impaired people with accessible geographic information. However, when these maps are presented on large tabletops, tactile exploration without sight is long and tedious due to the size of the surface. In this paper we present a novel approach to speed up the exploration of tabletop maps in the absence of vision. Our approach mimics the visual processing of a map and consists of two steps. First, the Quick-Glance step lets users build a global mental representation of the map using mid-air gestures. Second, the In-Depth step lets users reach Points of Interest with appropriate hand guidance on the map. We describe the design and development of a prototype combining a smartwatch and a tactile surface for Quick-Glance and In-Depth interactive exploration of a map.

    Multimodal Mixed Reality Impact on a Hand Guiding Task with a Holographic Cobot

    In the context of industrial production, a worker who wants to program a robot using the hand-guidance technique requires that the robot be available for programming and not in operation. This means that production with that robot is stopped during that time. A way around this constraint is to perform the same manual guidance steps on a holographic representation of the robot's digital twin, using augmented reality technologies. However, this approach suffers from a lack of tangibility of the visual holograms that the user tries to grab. We present an interface in which some of this tangibility is provided through ultrasound-based mid-air haptic actuation. We report a user study evaluating the impact of such haptic feedback on a pick-and-place task performed with the wrist of a holographic robot arm, and we found the feedback to be beneficial.

    Using wrist vibrations to guide hand movement and whole body navigation

    In the absence of vision, mobility and orientation are challenging. Audio and tactile feedback can be used to guide visually impaired people. In this paper, we present two complementary studies on the use of vibrational cues for hand guidance during the exploration of itineraries on a map, and for whole-body guidance in a virtual environment. Concretely, we designed wearable Arduino bracelets integrating a vibratory motor that produces multiple patterns of pulses. In a first study, this bracelet was used to guide the hand along unknown routes on an interactive tactile map. A Wizard-of-Oz study with six blindfolded participants showed that tactons (vibrational patterns) may be more efficient than audio cues for indicating directions. In a second study, the bracelet was used by blindfolded participants to navigate in a virtual environment. The results presented here show that it is possible to significantly decrease travel distance with vibrational cues. To sum up, these preliminary but complementary studies suggest the value of vibrational feedback in assistive technology for mobility and orientation for blind people.
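
    As an illustration only (not the authors' code), a minimal Arduino sketch of a wrist bracelet emitting tactons, i.e., patterns of pulses, as direction cues. The pin number and the mapping from pulse pattern to direction are assumptions.

    // Hypothetical bracelet firmware: emit pulse patterns (tactons) as cues.
    const int MOTOR_PIN = 3;                    // vibration motor pin (assumed)

    // Emit `count` pulses of `onMs` vibration separated by `offMs` pauses.
    void tacton(int count, unsigned long onMs, unsigned long offMs) {
      for (int i = 0; i < count; i++) {
        digitalWrite(MOTOR_PIN, HIGH);
        delay(onMs);
        digitalWrite(MOTOR_PIN, LOW);
        delay(offMs);
      }
    }

    void setup() {
      pinMode(MOTOR_PIN, OUTPUT);
    }

    void loop() {
      tacton(1, 400, 600);   // one long pulse, e.g., "turn left" (assumed mapping)
      delay(3000);
      tacton(3, 100, 100);   // three short pulses, e.g., "turn right" (assumed mapping)
      delay(3000);
    }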

    WYFIWIF: A Haptic Communication Paradigm For Collaborative Motor Skills Learning

    Motor skills transfer is a challenging issue for many applications such as surgery, design, and industry. In order to design virtual environments that support motor skills learning, a deep understanding of human haptic interaction is required. To ensure skills transfer, experts and novices need to collaborate. This requires the construction of a common frame of reference between the teacher and the learner so that they can understand each other. In this paper, human-human haptic collaboration is investigated in order to understand how haptic information is exchanged. Furthermore, WYFIWIF (What You Feel Is What I Feel), a haptic communication paradigm, is introduced. This paradigm is based on a hand-guidance metaphor and helps operators construct an efficient common frame of reference by allowing direct haptic communication. A learning virtual environment is used to evaluate this paradigm: 60 volunteer students performed a needle insertion learning task. The results of this experiment show that, compared to conventional methods, the learning method based on haptic communication improves novices' performance in such a task. We conclude that the WYFIWIF paradigm facilitates expert-novice haptic collaboration for teaching motor skills.

    The function of “looking-at-nothing” for sequential sensorimotor tasks: Eye movements to remembered action-target locations

    When performing manual actions, eye movements precede hand movements to target locations: before we grasp an object, we look at it. Eye-hand guidance is preserved even when visual targets are unavailable, e.g., when grasping behind an occlusion. This “looking-at-nothing” behavior might be functional, e.g., as a “deictic pointer” for manual control or as a memory-retrieval cue, or it might be a by-product of automatization. Here, we examine whether looking at empty locations before acting on them is beneficial for sensorimotor performance. In five experiments, participants completed a click sequence on eight visual targets for 0-100 trials while they either had to fixate on the screen center or could move their eyes freely. During 50-100 consecutive trials, participants then clicked the same sequence on a blank screen with free or fixed gaze. During both phases, participants looked at target locations whenever gaze shifts were allowed. With visual targets, target fixations led to faster, more precise clicking, fewer errors, and sparser cursor paths than central fixation. Without visual information, a small free-gaze benefit could sometimes be observed, reflecting a memory benefit rather than a motor-calculation benefit. Interestingly, central fixation during learning forced early explicit encoding, which produced a strong benefit for acting on remembered targets later, independent of whether the eyes could move at that point.