208 research outputs found

    Novel Actuation Methods for High Force Haptics


    Wearable Vibrotactile Haptic Device for Stiffness Discrimination during Virtual Interactions

    In this paper, we discuss the development of a cost-effective, wireless, and wearable vibrotactile haptic device for stiffness perception during interaction with virtual objects. Our experimental setup consists of a haptic device with five vibrotactile actuators and a virtual reality environment built in Unity 3D that integrates the Oculus Rift Head-Mounted Display (HMD) and the Leap Motion controller. The virtual environment captures touch inputs from users. Interaction forces are rendered at 500 Hz and fed back to the wearable setup, which stimulates the fingertips with eccentric rotating mass (ERM) vibrotactile actuators. The amplitude and frequency of vibration are modulated in proportion to the interaction force to simulate the stiffness of a virtual object. A quantitative and qualitative study compares stiffness discrimination on a virtual linear spring across three sensory modalities: visual-only feedback, tactile-only feedback, and their combination. A common psychophysics method, the Two-Alternative Forced Choice (2AFC) approach, is used for the quantitative analysis in terms of the Just Noticeable Difference (JND) and Weber Fraction (WF). According to the psychometric results, the average Weber fraction of 0.39 for visual-only feedback improved to 0.25 when tactile feedback was added.
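    A minimal sketch of how such Weber fractions could be obtained from 2AFC data, assuming a cumulative-Gaussian psychometric fit and a 75%-correct JND criterion; the stimulus levels and response rates below are illustrative and do not come from the paper:

        import numpy as np
        from scipy.optimize import curve_fit
        from scipy.stats import norm

        reference_stiffness = 100.0  # N/m, hypothetical reference spring
        comparison = np.array([70.0, 85.0, 100.0, 115.0, 130.0, 145.0])  # N/m, hypothetical levels
        p_stiffer = np.array([0.10, 0.25, 0.50, 0.70, 0.85, 0.95])       # hypothetical 2AFC response rates

        def psychometric(x, mu, sigma):
            # Cumulative Gaussian: probability of judging the comparison stiffer.
            return norm.cdf(x, loc=mu, scale=sigma)

        (mu, sigma), _ = curve_fit(psychometric, comparison, p_stiffer,
                                   p0=[reference_stiffness, 20.0])

        # JND taken as the stimulus increment between the 50% and 75% points of the fit.
        jnd = norm.ppf(0.75, loc=mu, scale=sigma) - mu
        weber_fraction = jnd / reference_stiffness
        print(f"JND = {jnd:.1f} N/m, Weber fraction = {weber_fraction:.2f}")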

    Docking Haptics: Extending the Reach of Haptics by Dynamic Combinations of Grounded and Worn Devices

    Grounded haptic devices can provide a variety of forces but have limited working volumes. Wearable haptic devices operate over a large volume but are relatively restricted in the types of stimuli they can generate. We propose the concept of docking haptics, in which different types of haptic devices are dynamically docked at run time. This creates a hybrid system, where the potential feedback depends on the user's location. We show a prototype docking haptic workspace, combining a grounded six degree-of-freedom force feedback arm with a hand exoskeleton. We are able to create the sensation of weight on the hand when it is within reach of the grounded device; away from the grounded device, hand-referenced force feedback is still available. A user study demonstrates that users can successfully discriminate weight when using docking haptics, but not with the exoskeleton alone. Such hybrid systems would be able to change configuration further, for example docking two grounded devices to a hand to deliver twice the force or to extend the working volume. We suggest that the docking haptics concept can thus extend the practical utility of haptics in user interfaces.
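    A minimal sketch of the run-time docking decision described above, assuming the grounded arm contributes world-referenced forces (such as weight) only while the hand is inside its reachable workspace and the exoskeleton always supplies hand-referenced grasp forces; the workspace radius, positions, and force values are illustrative assumptions, not values from the paper:

        import numpy as np

        GROUNDED_BASE = np.array([0.0, 0.0, 0.0])  # assumed base position of the grounded arm (m)
        GROUNDED_REACH = 0.6                       # assumed reachable radius of the arm (m)

        def split_feedback(hand_pos, weight_force, grasp_force):
            # Decide at run time which device renders which component of the feedback.
            docked = np.linalg.norm(np.asarray(hand_pos) - GROUNDED_BASE) <= GROUNDED_REACH
            grounded_cmd = weight_force if docked else np.zeros(3)  # weight only while docked
            exo_cmd = grasp_force                                   # hand-referenced forces always available
            return grounded_cmd, exo_cmd

        # Example: a simulated 0.5 kg object held inside, then outside, the arm's workspace.
        weight = np.array([0.0, 0.0, -0.5 * 9.81])
        grasp = np.array([2.0, 0.0, 0.0])
        print(split_feedback([0.3, 0.1, 0.2], weight, grasp))  # docked: grounded arm renders the weight
        print(split_feedback([1.2, 0.0, 0.2], weight, grasp))  # out of reach: exoskeleton grasp forces only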

    Electroadhesive Auxetics as Programmable Layer Jamming Skins for Formable Crust Shape Displays

    Shape displays are a class of haptic devices that enable whole-hand haptic exploration of 3D surfaces. However, their scalability is limited by the mechanical complexity and high cost of traditional actuator arrays. In this paper, we propose using electroadhesive auxetic skins as a strain-limiting layer to create programmable shape change in a continuous ("formable crust") shape display. The auxetic skins are manufactured as flexible printed circuit boards with dielectric-laminated electrodes on each auxetic unit cell (AUC), using monolithic fabrication to lower cost and assembly time. By layering multiple sheets and applying a voltage between electrodes on subsequent layers, electroadhesion locks individual AUCs, achieving a maximum in-plane stiffness variation of 7.6x at a power consumption of 50 µW per AUC. We first characterize an individual AUC and compare the results to a kinematic model. We then validate the ability of a 5x5 AUC array to actively modify its own axial and transverse stiffness. Finally, we demonstrate this array in a continuous shape display as a strain-limiting skin that programmatically modulates the shape output of an inflatable LDPE pouch. Integrating electroadhesion with auxetics enables new capabilities for scalable, low-profile, and low-power control of flexible robotic systems.
    Comment: Accepted to the IEEE International Conference on Robotics and Automation (ICRA 2023).
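    A minimal sketch of how a jamming pattern for the 5x5 AUC array could be represented in software and its electroadhesion power estimated from the reported ~50 µW per locked AUC; the pattern, the addressing scheme, and the linear power model are illustrative assumptions:

        import numpy as np

        POWER_PER_LOCKED_AUC_W = 50e-6   # ~50 µW per electroadhesively locked AUC (from the abstract)

        # True = voltage applied across the layers at this cell, locking it (stiff);
        # False = no voltage, the cell stays compliant.
        pattern = np.zeros((5, 5), dtype=bool)
        pattern[:, 2] = True             # e.g. stiffen the middle column to limit transverse strain

        total_power_w = pattern.sum() * POWER_PER_LOCKED_AUC_W
        print(f"{pattern.sum()} locked AUCs -> {total_power_w * 1e3:.2f} mW")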

    Multimodal Human-Machine Interface For Haptic-Controlled Excavators

    The goal of this research is to develop a human-excavator interface for the haptic-controlled excavator that makes use of multiple human sensing modalities (visual, auditory, and haptic) and efficiently integrates these modalities to ensure an intuitive, efficient interface that is easy to learn and use and is responsive to operator commands. Two empirical studies were conducted to investigate conflict in the haptic-controlled excavator interface and to identify the level of force feedback that yields the best operator performance.