2 research outputs found

    Development and assessment of a contactless 3D joystick approach to industrial manipulator gesture control

    This paper explores a novel ergonomic gesture-control design with visual feedback for the UR3 collaborative robot, aimed at allowing users with little to no familiarity with robots to complete basic tasks and programming. The principle behind the design mirrors that of a 3D joystick but utilises the Leap Motion device to track the user's hands, removing any need for a physical joystick or buttons. The Rapid Upper Limb Assessment (RULA) ergonomic tool was used to inform the design and ensure the system was safe for long-term use. The developed system was assessed with the RULA tool for an ergonomic score and through an experiment in which 19 voluntary participants completed a basic task with both the gesture system and the UR3's Robot Teach Pendant (RTP), then filled out System Usability Scale (SUS) questionnaires to compare the usability of the two systems. The task involved controlling the robot to pick up a pipe and insert it into a series of slots of decreasing diameter, allowing both the speed and the accuracy of each system to be compared. The experiment found that even participants with no previous robot experience were able to complete the task after only a brief description of how the gesture system works. Despite beating the RTP's ergonomic score, the system narrowly lost on average usability scores. However, as a contactless gesture system it has other advantages over the RTP, and the experiment identified many potential improvements, paving the way for future work on assessing the significance of the visual feedback and comparing this system against other gesture-based systems.
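
    The abstract gives only the principle, not an implementation, but the 3D-joystick idea can be sketched: treat displacement of the tracked palm from a neutral point as a spring-centred joystick deflection and map it to a clamped Cartesian jog velocity. The sketch below is a minimal illustration under stated assumptions, not the authors' code: `get_palm_position()` is a hypothetical stand-in for a hand-tracking call, and the robot side assumes a UR controller accepting URScript (e.g. `speedl`) on TCP port 30002; the dead zone, gain, and speed limit are invented values.

```python
# Minimal sketch of a 3D-joystick style gesture mapping (illustrative only).
# Assumptions: a hand tracker exposing palm position in metres, and a UR
# controller accepting URScript on TCP port 30002. Not the paper's code.
import socket
import time

DEAD_ZONE = 0.02   # metres of hand travel ignored around the neutral point
GAIN = 2.0         # hand displacement (m) -> tool velocity (m/s)
MAX_SPEED = 0.25   # safety clamp on commanded Cartesian speed (m/s)

def get_palm_position():
    """Placeholder for a hand-tracking call (e.g. one Leap Motion frame).
    Returns (x, y, z) of the palm in metres, in the tracker's frame."""
    raise NotImplementedError("wire up your hand tracker here")

def displacement_to_velocity(palm, neutral):
    """Treat the hand like a spring-centred joystick: displacement from a
    neutral point maps to a clamped Cartesian velocity, with a dead zone."""
    vel = []
    for p, n in zip(palm, neutral):
        d = p - n
        if abs(d) < DEAD_ZONE:
            d = 0.0
        vel.append(max(-MAX_SPEED, min(MAX_SPEED, GAIN * d)))
    return vel

def jog_robot(host, neutral=(0.0, 0.2, 0.0), period=0.05):
    with socket.create_connection((host, 30002)) as sock:
        while True:
            vx, vy, vz = displacement_to_velocity(get_palm_position(), neutral)
            # speedl jogs the tool at a Cartesian velocity for a short time,
            # so the robot stops if no fresh command arrives.
            sock.sendall(f"speedl([{vx},{vy},{vz},0,0,0], 0.5, {period})\n".encode())
            time.sleep(period)
```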

    Expressive feedback from virtual buttons

    The simple action of pressing a button is a multimodal interaction with an interesting depth of complexity. As computer interfaces that support 3D tasks evolve, there is a need to better understand how users will interact with virtual buttons that generate feedback across multiple sensory modalities. This research examined the effects of visual, auditory, and haptic feedback from virtual buttons on task performance when dialing phone numbers and on the motion of individual button presses. It also presents a theoretical framework for virtual button feedback and a model of virtual button feedback that includes touch feedback hysteresis. The results suggest that although haptic feedback alone was not enough to prevent participants from pressing the button farther than necessary, bimodal and trimodal feedback combinations that included haptic feedback shortened the depth of the presses. However, the shallower presses observed during trimodal feedback may have led to a counterintuitive increase in the number of digits that participants omitted during the task. Even though interaction with virtual buttons may appear simple, it is important to understand the complexities behind the multimodal interaction because users will seek out the multimodal interactions they prefer.
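
    The abstract names touch feedback hysteresis without defining it; the general idea is that a virtual button's press event triggers at a deeper fingertip depth than its release event, so jitter around a single threshold cannot rapidly toggle the button. The sketch below illustrates only that general mechanism, not the paper's model; the class name, thresholds, and depths are invented for illustration.

```python
# Rough illustration of press/release hysteresis for a virtual button.
# The depth thresholds are invented; the paper's actual model may differ.
class VirtualButton:
    def __init__(self, press_depth=0.008, release_depth=0.004):
        # Press fires deeper than release, so jitter near one threshold
        # cannot rapidly toggle the button state.
        assert press_depth > release_depth
        self.press_depth = press_depth
        self.release_depth = release_depth
        self.pressed = False

    def update(self, finger_depth):
        """finger_depth: how far (in metres) the fingertip has travelled
        into the button. Returns 'press', 'release', or None."""
        if not self.pressed and finger_depth >= self.press_depth:
            self.pressed = True
            return "press"
        if self.pressed and finger_depth <= self.release_depth:
            self.pressed = False
            return "release"
        return None

# Usage: feed per-frame fingertip depths; each event fires once per crossing.
button = VirtualButton()
for depth in [0.000, 0.006, 0.009, 0.007, 0.005, 0.003]:
    event = button.update(depth)
    if event:
        print(event)  # -> "press" at 0.009, "release" at 0.003
```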