2,985 research outputs found

    On the Collaboration of an Automatic Path-Planner and a Human User for Path-Finding in Virtual Industrial Scenes

    This paper describes a global interactive framework that enables an automatic path-planner and a human user to collaborate on finding a path in cluttered virtual environments. First, a collaborative architecture involving both the user and the planner is described. Then, for real-time operation, a motion planner divided into several steps is presented. A preliminary workspace discretization is performed, without time limitations, at the beginning of the simulation. Using these pre-computed data, a second algorithm then finds a collision-free path in real time. Once the path is found, haptic artificial guidance along the path is provided to the user. The user can influence the planner by deviating from the path, which automatically triggers a new path search. Performance is measured on tests based on assembly simulations in CAD scenes.
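    The abstract describes a two-phase scheme: an offline workspace discretization done once at startup, followed by a real-time collision-free path query over the pre-computed data. As a rough illustration only, the Python sketch below assumes a 2D occupancy grid and an A* query; the paper's actual 3D CAD discretization, data structures, and planning algorithm are not specified in the abstract, so every name and parameter here is hypothetical.

```python
import heapq

def discretize(obstacles, width, height):
    """Offline phase: pre-compute the free cells of a (hypothetical) 2D grid.

    Runs once at simulation start, without real-time constraints.
    """
    return {(x, y)
            for x in range(width) for y in range(height)} - obstacles

def find_path(free_cells, start, goal):
    """Online phase: real-time A* query over the pre-computed free cells."""
    def h(p):  # Manhattan-distance heuristic (admissible on a 4-connected grid)
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    frontier = [(h(start), 0, start, [start])]
    visited = set()
    while frontier:
        _, cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        x, y = node
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nxt in free_cells and nxt not in visited:
                heapq.heappush(frontier,
                               (cost + 1 + h(nxt), cost + 1, nxt, path + [nxt]))
    return None  # no collision-free path; the framework would re-plan
```

    In the paper's collaborative loop, the user deviating from the returned path would simply trigger another call to the online query with an updated start configuration.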

    Learning to Navigate Cloth using Haptics

    We present a controller that allows an arm-like manipulator to navigate deformable cloth garments in simulation through the use of haptic information. The main challenge for such a controller is to avoid getting tangled in, tearing, or punching through the deforming cloth. Our controller aggregates force information from a number of haptic-sensing spheres placed all along the manipulator for guidance. Based on the haptic forces, each individual sphere updates its target location, and the conflicts that arise between this set of desired positions are resolved by solving an inverse kinematics problem with constraints. Reinforcement learning is used to train the controller for a single haptic-sensing sphere, where a training run is terminated (and thus penalized) when large forces are detected due to contact between the sphere and a simplified model of the cloth. In simulation, we demonstrate successful navigation of a robotic arm through a variety of garments, including an isolated sleeve, a jacket, a shirt, and shorts. Our controller outperforms two baseline controllers: one without haptics and another that was trained on large forces between the sphere and the cloth, but without early termination.

    Comment: Supplementary video available at https://youtu.be/iHqwZPKVd4A. Related publications: http://www.cc.gatech.edu/~karenliu/Robotic_dressing.htm
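    As a loose illustration of the control loop sketched in the abstract (per-sphere target updates from haptic forces, plus early termination on large contact forces), the Python fragment below uses an assumed force threshold and a hand-written force-following target update; the actual per-sphere controller in the paper is learned with reinforcement learning, and the constrained inverse-kinematics reconciliation step is only indicated by a comment here.

```python
import numpy as np

FORCE_LIMIT = 5.0  # assumed termination threshold; the paper's value is not given
GAIN = 0.05        # assumed gain mapping contact force to target displacement

def update_sphere_target(target, haptic_force):
    """Nudge one haptic-sensing sphere's target away from the sensed force.

    Stand-in for the learned per-sphere policy described in the abstract.
    """
    return target - GAIN * haptic_force

def control_step(targets, forces):
    """One step of the aggregate controller, with early termination."""
    if max(np.linalg.norm(f) for f in forces) > FORCE_LIMIT:
        return None, -1.0  # large contact force: terminate (and penalize) the run
    new_targets = [update_sphere_target(t, f) for t, f in zip(targets, forces)]
    # The conflicts between these per-sphere targets would be resolved here
    # by a constrained inverse-kinematics solve over the whole manipulator.
    return new_targets, 0.0
```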

    Virtual and Mixed Reality in Telerobotics: A Survey


    Toward future 'mixed reality' learning spaces for STEAM education

    Digital technology is becoming more integrated into modern society. As this happens, technologies including augmented reality, virtual reality, 3D printing, and user-supplied mobile devices (collectively referred to here as mixed reality) are increasingly touted as likely to become part of the classroom and learning environment. In the discipline areas of STEAM education, experts are expected to be at the forefront of technology and of how it might fit into their classrooms. This is especially important because educators increasingly find themselves surrounded by new learners who expect to be engaged with participatory, interactive, sensory-rich, experimental activities offering greater opportunities for student input and creativity. This paper explores learner and academic perspectives on mixed-reality case studies in 3D spatial design (multimedia and architecture), paramedic science, and information technology, drawing on existing data as well as additional one-on-one interviews about the use of mixed reality in the classroom. Results show that mixed reality can provide engagement, critical-thinking, and problem-solving benefits for students in line with this new generation of learners, but also demonstrate that more work needs to be done to refine mixed-reality solutions for the classroom.

    DandelionTouch: High Fidelity Haptic Rendering of Soft Objects in VR by a Swarm of Drones

    To achieve high-fidelity haptic rendering of soft objects in a high-mobility virtual environment, we propose DandelionTouch, a novel haptic display in which tactile actuators are delivered to the user's fingertips by a swarm of drones. Users of DandelionTouch can experience tactile feedback in a large space that is not limited by a device's working area and, importantly, do not experience muscle fatigue during long interactions with virtual objects. Hand tracking and a swarm-control algorithm allow the user to guide the swarm with hand motions while avoiding collisions inside the formation. Several topologies of impedance connection between swarm units were investigated in this research. An experiment in which drones performed a point-following task on a square trajectory in real time revealed that drones connected in a star topology tracked the trajectory with low mean positional error (RMSE decreased by 20.6% in comparison with the other impedance topologies and by 40.9% in comparison with potential-field-based swarm control). The velocities achieved by the drones in all formations with impedance behavior were 28% higher than for the swarm controlled with the potential-field algorithm. Additionally, the perception of several vibrotactile patterns was evaluated in a user study with 7 participants. The study showed that the proposed combination of temporal delay and frequency modulation allows users to recognize surface properties and motion direction in VR simultaneously (mean recognition rate of 70%, maximum of 93%). DandelionTouch suggests a new type of haptic feedback for VR systems in which no hand-held or wearable interface is required.

    Comment: Accepted to the 2022 IEEE International Conference on Systems, Man, and Cybernetics (SMC).
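    The impedance connections between swarm units that the abstract compares (star versus other topologies) can be pictured as virtual spring-damper links between drones. The Python sketch below is a minimal illustration under that assumption; the gains and the star link list are invented for the example and are not taken from the paper.

```python
import numpy as np

K_SPRING, K_DAMPER = 4.0, 1.2  # invented coupling gains, not from the paper

def impedance_accel(pos, vel, links):
    """Acceleration on each drone from virtual spring-damper couplings."""
    acc = np.zeros_like(pos)
    for i, j in links:
        force = K_SPRING * (pos[j] - pos[i]) + K_DAMPER * (vel[j] - vel[i])
        acc[i] += force   # pulls drone i toward drone j
        acc[j] -= force   # equal and opposite reaction on drone j
    return acc

# Star topology: drone 0 is the leader, drones 1..3 couple to it directly.
star_links = [(0, 1), (0, 2), (0, 3)]
pos = np.zeros((4, 3))
vel = np.zeros((4, 3))
pos[0] = np.array([0.5, 0.0, 0.0])  # leader displaced toward the hand target
print(impedance_accel(pos, vel, star_links))
```

    Under this picture, a star topology routes all coupling through a single leader unit, which is the configuration the abstract reports as tracking the square trajectory with the lowest RMSE.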