1,147 research outputs found

    Identification of Haptic Based Guiding Using Hard Reins

    This paper presents the identification of a human-human interaction in which one person with limited auditory and visual perception of the environment (a follower) is guided by an agent with full perceptual capabilities (a guider) via a hard rein along a given path. We investigate several aspects of the interaction between the guider and the follower, such as computational models that map states of the follower to actions of the guider, and the computational basis on which the guider modulates the force on the rein in response to the follower's trust level. Based on experimental system identification of human demonstrations, we show that the guider and the follower learn optimal, stable, state-dependent novel 3rd- and 2nd-order auto-regressive predictive and reactive control policies, respectively. By modeling the follower's dynamics as a time-varying virtual damped inertial system, we found that the coefficient of virtual damping is most appropriate to explain the follower's trust level at any given time. Moreover, we demonstrate the stability of the extracted guiding policy when implemented on a planar 1-DoF robotic arm. Our findings provide a theoretical basis for designing advanced human-robot interaction algorithms applicable to a variety of situations where a human requires the assistance of a robot to perceive the environment.
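    As a rough illustration of the virtual damped inertial idea described above (not the authors' identification code), the Python sketch below estimates a time-varying virtual damping coefficient b(t) from sampled rein force and follower velocity, assuming a model of the form m*dv/dt + b(t)*v = F(t); the function name, window size, mass value, and synthetic data are assumptions made only for this sketch.

    import numpy as np

    def estimate_virtual_damping(force, velocity, dt, mass=1.0, window=50):
        """Estimate a time-varying virtual damping coefficient b(t), assuming
        the damped inertial model  m * dv/dt + b(t) * v = F(t).
        Illustrative sketch only, not the authors' identification procedure."""
        accel = np.gradient(velocity, dt)          # numerical dv/dt
        residual = force - mass * accel            # b(t) * v = F - m * dv/dt
        b = np.full_like(force, np.nan)
        for k in range(window, len(force)):
            v_win = velocity[k - window:k]
            r_win = residual[k - window:k]
            denom = np.dot(v_win, v_win)
            if denom > 1e-9:                       # skip near-zero velocity windows
                b[k] = np.dot(v_win, r_win) / denom  # least-squares slope over the window
        return b

    # Synthetic example: damping decays over time, as a rising "trust" level might imply.
    if __name__ == "__main__":
        dt, n = 0.01, 2000
        t = np.arange(n) * dt
        true_b = 5.0 - 2.0 * (t / t[-1])           # slowly decaying virtual damping
        v = 0.5 + 0.1 * np.sin(0.5 * t)            # smooth follower velocity
        F = 1.0 * np.gradient(v, dt) + true_b * v  # forces consistent with the model
        b_hat = estimate_virtual_damping(F, v, dt, mass=1.0)
        print(np.nanmean(np.abs(b_hat[100:] - true_b[100:])))  # small tracking error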

    Wearable haptic systems for the fingertip and the hand: taxonomy, review and perspectives

    In the last decade, we have witnessed a drastic change in the form factor of audio and vision technologies, from heavy and grounded machines to lightweight devices that naturally fit our bodies. Only recently, however, have haptic systems started to be designed with wearability in mind. The wearability of haptic systems enables novel forms of communication, cooperation, and integration between humans and machines. Wearable haptic interfaces are capable of communicating with their human wearers during interaction with the environment they share, in a natural and yet private way. This paper presents a taxonomy and review of wearable haptic systems for the fingertip and the hand, focusing on those systems that directly address wearability challenges. The paper also discusses the main technological and design challenges for the development of wearable haptic interfaces, and it reports on the future perspectives of the field. Finally, the paper includes two tables summarizing the characteristics and features of the most representative wearable haptic systems for the fingertip and the hand.

    A framework for robotized teleoperated tasks

    "Premio al mejor artículo presentado en ROBOT 2011" atorgat pel Grupo de Robótica, Visión y Control de la Universidad de Sevilla, la Universidad Pablo Olavide i el Centro Avanzado de Tecnologías Aeroespaciales.Teleoperation systems allow the extension of the human operator’s sensing and manipulative capability into a remote environment to perform tasks at a distance, but the time-delays in the communications affect the stability and transparency of such systems. This work presents a teleoperation framework in which some novel tools, such as nonlinear controllers, relational positioning techniques, haptic guiding and augmented reality, are used to increase the sensation of immersion of the human operator in the remote site. Experimental evidence supports the advantages of the proposed framework.Award-winningPostprint (published version

    Haptics: Science, Technology, Applications

    This open access book constitutes the proceedings of the 12th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2020, held in Leiden, The Netherlands, in September 2020. The 60 papers presented in this volume were carefully reviewed and selected from 111 submissions. They were organized in topical sections on haptic science, haptic technology, and haptic applications. This year's focus is on accessibility.

    Whole-Body Dynamic Telelocomotion: A Step-to-Step Dynamics Approach to Human Walking Reference Generation

    Teleoperated humanoid robots hold significant potential as physical avatars for humans in hazardous and inaccessible environments, with the goal of channeling human intelligence and sensorimotor skills through these robotic counterparts. Precise coordination between humans and robots is crucial for accomplishing whole-body behaviors involving locomotion and manipulation. To progress successfully, dynamic synchronization between humans and humanoid robots must be achieved. This work builds on advancements in whole-body dynamic telelocomotion, addressing challenges in robustness. By embedding the hybrid and underactuated nature of bipedal walking into a virtual human walking interface, we achieve dynamically consistent walking gait generation. Additionally, we integrate a reactive robot controller into a whole-body dynamic telelocomotion framework, allowing the realization of telelocomotion behaviors on the full-body dynamics of a bipedal robot. Real-time telelocomotion simulation experiments validate the effectiveness of our methods, demonstrating that a trained human pilot can dynamically synchronize with a simulated bipedal robot, achieving sustained locomotion, controlling walking speeds within the range of 0.0 m/s to 0.3 m/s, and enabling backward walking for distances of up to 2.0 m. This research contributes to advancing teleoperated humanoid robots and paves the way for future developments in synchronized locomotion between humans and bipedal robots.
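    A common way to express step-to-step walking dynamics, which may differ from the authors' exact formulation, is the Linear Inverted Pendulum (LIP) step-to-step map sketched below in Python: the center-of-mass state at one step transition plus a chosen footstep determine the state at the next transition, and a simple foot-placement rule can regulate walking speed. The parameter values, names, and the deadbeat-style placement rule are assumptions made for illustration only.

    import numpy as np

    def lip_step_to_step(x0, v0, foot, step_time, com_height=0.9, g=9.81):
        """Propagate the LIP center-of-mass state over one step with the stance
        foot at `foot`, returning the state at the next step transition.
        Illustrative step-to-step dynamics map, not the paper's model."""
        lam = np.sqrt(g / com_height)                  # LIP natural frequency
        c, s = np.cosh(lam * step_time), np.sinh(lam * step_time)
        x_next = foot + (x0 - foot) * c + (v0 / lam) * s   # closed-form LIP solution
        v_next = lam * (x0 - foot) * s + v0 * c
        return x_next, v_next

    def foot_placement_for_speed(x, v, desired_speed, step_time, com_height=0.9, g=9.81):
        """Choose a footstep so the post-step velocity matches a desired walking
        speed (simple deadbeat-style rule with hypothetical parameters)."""
        lam = np.sqrt(g / com_height)
        c, s = np.cosh(lam * step_time), np.sinh(lam * step_time)
        # Solve v_next = lam*(x - foot)*s + v*c for foot, given v_next = desired_speed.
        return x - (desired_speed - v * c) / (lam * s)

    # Example: regulate walking speed to 0.3 m/s over a few steps.
    x, v, T = 0.0, 0.0, 0.4
    for _ in range(6):
        foot = foot_placement_for_speed(x, v, 0.3, T)
        x, v = lip_step_to_step(x, v, foot, T)
        print(f"step-end position {x:+.3f} m, velocity {v:+.3f} m/s")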

    Interactive molecular dynamics in virtual reality from quantum chemistry to drug binding: An open-source multi-person framework

    As molecular scientists have made progress in their ability to engineer nanoscale molecular structure, we face new challenges in our ability to engineer molecular dynamics (MD) and flexibility. Dynamics at the molecular scale differs from the familiar mechanics of everyday objects because it involves a complicated, highly correlated, and three-dimensional many-body dynamical choreography which is often nonintuitive even for highly trained researchers. We recently described how interactive molecular dynamics in virtual reality (iMD-VR) can help to meet this challenge, enabling researchers to manipulate real-time MD simulations of flexible structures in 3D. In this article, we outline various efforts to extend immersive technologies to the molecular sciences, and we introduce "Narupa," a flexible, open-source, multi-person iMD-VR software framework which enables groups of researchers to simultaneously cohabit real-time simulation environments to interactively visualize and manipulate the dynamics of molecular structures with atomic-level precision. We outline several application domains where iMD-VR is facilitating research, communication, and creative approaches within the molecular sciences, including training machines to learn potential energy functions, biomolecular conformational sampling, protein-ligand binding, reaction discovery using "on-the-fly" quantum chemistry, and transport dynamics in materials. We touch on iMD-VR's various cognitive and perceptual affordances and outline how these provide research insight for molecular systems. By synergistically combining human spatial reasoning and design insight with computational automation, technologies such as iMD-VR have the potential to improve our ability to understand, engineer, and communicate microscopic dynamical behavior, offering the potential to usher in a new paradigm for engineering molecules and nano-architectures.
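    The core interactive-MD idea of blending a user-applied force into the physical force field at every integration step can be sketched as follows. This is an illustrative Python example under assumed names and parameters, not Narupa's actual API or integrator.

    import numpy as np

    def interactive_md_step(pos, vel, forces_fn, user_force, user_atom, dt=1e-3, mass=1.0):
        """One velocity-Verlet step with an interactively applied user force added
        to the physical force field -- a minimal sketch of the iMD idea."""
        def total_force(p):
            f = forces_fn(p)
            f[user_atom] += user_force          # blend the user's pull into the field
            return f
        f0 = total_force(pos)
        pos_new = pos + vel * dt + 0.5 * (f0 / mass) * dt ** 2
        f1 = total_force(pos_new)
        vel_new = vel + 0.5 * ((f0 + f1) / mass) * dt
        return pos_new, vel_new

    # Example: two atoms joined by a harmonic bond; the user tugs atom 0 along +x.
    def spring_forces(p, k=100.0, r0=1.0):
        d = p[1] - p[0]
        dist = np.linalg.norm(d)
        f = k * (dist - r0) * d / dist          # restoring force along the bond
        return np.array([f, -f])

    pos = np.array([[0.0, 0.0, 0.0], [1.2, 0.0, 0.0]])
    vel = np.zeros_like(pos)
    for _ in range(100):
        pos, vel = interactive_md_step(pos, vel, spring_forces, np.array([0.5, 0.0, 0.0]), 0)
    print(pos)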