156 research outputs found

    Human-human haptic collaboration in cyclical Fitts' tasks

    Get PDF
    Understanding how humans assist each other in haptic interaction teams could lead to improved robotic aids to solo human dextrous manipulation. Inspired by experiments reported in Reed et al. (2004), which suggested two-person haptically interacting teams could achieve a lower movement time (MT) than individuals for discrete aiming movements of specified accuracy, we report that two-person teams (dyads) can also achieve lower MT for cyclical, continuous aiming movements. We propose a model, called endpoint compromise, for how the intended endpoints of both subjects' motion combine during haptic interaction; it predicts a ratio of √2 between the slopes of MT fits for individuals and dyads. This slope ratio prediction is supported by our data.
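
    The √2 slope prediction can be illustrated with a small numerical sketch of Fitts' law, MT = a + b·ID with ID = log2(2D/W). The intercept and slope values below are hypothetical, chosen only to show how the endpoint-compromise model rescales the dyad slope; they are not the paper's fitted parameters.

```python
import math

def movement_time(index_of_difficulty, a, b):
    """Fitts' law: MT = a + b * ID, with ID = log2(2D/W) in bits."""
    return a + b * index_of_difficulty

# Hypothetical parameters for an individual performer (not from the paper).
a, b_individual = 0.2, 0.15           # intercept [s], slope [s/bit]
b_dyad = b_individual / math.sqrt(2)  # endpoint-compromise prediction

# Two example target conditions: (distance D, width W) in metres.
ids = [math.log2(2 * d / w) for d, w in [(0.10, 0.02), (0.20, 0.01)]]
for ID in ids:
    mt_solo = movement_time(ID, a, b_individual)
    mt_dyad = movement_time(ID, a, b_dyad)
    print(f"ID={ID:.2f} bits: solo {mt_solo:.3f} s, dyad {mt_dyad:.3f} s")

# The ratio of the fitted slopes recovers the predicted sqrt(2).
print(b_individual / b_dyad)
```

    Because only the slope is divided by √2, the dyad advantage grows with task difficulty (higher ID), which is what a slope-ratio prediction implies.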

    Evaluation of Haptic and Visual Cues for Repulsive or Attractive Guidance in Nonholonomic Steering Tasks.

    Get PDF
    Remote control of vehicles is a difficult task for operators. Support systems that present additional task information may assist operators, but their usefulness is expected to depend on several factors such as 1) the nature of the conveyed information, 2) the modality through which it is conveyed, and 3) the task difficulty. In an exploratory experiment, these three factors were manipulated to quantify their effects on operator behavior. Subjects (n = 15) used a haptic manipulator to steer a virtual nonholonomic vehicle through abstract environments in which obstacles needed to be avoided. Both a simple support conveying near-future predictions of the trajectory of the vehicle and a more elaborate support that continuously suggests the path to be taken were designed (factor 1). These types of information were offered either with visual or haptic cues (factor 2). These four support systems were tested in four different abstracted environments with decreasing amounts of allowed variability in realized trajectories (factor 3). The results show improvements for the simple support only when this information was presented visually, but not when offered haptically. For the elaborate support, equally large improvements for both modalities were found. This suggests that the elaborate support is better: additional information is key to improving performance in nonholonomic steering tasks.

    Robots Taking Initiative in Collaborative Object Manipulation: Lessons from Physical Human-Human Interaction

    Full text link
    Physical Human-Human Interaction (pHHI) involves the use of multiple sensory modalities. Studies of communication through spoken utterances and gestures are well established. Nevertheless, communication through force signals is not well understood. In this paper, we focus on investigating the mechanisms employed by humans during negotiation through force signals, which is an integral part of successful collaboration. Our objective is to use these insights to inform the design of controllers for robot assistants. Specifically, we want to enable robots to take the lead in collaboration. To achieve this goal, we conducted a study to observe how humans behave during collaborative manipulation tasks. During our preliminary data analysis, we discovered several new features that help us better understand how the interaction progresses. From these features, we identified distinct patterns in the data that indicate when a participant is expressing their intent. Our study provides valuable insight into how humans collaborate physically, which can help us design robots that behave more like humans in such scenarios.

    Model-Augmented Haptic Telemanipulation: Concept, Retrospective Overview, and Current Use Cases

    Get PDF
    Certain telerobotic applications, including telerobotics in space, pose particularly demanding challenges to both technology and humans. Traditional bilateral telemanipulation approaches often cannot be used in such applications due to technical and physical limitations such as long and varying delays, packet loss, and limited bandwidth, as well as high reliability, precision, and task duration requirements. In order to close this gap, we research model-augmented haptic telemanipulation (MATM), which uses two kinds of models: a remote model that enables shared autonomous functionality of the teleoperated robot, and a local model that aims to generate assistive augmented haptic feedback for the human operator. Several technological methods that form the backbone of the MATM approach have already been successfully demonstrated in accomplished telerobotic space missions. On this basis, we have applied our approach in more recent research to applications in the fields of orbital robotics, telesurgery, caregiving, and telenavigation. In the course of this work, we have advanced specific aspects of the approach that were of particular importance for each respective application, especially shared autonomy and haptic augmentation. This overview paper discusses the MATM approach in detail, presents the latest research results of the various technologies encompassed within this approach, provides a retrospective of DLR's telerobotic space missions, demonstrates the broad application potential of MATM based on the aforementioned use cases, and outlines lessons learned and open challenges.

    A Virtual Reality Platform for Musical Creation

    No full text
    Virtual reality aims at interacting with a computer in a form similar to interacting with an object of the real world. This research presents a VR platform allowing the user (1) to interactively create physically-based musical instruments and sounding objects, and (2) to play them in real time using multisensory interaction by way of haptics, 3D visualisation during playing, and real-time physically-based sound synthesis. In doing so, our system presents the two main properties expected in VR systems: the possibility of designing any type of object and manipulating it in a multisensory real-time fashion. In presenting our environment, we discuss the underlying scientific questions: (1) concerning the real-time simulation, how to manage simultaneous audio-haptic-visual cooperation during real-time multisensory simulations, and (2) the Computer Aided Design functionalities for the creation of new physically-based musical instruments and sounding objects.

    The need for combining implicit and explicit communication in cooperative robotic systems

    Get PDF
    As the number of robots used in warehouses and manufacturing increases, so too does the need for robots to be able to manipulate objects, not only independently, but also in collaboration with humans and other robots. Our ability to effectively coordinate our actions with fellow humans encompasses several behaviours that are collectively referred to as joint action, and has inspired advances in human-robot interaction by leveraging our natural ability to interpret implicit cues. However, our capacity to efficiently coordinate on object manipulation tasks remains an advantageous process that is yet to be fully exploited in robotic applications. Humans achieve this form of coordination by combining implicit communication (where information is inferred) and explicit communication (direct communication through an established channel) in varying degrees according to the task at hand. Although these two forms of communication have previously been implemented in robotic systems, no system exists that integrates the two in a task-dependent adaptive manner. In this paper, we review existing work on joint action in human-robot interaction, and analyse the state of the art in robot-robot interaction that could act as a foundation for future cooperative object manipulation approaches. We identify key mechanisms that must be developed in order for robots to collaborate more effectively, with other robots and humans, on object manipulation tasks in shared autonomy spaces.

    Virtual reality training for micro-robotic cell injection

    Full text link
    This research was carried out to fill a gap in existing knowledge on approaches to supplementing training for the micro-robotic cell injection procedure by utilising virtual reality and haptic technologies.

    Finally! Insights into the ARCHES Lunar Planetary Exploration Analogue Campaign on Etna in summer 2022

    Get PDF
    This paper summarises the first outcomes of the space demonstration mission of the ARCHES project, which was performed this year from 13 June until 10 July on Italy's Mt. Etna in Sicily. After the second COVID-related postponement of the campaign initially planned for 2020, we are now very happy to report that the whole four-week campaign, with more than 65 participants, has been successfully conducted. In this short overview paper, we refer to all other related publications here at IAC22. The paper includes an overview of the performed four-week campaign, the achieved mission goals, and first results, and also shares our findings on the organisational and planning aspects.

    Interpersonal Synergies

    Get PDF
    We present the perspective that interpersonal movement coordination results from establishing interpersonal synergies. Interpersonal synergies are higher-order control systems formed by coupling movement system degrees of freedom of two (or more) actors. Characteristic features of synergies identified in studies of intrapersonal coordination – dimensional compression and reciprocal compensation – are revealed in studies of interpersonal coordination that applied the uncontrolled manifold approach and principal component analysis to interpersonal movement tasks. Broader implications of the interpersonal synergy approach for movement science include an expanded notion of mechanism and an emphasis on interaction-dominant dynamics.
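
    Dimensional compression and reciprocal compensation can be illustrated with a minimal synthetic sketch (not the authors' analysis): two degrees of freedom, one per actor, vary freely as individuals while their sum, a hypothetical task variable, stays tightly stabilised. A principal component analysis of such data concentrates nearly all variance in a single component.

```python
import math
import random

random.seed(0)

# Two coupled degrees of freedom (e.g., one per actor): their sum (the
# task-relevant variable) is stabilised, while each varies freely otherwise.
samples = []
for _ in range(2000):
    free = random.gauss(0, 1.0)   # large variance along the "uncontrolled" direction
    task = random.gauss(0, 0.05)  # tightly stabilised task-relevant deviation
    samples.append((free + task, -free + task))  # x1 + x2 = 2 * task (small)

# Sample means and 2x2 covariance matrix.
n = len(samples)
m1 = sum(x for x, _ in samples) / n
m2 = sum(y for _, y in samples) / n
c11 = sum((x - m1) ** 2 for x, _ in samples) / n
c22 = sum((y - m2) ** 2 for _, y in samples) / n
c12 = sum((x - m1) * (y - m2) for x, y in samples) / n

# Eigenvalues of [[c11, c12], [c12, c22]] are the principal component variances.
tr, det = c11 + c22, c11 * c22 - c12 ** 2
root = math.sqrt(tr * tr / 4 - det)
lam_big, lam_small = tr / 2 + root, tr / 2 - root

# Dimensional compression: one component captures nearly all the variance.
print(lam_big / (lam_big + lam_small))
```

    The strong negative covariance between the two degrees of freedom is the signature of reciprocal compensation: when one actor deviates, the other compensates so the shared task variable is preserved.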

    Development and Biodynamic Simulation of a Detailed Musculo-Skeletal Spine Model

    Get PDF