2,990 research outputs found

    Haptic negotiation and role exchange for collaboration in virtual environments

    We investigate how collaborative guidance can be realized in multi-modal virtual environments for dynamic tasks involving motor control. Haptic guidance in our context can be defined as any form of force/tactile feedback that the computer generates to help a user execute a task in a faster, more accurate, and subjectively more pleasing fashion. In particular, we are interested in determining guidance mechanisms that best facilitate task performance and arouse a natural sense of collaboration. We suggest that a haptic guidance system can be further improved if it is supplemented with a role exchange mechanism, which allows the computer to adjust the forces it applies to the user in response to his/her actions. Recent work on collaboration and role exchange has presented new perspectives on defining roles and interaction. However, existing approaches mainly focus on relatively basic environments where the state of the system can be defined with a few parameters. We designed and implemented a complex and highly dynamic multimodal game for testing our interaction model. Since the state space of our application is complex, role exchange needs to be implemented carefully. We defined a novel negotiation process, which facilitates dynamic communication between the user and the computer, and realizes the exchange of roles using a three-state finite state machine. Our preliminary results indicate that even though the negotiation and role exchange mechanism we adopted does not improve performance on every evaluation criterion, it introduces a more personal and human-like interaction model.
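The abstract names a three-state finite state machine but does not enumerate the states or transition triggers. A minimal sketch, assuming the states are human-led, negotiating, and computer-led (state names and events are illustrative, not taken from the paper):

```python
# Three-state FSM sketch for haptic role exchange. State names and
# transition events are assumptions for illustration only.

HUMAN_LEAD, NEGOTIATING, COMPUTER_LEAD = "human_lead", "negotiating", "computer_lead"

# TRANSITIONS[state][event] -> next state
TRANSITIONS = {
    HUMAN_LEAD:    {"user_relinquishes": NEGOTIATING},
    NEGOTIATING:   {"agreement": COMPUTER_LEAD, "timeout": HUMAN_LEAD},
    COMPUTER_LEAD: {"user_resists": NEGOTIATING},
}

def step(state, event):
    """Advance the FSM; events undefined for the current state leave it unchanged."""
    return TRANSITIONS.get(state, {}).get(event, state)
```

The defaulting in `step` reflects the negotiation idea: the system only changes roles on an explicit, recognized signal from the user, never spontaneously.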

    Intention recognition for dynamic role exchange in haptic collaboration

    In human-computer collaboration involving haptics, a key issue that remains to be solved is establishing intuitive communication between the partners. Even though computers are widely used to aid human operators in teleoperation, guidance, and training, their ability to improve efficiency and effectiveness in dynamic tasks is limited because they lack the adaptability, versatility, and awareness of a human. We suggest that the communication between a human and a computer can be improved if it involves a decision-making process in which the computer is programmed to infer the intentions of the human operator and dynamically adjust the control levels of the interacting parties to facilitate a more intuitive interaction setup. In this paper, we investigate the utility of such a dynamic role exchange mechanism, where partners negotiate through the haptic channel to trade their control levels on a collaborative task. We examine the energy consumption, the work done on the manipulated object, and the joint efficiency in addition to the task performance. We show that, compared to an equal control condition, a role exchange mechanism improves task performance and the joint efficiency of the partners. We also show that augmenting the system with additional informative visual and vibrotactile cues, which are used to display the state of interaction, allows the users to become aware of the underlying role exchange mechanism and utilize it in favor of the task. These cues also improve the users' sense of interaction and reinforce their belief that the computer aids in the execution of the task. © 2013 IEEE

    Haptic-Guided Teleoperation of a 7-DoF Collaborative Robot Arm With an Identical Twin Master

    In this article, we describe two techniques to enable haptic-guided teleoperation using 7-DoF cobot arms as master and slave devices. A shortcoming of using cobots as master-slave systems is the lack of force feedback at the master side. However, recent developments in cobot technologies have brought affordable, flexible, and safe torque-controlled robot arms, which can be programmed to generate force feedback that mimics the operation of a haptic device. In this article, we use two Franka Emika Panda robot arms as a twin master-slave system to enable haptic-guided teleoperation. We propose a two-layer mechanism to implement force feedback due to 1) object interactions in the slave workspace, and 2) virtual forces, e.g., those that repel the user from static obstacles in the remote environment or provide task-related guidance. We present two different approaches for force rendering and conduct an experimental study to evaluate the performance and usability of these approaches in comparison to teleoperation without haptic guidance. Our results indicate that the proposed joint torque coupling method for rendering task forces reduces energy requirements during haptic-guided telemanipulation, providing realistic force feedback by accurately matching the slave torque readings at the master side.
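The two-layer idea above can be sketched in a few lines. This is a schematic, not the Franka control API: layer 1 reflects the measured external joint torques of the slave at the master, and layer 2 maps virtual Cartesian guidance forces (e.g., obstacle repulsion) into joint torques through the master's Jacobian. All names, gains, and signatures are assumptions.

```python
import numpy as np

def master_feedback_torque(tau_slave_ext, jacobian_master, f_virtual, k_reflect=1.0):
    """Joint torque command for a 7-DoF master arm.

    tau_slave_ext   : (7,) external joint torques measured at the slave (layer 1)
    jacobian_master : (6, 7) geometric Jacobian of the master at its current pose
    f_virtual       : (6,) virtual Cartesian wrench (guidance/repulsion, layer 2)
    k_reflect       : scalar gain on the reflected contact torques
    """
    tau_contact = k_reflect * tau_slave_ext      # layer 1: reflect slave contact torques
    tau_virtual = jacobian_master.T @ f_virtual  # layer 2: map virtual wrench to joints
    return tau_contact + tau_virtual
```

With `f_virtual` zeroed this degenerates to pure torque coupling, which is the variant the abstract reports as matching the slave torque readings at the master side.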

    Conveying intentions through haptics in human-computer collaboration

    Haptics has been used as a natural way for humans to communicate with computers in collaborative virtual environments. Human-computer collaboration is typically achieved by sharing control of the task between a human and a computer operator. An important research challenge in the field addresses the need to realize intention recognition and response, which involves a decision-making process between the partners. In an earlier study, we implemented a dynamic role exchange mechanism, which realizes decision making by trading the parties' control levels on the task. This mechanism showed promise of a more intuitive and comfortable communication. Here, we extend our earlier work to further investigate the utility of a role exchange mechanism in dynamic collaboration tasks. An experiment with 30 participants was conducted to compare the utility of a role exchange mechanism with that of a shared control scheme where the human and the computer share control equally at all times. A no-guidance condition is considered as a base case to present the benefits of these two guidance schemes more clearly. Our experiments show that the role exchange scheme maximizes the efficiency of the user, defined as the ratio of the work done by the user within the task to the energy spent by her. Furthermore, we explored the added benefits of explicitly displaying the control state by embedding visual and vibrotactile sensory cues on top of the role exchange scheme. We observed that such cues decrease performance slightly, probably because they introduce an extra cognitive load, yet they improve the users' sense of collaboration and interaction with the computer. These cues also create a stronger sense of trust for the user towards her partner's control over the task.
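The efficiency metric above is defined as work done on the task divided by energy spent. A discrete-time sketch from sampled 1-D force and velocity signals (the sampling scheme and sign convention are assumptions; the abstract gives only the ratio):

```python
def user_efficiency(forces, velocities, dt):
    """Efficiency = useful work done on the task / total energy spent by the user.

    Work accumulates only power delivered in the direction of motion;
    energy accumulates the magnitude of all mechanical power exchanged.
    """
    work = 0.0
    energy = 0.0
    for f, v in zip(forces, velocities):
        p = f * v                  # instantaneous power (1-D for simplicity)
        work += max(p, 0.0) * dt   # positive work drives the task forward
        energy += abs(p) * dt      # all effort counts as energy spent
    return work / energy if energy > 0 else 0.0
```

Under this convention, a user who always pushes with the motion scores 1.0, while a user who fights the motion half the time scores 0.5, which matches the intuition that role exchange raises efficiency by reducing wasted opposing effort.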

    Material Sight: A Sensorium for Fundamental Physics

    Often our attempts to connect to the spatial and temporal scales of fundamental physics - from the subatomic to the multiverse - provoke a form of perceptual vertigo, especially for non-scientists. When we approach ideas of paralysing abstraction through the perceptual range of our sensing bodies, a ‘phenomenological dissonance’ can be said to be invoked, between material presence and radical remoteness. This relational dynamic, between materiality and remoteness, formed the conceptual springboard for 'Material Sight' (2016-2018), a research project based at three world-leading facilities for fundamental physics that brought to fruition a body of photographic objects, film works, and an immersive soundscape re-presenting the spaces of fundamental physics as sites of material encounter. The research was premised on a paradoxical desire to create a sensorium for fundamental physics, asking whether photography, film, and sound can embody the spaces of experimental science and present them back to scientists and non-scientists alike, not as illustrations of the technical sublime but as sites of phenomenological encounter. This article plots the key conceptual coordinates of 'Material Sight' and looks at how the project’s methodological design – essentially the production of knowledge through the 'act of looking' – emphatically resisted the gravitational pull of art to be instrumentalised as an illustrative device within scientific contexts.

    Haptic role allocation and intention negotiation in human-robot collaboration

    This dissertation presents a perspective on building more natural shared control systems for physical human-robot cooperation. As tasks become more complex and more dynamic, many shared control schemes fail to meet the expectation of an effortless interaction that resembles human-human sensory communication. Since such systems are mainly built to improve task performance, the richness of sensory communication is of secondary concern. We suggest that effective cooperation can be achieved when the human’s and the robot’s roles within the task are dynamically updated during the execution of the task. These roles define states for the system, in which the robot’s control leads or follows the human’s actions. In such a system, a state transition can occur when the robot determines the user’s intention to gain or relinquish control. Specifically, with these state transitions we assign certain roles to the human and the robot. We believe that only by equipping the robot with tools to change its behavior during collaboration can we improve the collaboration experience. We explore how human-robot cooperation in virtual and physical worlds can be improved using a force-based role-exchange mechanism. Our findings indicate that the proposed role exchange framework is beneficial in the sense that it can improve task performance and the efficiency of the partners during the task, and decrease the energy requirement of the human. Moreover, the results imply that the subjective acceptability of the proposed model is attained only when role exchanges are performed in a smooth and transparent fashion. Finally, we illustrate that adding extra sensory cues on top of a role exchange scheme is useful for improving the sense of interaction during the task, as well as for making the system more comfortable and easier to use, and the task more enjoyable.

    Decision-making model for adaptive impedance control of teleoperation systems

    © 2008-2011 IEEE. This paper presents a haptic assistance strategy for teleoperation that makes a task- and situation-specific compromise between improving tracking performance and improving human-machine interaction in partially structured environments via the scheduling of the parameters of an admittance controller. The proposed assistance strategy builds on decision-making models and combines one of them with impedance control techniques that are standard in bilateral teleoperation systems. Even though several decision-making models have been proposed in cognitive science, their application to assisted teleoperation and assisted robotics has hardly been explored. Experimental data supports the Drift-Diffusion model as a suitable scheduling strategy for haptic shared control, in which the assistance mechanism can be adapted via the parameters of reward functions. Guidelines to tune the decision-making model are presented. The influence of the reward structure on the realized haptic assistance is evaluated in a user study, and results are compared to the no-assistance and human-assistance cases.
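The Drift-Diffusion model mentioned above accumulates noisy evidence until one of two decision thresholds is crossed; used as a scheduler, crossing a threshold commits the controller to one assistance mode. A minimal simulation sketch (parameter values and the mapping to assistance modes are assumptions, not the paper's tuning):

```python
import random

def ddm_decide(drift, threshold, dt=0.001, noise=1.0, seed=0, max_steps=100000):
    """Two-choice drift-diffusion race.

    Returns (choice, decision_time): choice is +1 or -1 for the upper or
    lower threshold, or 0 if no threshold is crossed within max_steps.
    """
    rng = random.Random(seed)
    x, t = 0.0, 0.0
    for _ in range(max_steps):
        # Euler-Maruyama step: deterministic drift plus Gaussian diffusion
        x += drift * dt + noise * (dt ** 0.5) * rng.gauss(0.0, 1.0)
        t += dt
        if x >= threshold:
            return +1, t
        if x <= -threshold:
            return -1, t
    return 0, t
```

In the assistance context, the drift term would be shaped by the reward functions, so tuning the rewards tunes how readily (and toward which mode, e.g., performance-oriented vs. interaction-oriented assistance) the scheduler commits.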

    Robots Taking Initiative in Collaborative Object Manipulation: Lessons from Physical Human-Human Interaction

    Physical Human-Human Interaction (pHHI) involves the use of multiple sensory modalities. Studies of communication through spoken utterances and gestures are well established; nevertheless, communication through force signals is not well understood. In this paper, we focus on investigating the mechanisms employed by humans during negotiation through force signals, which is an integral part of successful collaboration. Our objective is to use these insights to inform the design of controllers for robot assistants. Specifically, we want to enable robots to take the lead in collaboration. To achieve this goal, we conducted a study to observe how humans behave during collaborative manipulation tasks. During our preliminary data analysis, we discovered several new features that help us better understand how the interaction progresses. From these features, we identified distinct patterns in the data that indicate when a participant is expressing their intent. Our study provides valuable insight into how humans collaborate physically, which can help us design robots that behave more like humans in such scenarios.