Perception of Soft Objects in Virtual Environments Under Conflicting Visual and Haptic Cues
In virtual/augmented/mixed reality (VR/AR/MR) applications, rendering soft virtual objects with a hand-held haptic device is challenging: the anatomical constraints of the hand and the ungrounded nature of the design restrict the choice of actuators and sensors, limiting the resolution and range of forces the device can display. We developed a cable-driven haptic device for rendering the net forces involved in grasping and squeezing 3D virtual compliant (soft) objects held between the index finger and thumb. Using this device, we investigate the perception of soft objects in virtual environments (VEs). We show that the range of object stiffness that can be effectively conveyed to a user in VEs can be significantly expanded by controlling the relationship between the visual and haptic cues. We propose that a single variable, named Apparent Stiffness Difference, can predict the pattern of human stiffness perception under manipulated conflict, and that it can be used to render a wider range of soft objects in VEs than a haptic device alone can achieve within its physical limits.
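The abstract does not define the variable, but one plausible formalization (an assumption for illustration, not taken from the paper) scales the visually displayed deformation relative to the haptically rendered one and compares the stiffness implied by each cue:

```latex
% Hypothetical notation: F is the rendered grasp force, x_h the
% haptically rendered compression, and x_v = c * x_h the visually
% displayed compression (c < 1 makes the object look stiffer than it feels).
\[
k_h = \frac{F}{x_h}, \qquad
k_v = \frac{F}{x_v} = \frac{F}{c\,x_h}, \qquad
\Delta k = k_v - k_h = \frac{F}{x_h}\left(\frac{1}{c} - 1\right)
\]
```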
Intention recognition for dynamic role exchange in haptic collaboration
In human-computer collaboration involving haptics, a key open issue is establishing intuitive communication between the partners. Even though computers are widely used to aid human operators in teleoperation, guidance, and training, their ability to improve efficiency and effectiveness in dynamic tasks is limited because they lack the adaptability, versatility, and awareness of a human. We suggest that the communication between a human and a computer can be improved if it involves a decision-making process in which the computer is programmed to infer the intentions of the human operator and dynamically adjust the control levels of the interacting parties, enabling a more intuitive interaction. In this paper, we investigate the utility of such a dynamic role exchange mechanism, in which partners negotiate through the haptic channel to trade their control levels on a collaborative task. We examine the energy consumption, the work done on the manipulated object, and the joint efficiency, in addition to the task performance. We show that, compared to an equal-control condition, the role exchange mechanism improves task performance and the joint efficiency of the partners. We also show that augmenting the system with additional informative visual and vibrotactile cues, which display the state of the interaction, allows users to become aware of the underlying role exchange mechanism and exploit it in favor of the task. These cues also improve the users' sense of interaction and reinforce their belief that the computer aids with the execution of the task. © 2013 IEEE
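As a rough sketch of the role-exchange idea (a hypothetical stand-in, not the paper's control law; the blending rule, adaptation rate, and names are assumptions):

```python
# Hypothetical role-exchange step: the commanded force blends human and
# computer inputs, and the human's control share alpha is renegotiated
# from a simple measure of agreement between the two partners.

def role_exchange_step(alpha, f_human, f_computer, rate=0.05):
    """Update the human's control level alpha in [0, 1].

    alpha      -- current share of control held by the human
    f_human    -- force currently applied by the human operator
    f_computer -- force the computer controller would apply
    rate       -- adaptation speed of the negotiation
    """
    # Agreement (forces point the same way) -> cede control to the human;
    # conflict (opposing forces) -> the computer gradually takes over.
    if f_human * f_computer >= 0.0:
        alpha = min(1.0, alpha + rate)
    else:
        alpha = max(0.0, alpha - rate)

    # Blended command actually applied to the manipulated object.
    f_command = alpha * f_human + (1.0 - alpha) * f_computer
    return alpha, f_command
```

A continuous blend like this keeps the handover smooth, which is one way the haptic channel itself can signal who is leading at any moment.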
Recognition of Haptic Interaction Patterns in Dyadic Joint Object Manipulation
The development of robots that can physically cooperate with humans has attracted interest in recent decades. This effort requires a deep understanding of the intrinsic properties of interaction. Up to now, many researchers have focused on inferring human intent in terms of intermediate or terminal goals in physical tasks. However, to work side by side with people, an autonomous robot additionally needs in-depth information about the underlying haptic interaction patterns that are typically encountered during human-human cooperation. To our knowledge, no study has yet focused on characterizing such detailed information; in this sense, this work is a pioneering effort to gain a deeper understanding of interaction patterns involving two or more humans in a physical task. We present a labeled human-human interaction dataset, which captures the interaction of two humans who collaboratively transport an object in a haptics-enabled virtual environment. In light of the information gained by studying this dataset, we propose that the actions of cooperating partners can be examined under three interaction types: in any cooperative task, the interacting humans either 1) work in harmony, 2) cope with conflicts, or 3) remain passive during interaction. In line with this conception, we present a taxonomy of human interaction patterns and propose five feature sets, comprising force-, velocity-, and power-related information, for the classification of these patterns. Our evaluation shows that a multi-class support vector machine (SVM) classifier achieves a correct classification rate of 86 percent for the identification of interaction patterns, an accuracy obtained by fusing a selected set of the most informative features via the Minimum Redundancy Maximum Relevance (mRMR) feature selection method.
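A minimal sketch of such a pipeline, assuming scikit-learn and synthetic data in place of the dyadic dataset; mutual-information ranking stands in for mRMR, which scikit-learn does not provide, and the feature counts are arbitrary:

```python
# Feature selection + multi-class SVM, with synthetic stand-in data.
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 40))       # force/velocity/power features (stand-in)
y = rng.integers(0, 3, size=300)     # harmony / conflict / passive labels

clf = make_pipeline(
    StandardScaler(),
    SelectKBest(mutual_info_classif, k=10),        # keep most informative features
    SVC(kernel="rbf", decision_function_shape="ovr"),  # multi-class SVM
)
print(cross_val_score(clf, X, y, cv=5).mean())
```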
Numerical Simulation of Nano Scanning in Intermittent-Contact Mode AFM Under Q Control
We investigate nano scanning in tapping-mode atomic force microscopy (AFM) under quality (Q) control via numerical simulations performed in SIMULINK. Unlike most earlier numerical studies, we focus on simulating the whole scan process rather than only the cantilever dynamics and the force interactions between the probe tip and the surface. This enables us to quantify scan performance under Q control for different scan settings. Using the numerical simulations, we first investigate how the elastic modulus of the sample (relative to the substrate surface) and the probe stiffness affect the scan results. Our simulations show that scanning in the attractive regime using soft cantilevers with a high effective Q factor (Qeff) results in better image quality. We then demonstrate the trade-off in setting Qeff under Q control: low values of Qeff increase the tapping forces, while high values limit the maximum achievable scan speed because the cantilever responds slowly to rapid changes in the surface profile. Finally, we show that higher scan speeds can be achieved without increasing the tapping forces by using adaptive Q control (AQC), in which the Q factor of the probe is changed instantaneously depending on the magnitude of the error signal in oscillation amplitude. The scan performance of AQC is quantitatively compared to that of standard Q control, first using iso-error curves obtained from numerical simulations, and then the results are validated through scan experiments performed on a physical setup.
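To make the Q-control mechanism concrete, here is a minimal point-mass sketch (not the paper's SIMULINK model; all parameter values are assumptions, and tip-sample interaction forces are omitted for brevity) in which velocity feedback sets the effective quality factor:

```python
# Driven cantilever as a point-mass oscillator; velocity feedback with
# gain g shifts the native quality factor Q to a chosen Q_eff.
import numpy as np

f0, k, Q = 300e3, 40.0, 400.0        # resonance [Hz], stiffness [N/m], native Q
w0 = 2.0 * np.pi * f0
m = k / w0**2                        # equivalent point mass
dt = 1.0 / (200.0 * f0)              # 200 time steps per oscillation cycle

def steady_amplitude(q_eff, n_cycles=3000, a_drive=1e-9):
    """Integrate m z'' = F_drive + g v - k z - (m w0 / Q) v, with g chosen
    so the closed loop behaves as if its quality factor were q_eff."""
    g = m * w0 * (1.0 / Q - 1.0 / q_eff)   # velocity-feedback gain
    z = v = amp = 0.0
    n = n_cycles * 200
    for i in range(n):
        f = k * a_drive * np.cos(w0 * i * dt) + g * v
        acc = (f - k * z - (m * w0 / Q) * v) / m
        v += acc * dt                      # semi-implicit Euler step
        z += v * dt
        if i > 0.9 * n:                    # measure after transients decay
            amp = max(amp, abs(z))
    return amp

# Low Q_eff settles quickly but implies harder tapping; high Q_eff taps
# gently but responds slowly. Adaptive Q control would switch q_eff on
# the fly in proportion to the instantaneous amplitude-error signal.
print(steady_amplitude(100.0), steady_amplitude(1000.0))
```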
Building an Open Source Framework for Virtual Medical Training
This paper presents a framework for building medical training applications using virtual reality, together with a tool that assists in the class instantiation of this framework. The main purpose is to make it easier to build virtual reality applications in the medical training area, targeting systems that simulate biopsy exams and providing deformation, collision detection, and stereoscopy functionalities. Instantiating the framework classes allows quick implementation of tools for this purpose, reducing errors and keeping costs low thanks to the use of open source tools. With the instantiation tool, the process of building applications is fast and easy, so programmers can obtain an initial application and adapt it to their needs. The tool allows the user to include, delete, and edit parameters in the chosen functionalities, as well as store these parameters for future use. To verify the effectiveness of the framework, some case studies are presented.
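A purely hypothetical illustration of the instantiation idea (the class names, flags, and methods below are invented for this sketch and do not come from the framework's actual API):

```python
# Invented base class standing in for the framework; subclasses pick
# which functionalities their training application enables.
class MedicalTrainingApp:
    def __init__(self, deformation=False, collision_detection=False,
                 stereoscopy=False):
        self.features = {
            "deformation": deformation,
            "collision_detection": collision_detection,
            "stereoscopy": stereoscopy,
        }

    def run(self):
        enabled = [name for name, on in self.features.items() if on]
        print("Simulation running with:", ", ".join(enabled))

class BiopsySimulator(MedicalTrainingApp):
    """Instantiated application for a biopsy-exam training scenario."""
    def __init__(self):
        super().__init__(deformation=True, collision_detection=True)

BiopsySimulator().run()
```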
Haptography: Capturing and Recreating the Rich Feel of Real Surfaces
Haptic interfaces, which allow a user to touch virtual and remote environments through a hand-held tool, have opened up exciting new possibilities for applications such as computer-aided design and robot-assisted surgery. Unfortunately, the haptic renderings produced by these systems seldom feel like authentic re-creations of the richly varied surfaces one encounters in the real world. We have thus envisioned the new approach of haptography, or haptic photography, in which an individual quickly records a physical interaction with a real surface and then recreates that experience for a user at a different time and/or place. This paper presents an overview of the goals and methods of haptography, emphasizing the importance of accurately capturing and recreating the high-frequency accelerations that occur during tool-mediated interactions. On the capture side, we introduce a new texture modeling and synthesis method based on linear prediction applied to acceleration signals recorded from real tool interactions. On the recreation side, we present a new haptography handle prototype that enables the user of a Phantom Omni to feel fine surface features and textures.
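A compact sketch of the linear-prediction idea, assuming numpy/scipy; the recorded acceleration is a synthetic stand-in and the model order is an arbitrary choice, not the paper's:

```python
# Fit an all-pole (linear prediction) model to a tool-acceleration
# signal, then resynthesize texture vibrations by driving the fitted
# filter with white noise.
import numpy as np
from scipy.signal import lfilter

def lpc(x, order):
    """Autocorrelation-method LPC via the Levinson-Durbin recursion."""
    r = np.correlate(x, x, mode="full")[len(x) - 1:len(x) + order]
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    for i in range(1, order + 1):
        k = -(r[i] + np.dot(a[1:i], r[i - 1:0:-1])) / err
        a[1:i + 1] = a[1:i + 1] + k * a[i - 1::-1]  # reflection update
        err *= 1.0 - k * k
    return a, err

rng = np.random.default_rng(1)
# Stand-in "recording": white noise shaped by a stable AR(2) filter.
accel = lfilter([1.0], [1.0, -0.8, 0.5], rng.normal(size=8000))
a, err = lpc(accel, order=12)

# Synthesis: noise excitation through the fitted all-pole filter 1/A(z),
# with the gain set from the residual energy of the fit.
excitation = rng.normal(size=8000) * np.sqrt(err / len(accel))
texture = lfilter([1.0], a, excitation)
```

The synthesized signal preserves the spectral envelope of the recorded accelerations, which is what makes an LPC model a natural fit for vibration-based texture rendering.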