154 research outputs found

    Recognition of Haptic Interaction Patterns in Dyadic Joint Object Manipulation

    Get PDF
    The development of robots that can physically cooperate with humans has attracted interest in recent decades. This effort obviously requires a deep understanding of the intrinsic properties of interaction. Up to now, many researchers have focused on inferring human intent in terms of intermediate or terminal goals in physical tasks. On the other hand, to work side by side with people, an autonomous robot additionally needs in-depth information about the underlying haptic interaction patterns typically encountered during human-human cooperation. To our knowledge, however, no study has yet focused on characterizing such detailed information. In this sense, this work is a pioneering effort to gain a deeper understanding of interaction patterns involving two or more humans in a physical task. We present a labeled human-human-interaction dataset, which captures the interaction of two humans who collaboratively transport an object in a haptics-enabled virtual environment. In light of the information gained by studying this dataset, we propose that the actions of cooperating partners can be examined under three interaction types: in any cooperative task, the interacting humans either 1) work in harmony, 2) cope with conflicts, or 3) remain passive during interaction. In line with this conception, we present a taxonomy of human interaction patterns, then propose five different feature sets, comprising force-, velocity-, and power-related information, for the classification of these patterns. Our evaluation shows that a multi-class support vector machine (SVM) classifier achieves a correct classification rate of 86 percent for the identification of interaction patterns, an accuracy obtained by fusing a set of the most informative features selected by the Minimum Redundancy Maximum Relevance (mRMR) feature selection method.
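A minimal sketch of the pipeline this abstract describes: greedy mRMR-style feature selection followed by a multi-class SVM. The data here is synthetic and the feature count, class count, and k are illustrative stand-ins, not the paper's actual features or settings.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif, mutual_info_regression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def mrmr_select(X, y, k):
    """Greedily pick k features maximizing relevance minus mean redundancy."""
    relevance = mutual_info_classif(X, y, random_state=0)
    selected = [int(np.argmax(relevance))]
    while len(selected) < k:
        best_j, best_score = None, -np.inf
        for j in range(X.shape[1]):
            if j in selected:
                continue
            # Redundancy: mean mutual information with already-selected features.
            redundancy = np.mean([
                mutual_info_regression(X[:, [j]], X[:, s], random_state=0)[0]
                for s in selected
            ])
            score = relevance[j] - redundancy
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
    return selected

# Synthetic stand-in for the force-, velocity-, and power-related features,
# with three classes (harmony / conflict / passive).
X, y = make_classification(n_samples=300, n_features=20, n_informative=6,
                           n_classes=3, random_state=0)
feats = mrmr_select(X, y, k=6)
Xtr, Xte, ytr, yte = train_test_split(X[:, feats], y, random_state=0)
acc = SVC(kernel="rbf").fit(Xtr, ytr).score(Xte, yte)
```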

    Numerical Simulation of Nano Scanning in Intermittent-Contact Mode AFM under Q control

    Full text link
    We investigate nano scanning in tapping-mode atomic force microscopy (AFM) under quality (Q) control via numerical simulations performed in SIMULINK. We focus on simulating the whole scan process rather than only the cantilever dynamics and the force interactions between the probe tip and the surface, as in most earlier numerical studies. This enables us to quantify the scan performance under Q control for different scan settings. Using the numerical simulations, we first investigate the effect of the sample's elastic modulus (relative to the substrate surface) and the probe stiffness on the scan results. Our simulations show that scanning in the attractive regime using soft cantilevers with high Qeff results in better image quality. We then demonstrate the trade-off in setting the effective Q factor (Qeff) of the probe under Q control: low values of Qeff increase the tapping forces, while higher values limit the maximum achievable scan speed due to the slow response of the cantilever to rapid changes in the surface profile. Finally, we show that it is possible to achieve higher scan speeds without increasing the tapping forces using adaptive Q control (AQC), in which the Q factor of the probe is changed instantaneously depending on the magnitude of the error signal in the oscillation amplitude. The scan performance of AQC is quantitatively compared to that of standard Q control using iso-error curves obtained from numerical simulations, and the results are then validated through scan experiments performed on a physical set-up.
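The Q-control mechanism behind this trade-off can be sketched in a few lines (this is not the SIMULINK model from the abstract): velocity feedback adds to, or subtracts from, the cantilever's intrinsic damping, setting an effective Q factor, and a lower Qeff makes the ring-down (and hence the response to surface changes) faster. Units and gain values below are dimensionless and illustrative.

```python
import math

def effective_q(q0, gain, m=1.0, w0=1.0):
    # Velocity feedback F = -gain * zdot adds to the intrinsic damping m*w0/q0,
    # giving closed-loop Qeff = m*w0 / (m*w0/q0 + gain).
    return m * w0 / (m * w0 / q0 + gain)

def ringdown_amplitude(q, cycles=5, steps_per_cycle=2000, w0=1.0):
    # Free decay of z'' + (w0/q) z' + w0^2 z = 0, integrated with
    # semi-implicit Euler; returns the peak |z| over the last cycle.
    dt = 2 * math.pi / (w0 * steps_per_cycle)
    z, v = 1.0, 0.0
    peak = 0.0
    total = cycles * steps_per_cycle
    for i in range(total):
        v += dt * (-(w0 / q) * v - w0 * w0 * z)
        z += dt * v
        if i >= total - steps_per_cycle:
            peak = max(peak, abs(z))
    return peak

# Positive gain lowers Qeff (faster response, larger tapping forces);
# negative gain raises Qeff (gentler tapping, but slower response).
q_lo = effective_q(q0=100.0, gain=0.05)
q_hi = effective_q(q0=100.0, gain=-0.005)
```

Adaptive Q control, as described in the abstract, amounts to switching this gain on the fly based on the amplitude error signal.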

    Intention recognition for dynamic role exchange in haptic collaboration

    No full text
    In human-computer collaboration involving haptics, a key unsolved issue is establishing intuitive communication between the partners. Even though computers are widely used to aid human operators in teleoperation, guidance, and training, their ability to improve efficiency and effectiveness in dynamic tasks is limited because they lack the adaptability, versatility, and awareness of a human. We suggest that the communication between a human and a computer can be improved if it involves a decision-making process in which the computer is programmed to infer the intentions of the human operator and dynamically adjust the control levels of the interacting parties to facilitate a more intuitive interaction. In this paper, we investigate the utility of such a dynamic role exchange mechanism, in which partners negotiate through the haptic channel to trade their control levels on a collaborative task. We examine the energy consumption, the work done on the manipulated object, and the joint efficiency, in addition to task performance. We show that, compared to an equal-control condition, a role exchange mechanism improves task performance and the joint efficiency of the partners. We also show that augmenting the system with additional informative visual and vibrotactile cues, which display the state of the interaction, allows users to become aware of the underlying role exchange mechanism and to exploit it in favor of the task. These cues also improve the users' sense of interaction and reinforce their belief that the computer aids with the execution of the task. © 2013 IEEE
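The kind of control-level trading the abstract describes can be illustrated with a toy update rule; the intention measure (thresholded human force) and the linear blending below are assumptions for illustration, not the paper's actual controller.

```python
def update_control_share(alpha, human_force, force_threshold=2.0, rate=0.05):
    """alpha in [0, 1] is the human's control level; 1 - alpha is the computer's.

    When the human pushes decisively (|force| above threshold), control is
    gradually ceded to the human; when the human is passive, the computer
    gradually takes over.
    """
    if abs(human_force) > force_threshold:
        return min(1.0, alpha + rate)
    return max(0.0, alpha - rate)

def blended_command(alpha, human_cmd, computer_cmd):
    # The manipulated object sees a weighted combination of both commands.
    return alpha * human_cmd + (1.0 - alpha) * computer_cmd
```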

    Perception of Soft Objects in Virtual Environments Under Conflicting Visual and Haptic Cues

    Get PDF
    In virtual/augmented/mixed reality (VR/AR/MR) applications, rendering soft virtual objects with a hand-held haptic device is challenging due to the anatomical restrictions of the hand and the ungrounded nature of the design, which constrain the choice of actuators and sensors and hence limit the resolution and range of forces the device can display. We developed a cable-driven haptic device for rendering the net forces involved in grasping and squeezing 3D virtual compliant (soft) objects held between the index finger and thumb. Using the proposed device, we investigate the perception of soft objects in virtual environments (VEs). We show that the range of object stiffness that can be effectively conveyed to a user in VEs can be significantly expanded by controlling the relationship between the visual and haptic cues. We propose that a single variable, named Apparent Stiffness Difference, can predict the pattern of human stiffness perception under manipulated conflict, which can be used for rendering a range of soft objects in VEs larger than what a haptic device alone can achieve due to its physical limits.
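One common way to exploit such a visual-haptic conflict can be sketched as follows; the mapping is a generic pseudo-haptic illustration of rendering beyond the device limit, not the paper's Apparent Stiffness Difference model.

```python
def render_squeeze(finger_displacement, k_target, k_device_max):
    """Return (haptic_force, visual_deformation) for a squeezed soft object.

    If the target stiffness exceeds the device limit, the haptic channel is
    capped at k_device_max and the visual deformation is scaled down so the
    object still *looks* as stiff as k_target.
    """
    k_haptic = min(k_target, k_device_max)   # what the device can render
    force = k_haptic * finger_displacement   # what the hand feels
    visual_deformation = force / k_target    # what the eye sees
    return force, visual_deformation

# A 500 N/m object on a device limited to 200 N/m: the object deforms less
# on screen than the fingers close, which reads visually as a stiffer object.
f, d = render_squeeze(finger_displacement=0.01, k_target=500.0, k_device_max=200.0)
```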

    Using Haptics to Convey Cause-and-Effect Relations in Climate Visualization

    Full text link

    A survey of haptics in serious gaming

    Get PDF
    Serious gaming often requires a high level of realism for training and learning purposes. Haptic technology has proven useful in many applications as a perceptual modality complementary to audio and vision, providing a novel user experience that enhances the immersion of virtual reality with a physical control layer. This survey focuses on haptic technology and its applications in serious gaming. Several categories of related applications are listed and discussed in detail, primarily those in which haptics acts as a cognitive aid or as a main component of serious game design. We categorize haptic devices into tactile, force-feedback, and hybrid ones to suit different haptic interfaces, followed by a description of common haptic gadgets in gaming. Haptic modeling methods, in particular available SDKs and libraries for commercial or academic use, are summarized. We also analyze the existing research difficulties and technology bottlenecks in haptics and anticipate future research directions.

    From presence to consciousness through virtual reality

    Get PDF
    Immersive virtual environments can break the deep, everyday connection between where our senses tell us we are and where we are actually located and whom we are with. The concept of 'presence' refers to the phenomenon of behaving and feeling as if we are in the virtual world created by computer displays. In this article, we argue that presence is worthy of study by neuroscientists, and that it might aid the study of perception and consciousness.

    The Role of Simulation Fidelity in Laparoscopic Surgical Training

    Full text link