
    Tactile Roughness Perception of Virtual Gratings by Electrovibration

    Realistic display of tactile textures on touch screens is a major step toward bringing haptic technology to the wide range of consumers who use electronic devices daily. Since texture topography cannot be rendered explicitly by electrovibration on touch screens, it is important to understand how we perceive virtual textures displayed by friction modulation via electrovibration. We investigated the roughness perception of real gratings made of plexiglass and of virtual gratings displayed by electrovibration through a touch screen for comparison. In particular, we conducted two psychophysical experiments with 10 participants to investigate the effect of spatial period and of the normal force applied by the finger on the roughness perception of macro-size real and virtual gratings. We also recorded the contact forces acting on the participants' fingers during the experiments. The results showed that the roughness perception of real and virtual gratings differs. We argue that this difference can be explained by the amount of fingerpad penetration into the gratings. For real gratings, penetration increased the tangential forces acting on the finger, whereas for virtual ones, where skin penetration is absent, tangential forces decreased with spatial period. Supporting our claim, we also found that increasing the normal force increases the perceived roughness of real gratings while having the opposite effect for virtual gratings. These results are consistent with the tangential force profiles recorded for both real and virtual gratings. In particular, the rate of change in tangential force (dF_t/dt) as a function of spatial period and normal force followed trends similar to those obtained for the roughness estimates of real and virtual gratings, suggesting that it is a better indicator of perceived roughness than the tangential force magnitude.
    Comment: Manuscript received June 25, 2019; revised November 15, 2019; accepted December 11, 201
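The indicator the abstract highlights, the rate of change of tangential force dF_t/dt, can be sketched numerically. This is a minimal illustration assuming uniformly sampled force data; the sampling interval and the force samples used below are illustrative, not values from the paper.

```python
def tangential_force_rate(ft, dt):
    """Forward-difference estimate of dF_t/dt for a uniformly sampled signal.

    ft: list of tangential force samples (N); dt: sampling interval (s).
    """
    return [(b - a) / dt for a, b in zip(ft, ft[1:])]

def mean_abs_rate(ft, dt):
    """Scalar summary of dF_t/dt, usable as a simple roughness indicator."""
    rates = tangential_force_rate(ft, dt)
    return sum(abs(r) for r in rates) / len(rates)
```

A scalar such as the mean absolute rate could then be compared against roughness estimates, as an alternative to the raw tangential force magnitude.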

    Predictive input delay compensation for motion control systems

    This paper presents an analytical approach for predicting future motion, to be used for input delay compensation in time-delayed motion control systems. The method uses the current and previous input values given to a nominally behaving system in order to predict the future motion of that system. The future input is generated through an integration realized in a discrete-time setting. Once the future input signal is created, it is used as the reference input of the remote system so that the input time-delayed system conducts delay-free motion. Following the theoretical formulation, the proposed method is tested in experiments and the validity of the approach is verified.
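The idea of generating a future input from current and previous inputs can be sketched as follows. The paper's predictor is built on discrete-time integration of the nominal system's inputs; the simple linear-extrapolation rule used here is a stand-in assumption, not the authors' exact formulation.

```python
def predict_input(u_hist, h):
    """Predict the input h samples ahead from the input history.

    u_hist: past input samples, most recent last (needs at least two).
    Linear extrapolation from the two latest samples (an assumption).
    """
    u_prev, u_curr = u_hist[-2], u_hist[-1]
    return u_curr + h * (u_curr - u_prev)
```

The predicted input would then serve as the reference input of the remote, time-delayed system so that its motion appears delay-free.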

    Recognition of Haptic Interaction Patterns in Dyadic Joint Object Manipulation

    The development of robots that can physically cooperate with humans has attracted interest in recent decades. Obviously, this effort requires a deep understanding of the intrinsic properties of interaction. Up to now, many researchers have focused on inferring human intent in terms of intermediate or terminal goals in physical tasks. On the other hand, to work side by side with people, an autonomous robot additionally needs in-depth information about the haptic interaction patterns typically encountered during human-human cooperation. To our knowledge, however, no study has yet focused on characterizing such detailed information. In this sense, this work is pioneering as an effort to gain a deeper understanding of interaction patterns involving two or more humans in a physical task. We present a labeled human-human-interaction dataset, which captures the interaction of two humans who collaboratively transport an object in a haptics-enabled virtual environment. In light of the information gained by studying this dataset, we propose that the actions of cooperating partners can be examined under three interaction types: in any cooperative task, the interacting humans either 1) work in harmony, 2) cope with conflicts, or 3) remain passive during interaction. In line with this conception, we present a taxonomy of human interaction patterns, then propose five different feature sets, comprising force-, velocity-, and power-related information, for the classification of these patterns. Our evaluation shows that, using a multi-class support vector machine (SVM) classifier, we can achieve a correct classification rate of 86 percent for the identification of interaction patterns, an accuracy obtained by fusing a selected set of the most informative features via the Minimum Redundancy Maximum Relevance (mRMR) feature selection method.
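The classification step above can be illustrated with a dependency-free stand-in. The study uses a multi-class SVM over mRMR-selected force-, velocity-, and power-related features; the nearest-centroid rule, the two-dimensional feature space, and the centroid values below are illustrative assumptions only.

```python
# Hypothetical class centroids in a (mean interaction power, mean speed)
# feature space -- assumed values, not taken from the dataset.
CENTROIDS = {
    "harmony":  (1.0, 1.0),
    "conflict": (-1.0, 0.5),
    "passive":  (0.0, 0.0),
}

def interaction_power(force, velocity):
    """Power-related feature: force applied along the motion direction."""
    return force * velocity

def classify_pattern(features, centroids=CENTROIDS):
    """Assign the interaction pattern whose centroid is nearest (stand-in
    for the paper's multi-class SVM)."""
    def dist2(c):
        return sum((x - y) ** 2 for x, y in zip(features, c))
    return min(centroids, key=lambda label: dist2(centroids[label]))
```

Positive interaction power (partner pushes along the motion) pulls a sample toward "harmony"; strongly negative power toward "conflict".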

    Haptics for the development of fundamental rhythm skills, including multi-limb coordination

    This chapter considers the use of haptics for learning fundamental rhythm skills, including skills that depend on multi-limb coordination. Different sensory modalities have different strengths and weaknesses for the development of skills related to rhythm. For example, vision has low temporal resolution and performs poorly for tracking rhythms in real time, whereas hearing is highly accurate. However, in the case of multi-limbed rhythms, neither hearing nor sight is particularly well suited to communicating exactly which limb does what and when, or how the limbs coordinate. By contrast, haptics can work especially well in this area, by applying haptic signals independently to each limb. We review relevant theories, including embodied interaction and biological entrainment. We present a range of applications of the Haptic Bracelets, which are computer-controlled wireless vibrotactile devices, one attached to each wrist and ankle. Haptic pulses are used to guide users in playing rhythmic patterns that require multi-limb coordination. One immediate aim of the system is to support the development of practical rhythm skills and multi-limb coordination. A longer-term goal is to aid the development of a wider range of fundamental rhythm skills, including recognising, identifying, memorising, retaining, analysing, reproducing, coordinating, modifying and creating rhythms – particularly multi-stream (i.e. polyphonic) rhythmic sequences. Empirical results are presented. We reflect on related work and discuss design issues for using haptics to support rhythm skills. Skills of this kind are essential not just to drummers and percussionists but also to keyboard players, and more generally to all musicians who need a firm grasp of rhythm.
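Applying haptic signals independently to each limb can be sketched as a per-limb pulse schedule. The pattern encoding (one on/off step list per limb) and the beat duration are illustrative assumptions about how the Haptic Bracelets might be sequenced, not the system's actual implementation.

```python
# Hypothetical four-step polyphonic rhythm: 1 = pulse, 0 = rest.
PATTERN = {
    "left_wrist":  [1, 0, 1, 0],
    "right_wrist": [0, 1, 0, 1],
    "left_ankle":  [1, 0, 0, 0],
    "right_ankle": [0, 0, 1, 0],
}

def pulse_times(pattern, beat_s=0.5):
    """Per limb, the times (in seconds) at which its bracelet pulses."""
    return {limb: [i * beat_s for i, hit in enumerate(hits) if hit]
            for limb, hits in pattern.items()}
```

Each limb's schedule could then drive its own vibrotactile device, which is exactly the channel separation that audio and vision cannot provide.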

    Intention recognition for dynamic role exchange in haptic collaboration

    In human-computer collaboration involving haptics, a key issue that remains to be solved is establishing intuitive communication between the partners. Even though computers are widely used to aid human operators in teleoperation, guidance, and training, they lack the adaptability, versatility, and awareness of a human, so their ability to improve efficiency and effectiveness in dynamic tasks is limited. We suggest that the communication between a human and a computer can be improved if it involves a decision-making process in which the computer is programmed to infer the intentions of the human operator and dynamically adjust the control levels of the interacting parties to facilitate a more intuitive interaction setup. In this paper, we investigate the utility of such a dynamic role exchange mechanism, where partners negotiate through the haptic channel to trade their control levels on a collaborative task. We examine the energy consumption, the work done on the manipulated object, and the joint efficiency, in addition to the task performance. We show that, compared to an equal-control condition, a role exchange mechanism improves task performance and the joint efficiency of the partners. We also show that augmenting the system with additional informative visual and vibrotactile cues, which display the state of the interaction, allows users to become aware of the underlying role exchange mechanism and exploit it in favor of the task. These cues also improve the users' sense of interaction and reinforce their belief that the computer aids with the execution of the task. © 2013 IEEE
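Trading control levels based on inferred intention can be sketched as a simple authority-blending rule. The update rate, the blending law, and the intention scale below are illustrative assumptions, not the paper's negotiation scheme.

```python
def update_control_share(alpha, human_intent, rate=0.1):
    """Shift control authority toward the partner showing stronger intent.

    alpha: human's current control level in [0, 1].
    human_intent in [0, 1]: 1 means the human clearly wants the lead.
    """
    alpha += rate * (human_intent - alpha)
    return min(1.0, max(0.0, alpha))

def blended_command(alpha, human_cmd, robot_cmd):
    """Mix the partners' commands according to the current role split."""
    return alpha * human_cmd + (1.0 - alpha) * robot_cmd
```

With alpha near 1 the human leads; with alpha near 0 the computer guides, and the smooth update avoids abrupt role switches during the task.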

    Resolving conflicts during human-robot co-manipulation

    This work is partially funded by UKRI and CHIST-ERA (HEAP: EP/S033718/2; Horizon: EP/T022493/1; TAS Hub: EP/V00784X). This paper proposes a machine learning (ML) approach to detect and resolve motion conflicts that occur between a human and a proactive robot during the execution of a physically collaborative task. We train a random forest classifier to distinguish between harmonious and conflicting human-robot interaction behaviors during object co-manipulation. Kinesthetic information generated through the teamwork is used to describe the interactive quality of the collaboration. As such, we demonstrate that features derived from haptic (force/torque) data are sufficient to classify whether the human and the robot manipulate the object harmoniously or face a conflict. A conflict resolution strategy is implemented so that the robotic partner proactively contributes to the task via online trajectory planning whenever interactive motion patterns are harmonious, and follows the human lead when a conflict is detected. An admittance controller regulates the physical interaction between the human and the robot during the task; this enables the robot to follow the human passively when there is a conflict. An artificial potential field is used to proactively control the robot's motion when the partners work in harmony. An experimental study is designed to create scenarios involving harmonious and conflicting interactions during collaborative manipulation of an object, and to build a dataset to train and test the random forest classifier. The results of the study show that ML can successfully detect conflicts, and that the proposed conflict resolution mechanism reduces human force and effort significantly compared to a passive robot that always follows the human partner and a proactive robot that cannot resolve conflicts. © 2023 Copyright is held by the owner/author(s).
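The detect-then-resolve logic can be sketched minimally. The paper trains a random forest on haptic (force/torque) features; the threshold rule, the internal-force feature, and the gains here are illustrative assumptions standing in for the trained classifier and the potential-field planner.

```python
def internal_force(f_human, f_robot):
    """Opposing force component along the object axis -- a common haptic
    conflict cue (feature choice is an assumption)."""
    return 0.5 * (f_human - f_robot)

def is_conflict(f_human, f_robot, threshold=5.0):
    """Threshold rule standing in for the trained random forest."""
    return abs(internal_force(f_human, f_robot)) > threshold

def robot_velocity(conflict, v_goal, f_human, admittance_gain=0.2):
    """Follow the human via admittance under conflict; otherwise move
    proactively toward the goal (stand-in for the potential field)."""
    return admittance_gain * f_human if conflict else v_goal
```

The switch makes the robot yield (human-led, passive) exactly when partners disagree, and contribute proactively only when the interaction is harmonious.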

    An Optoelectromechanical Tactile Sensor for Detection of Breast Lumps


    Adaptive human force scaling via admittance control for physical human-robot interaction

    Get PDF
    The goal of this article is to design an admittance controller for a robot to adaptively change its contribution to a collaborative manipulation task executed with a human partner, in order to improve task performance. This is achieved by adaptively scaling the human force based on the human's movement intention while paying attention to the requirements of different task phases. In our approach, the human's movement intentions are estimated from the measured human force and the velocity of the manipulated object, and converted to a quantitative value using a fuzzy logic scheme. This value is then utilized as a variable gain in an admittance controller to adaptively adjust the robot's contribution to the task without changing the admittance time constant. We demonstrate the benefits of the proposed approach in a physical human-robot interaction (pHRI) experiment utilizing Fitts' reaching movement task. The results of the experiment show that there is a) an optimum admittance time constant maximizing human force amplification and b) a desirable admittance gain profile which leads to more effective co-manipulation in terms of overall task performance.
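A variable-gain admittance model with a fixed time constant can be sketched as m·dv/dt + b·v = α·f_h, with m = τ·b so that τ = m/b never changes while the gain α scales the human force. All parameter values below are illustrative assumptions; in the paper, α comes from a fuzzy-logic estimate of human movement intention.

```python
def admittance_step(v, f_h, alpha, tau=0.1, b=1.0, dt=0.001):
    """One explicit-Euler step of the variable-gain admittance model.

    v: current robot velocity; f_h: measured human force;
    alpha: variable admittance gain scaling the human force.
    """
    m = tau * b                      # keeps the time constant tau fixed
    dv = (alpha * f_h - b * v) / m   # commanded acceleration
    return v + dt * dv
```

Raising alpha amplifies the human's force (the robot contributes more), while the response speed, set by tau, stays the same, which is the separation the abstract describes.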

    In Contact: Pinching, Squeezing and Twisting for Mediated Social Touch
