Fast and Continuous Foothold Adaptation for Dynamic Locomotion through CNNs
Legged robots can outperform wheeled machines for most navigation tasks
across unknown and rough terrains. For such tasks, visual feedback is a
fundamental asset to provide robots with terrain-awareness. However, robust
dynamic locomotion on difficult terrains with real-time performance guarantees
remains a challenge. We present here a real-time, dynamic foothold adaptation
strategy based on visual feedback. Our method adjusts the landing position of
the feet in a fully reactive manner, using only on-board computers and sensors.
The correction is computed and executed continuously along the swing phase
trajectory of each leg. To efficiently adapt the landing position, we implement
a self-supervised foothold classifier based on a Convolutional Neural Network
(CNN). Our method computes foothold corrections up to 200 times faster than the
full heuristic evaluation. Our goal is to react to visual stimuli from the
environment, bridging the gap between blind reactive locomotion and purely
vision-based planning strategies. We assess the performance of our method on
the dynamic quadruped robot HyQ, executing static and dynamic gaits (at speeds
up to 0.5 m/s) in both simulated and real scenarios; the benefit of safe
foothold adaptation is clearly demonstrated by the overall robot behavior.
Comment: 9 pages, 11 figures. Accepted to RA-L + ICRA 2019, January 2019
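Per the abstract, the classifier is trained in a self-supervised way to replace a slower heuristic evaluation of candidate footholds. The paper's heuristic and network are not given here, so the following Python sketch (with made-up window sizes and thresholds) only illustrates the kind of roughness-based foothold scoring such a network could learn to approximate: shift the nominal landing cell to the nearest cell whose local terrain is flat enough.

```python
import numpy as np

def roughness_map(heightmap, patch=3):
    """Per-cell terrain roughness: std. dev. of heights in a patch window.
    Border cells get infinite roughness (window would leave the map)."""
    h, w = heightmap.shape
    r = patch // 2
    rough = np.full((h, w), np.inf)
    for i in range(r, h - r):
        for j in range(r, w - r):
            rough[i, j] = heightmap[i - r:i + r + 1, j - r:j + r + 1].std()
    return rough

def adapt_foothold(heightmap, nominal, max_shift=2, safe_std=0.02):
    """Move the nominal landing cell to the nearest cell (in grid distance)
    whose local roughness is below `safe_std`; keep the nominal otherwise."""
    rough = roughness_map(heightmap)
    ni, nj = nominal
    best, best_cost = nominal, np.inf
    for di in range(-max_shift, max_shift + 1):
        for dj in range(-max_shift, max_shift + 1):
            i, j = ni + di, nj + dj
            if 0 <= i < heightmap.shape[0] and 0 <= j < heightmap.shape[1]:
                if rough[i, j] < safe_std:
                    cost = di * di + dj * dj  # prefer small corrections
                    if cost < best_cost:
                        best, best_cost = (i, j), cost
    return best
```

On a flat map the nominal foothold is kept; when a height spike sits under the nominal cell, the correction snaps to the closest flat neighbour.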
Virtual Reality Games for Motor Rehabilitation
This paper presents a fuzzy logic based method to track user satisfaction without the need for devices that monitor users' physiological conditions. User satisfaction is key to any product's acceptance, and computer applications and video games offer a unique opportunity to tailor the environment to each user's needs. We have implemented a non-adaptive fuzzy logic model of emotion, based on the emotional component of the Fuzzy Logic Adaptive Model of Emotion (FLAME) proposed by El-Nasr, to estimate player emotion in Unreal Tournament 2004. In this paper we describe the implementation of this system and present the results of one of several play tests. Our research contradicts the current literature, which suggests that physiological measurements are needed: we show that it is possible to estimate user emotion with a software-only method.
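The abstract describes the fuzzy emotion model only at a high level. As an illustration, here is a minimal Mamdani-style fuzzy estimator in Python; the two input signals, the rules, and the membership functions are hypothetical stand-ins, not FLAME's actual rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside (a, c), peak 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def satisfaction(health, damage_rate):
    """Fuzzy estimate of player satisfaction in [0, 1].

    Illustrative rules:
      IF health is high AND damage is low  -> satisfaction high
      IF health is low  OR  damage is high -> satisfaction low
    """
    health_high = tri(health, 0.4, 1.0, 1.6)
    health_low = tri(health, -0.6, 0.0, 0.6)
    damage_low = tri(damage_rate, -0.6, 0.0, 0.6)
    damage_high = tri(damage_rate, 0.4, 1.0, 1.6)

    w_high = min(health_high, damage_low)  # fuzzy AND = min
    w_low = max(health_low, damage_high)   # fuzzy OR = max

    # Weighted-average defuzzification over output centers 0.9 and 0.1.
    if w_high + w_low == 0:
        return 0.5
    return (w_high * 0.9 + w_low * 0.1) / (w_high + w_low)
```

A healthy player taking no damage scores near the "satisfied" center (0.9), while a low-health player under fire scores near 0.1; the real system would feed in game events rather than these two toy signals.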
Advances in Human-Robot Handshaking
The use of social, anthropomorphic robots to support humans in various
industries has been on the rise. During Human-Robot Interaction (HRI),
physically interactive non-verbal behaviour is key for more natural
interactions. Handshaking is one such natural interaction used commonly in many
social contexts. It is one of the first non-verbal interactions which takes
place and should, therefore, be part of the repertoire of a social robot. In
this paper, we explore the existing state of Human-Robot Handshaking and
discuss possible ways forward for such physically interactive behaviours.
Comment: Accepted at the 12th International Conference on Social Robotics
(ICSR 2020). 12 pages, 1 figure
I-BaR: Integrated Balance Rehabilitation Framework
Neurological diseases are observed in approximately one billion people
worldwide. A further increase is foreseen at the global level as a result of
population growth and aging. Individuals with neurological disorders often
experience cognitive, motor, sensory, and lower-extremity dysfunctions.
Balance problems and an increased risk of falling then arise from postural
control deficits caused by deteriorated integration of multi-sensory
information. We propose a novel rehabilitation framework, Integrated Balance
Rehabilitation (I-BaR), to improve the effectiveness of rehabilitation through
objective assessment, individualized therapy, suitability for different
disability levels, an assist-as-needed paradigm, and an integrated
rehabilitation process treated as a whole, i.e., ankle-foot preparation,
balance, and stepping phases. Integrated Balance Rehabilitation allows
patients to improve
their balance ability by providing multi-modal feedback: visual via utilization
of Virtual Reality; vestibular via anteroposterior and mediolateral
perturbations with the robotic platform; and proprioceptive via haptic
feedback.
Comment: 37 pages, 2 figures, journal paper
Wearable Vibrotactile Haptic Device for Stiffness Discrimination during Virtual Interactions
In this paper, we discuss the development of a cost-effective, wireless, wearable vibrotactile haptic device for stiffness perception during interaction with virtual objects. Our experimental setup consists of a haptic device with five vibrotactile actuators and a virtual reality environment built in Unity 3D, integrating the Oculus Rift head-mounted display (HMD) and the Leap Motion controller. The virtual environment captures touch inputs from users. Interaction forces are rendered at 500 Hz and fed back to the wearable setup, stimulating the fingertips with ERM vibrotactile actuators. The amplitude and frequency of vibration are modulated proportionally to the interaction force to simulate the stiffness of a virtual object. A quantitative and qualitative study compares stiffness discrimination of a virtual linear spring under three sensory modalities: visual-only feedback, tactile-only feedback, and their combination. A common psychophysics method, the Two-Alternative Forced Choice (2AFC) approach, is used for quantitative analysis via the Just Noticeable Difference (JND) and the Weber Fraction (WF). According to the psychometric experiment, the average Weber fraction of 0.39 for visual-only feedback improved to 0.25 when tactile feedback was added.
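The abstract does not say how the JND was extracted from the 2AFC responses. A common minimal approach, sketched below in Python with made-up numbers (not the paper's data), is to interpolate the stimulus increment at the 75%-correct point of the psychometric data (chance level in 2AFC is 50%) and divide it by the reference stimulus to get the Weber fraction.

```python
def jnd_from_2afc(deltas, p_correct, threshold=0.75):
    """Stimulus increment at `threshold` proportion correct, by linear
    interpolation between the two bracketing 2AFC data points.
    `deltas` must be sorted ascending with monotonically rising `p_correct`."""
    for k in range(len(deltas) - 1):
        d0, d1 = deltas[k], deltas[k + 1]
        p0, p1 = p_correct[k], p_correct[k + 1]
        if p0 <= threshold <= p1:
            return d0 + (threshold - p0) * (d1 - d0) / (p1 - p0)
    raise ValueError("threshold not bracketed by the data")

def weber_fraction(reference, deltas, p_correct):
    """WF = JND / reference stimulus intensity."""
    return jnd_from_2afc(deltas, p_correct) / reference
```

For example, with a hypothetical 200 N/m reference spring and stiffness increments of 10-70 N/m yielding 55-95% correct responses, the JND falls a little above 43 N/m, giving a Weber fraction of about 0.22. Real studies usually fit a smooth psychometric function (e.g. a cumulative Gaussian) rather than interpolating raw points.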