Pushing in the Dark: A Reactive Pushing Strategy for Mobile Robots Using Tactile Feedback
For mobile robots, navigating cluttered or dynamic environments often
necessitates non-prehensile manipulation, particularly when faced with objects
that are too large, irregular, or fragile to grasp. The unpredictable behavior
and varying physical properties of these objects significantly complicate
manipulation tasks. To address this challenge, this manuscript proposes a novel
Reactive Pushing Strategy. This strategy allows a mobile robot to dynamically
adjust its base movements in real-time to achieve successful pushing maneuvers
towards a target location. Notably, our strategy adapts the robot's motion based
on changes in contact location obtained through a tactile sensor covering the
base, avoiding dependence on assumptions about the object or a model of its
behavior. The effectiveness of the Reactive Pushing Strategy was initially
evaluated in the simulation environment, where it significantly outperformed
the compared baseline approaches. Following this, we validated the proposed
strategy through real-world experiments, demonstrating the robot's capability to
push objects to target points located anywhere in its vicinity.
In both simulation and real-world experiments, the object-specific properties
(shape, mass, friction, inertia) were altered along with the changes in target
locations to assess the robustness of the proposed method comprehensively.
Comment: 8 pages, 7 figures, submitted to IEEE Robotics and Automation Letters; for associated video, see https://youtu.be/IuGxlNe246
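The abstract does not give the control law itself; a minimal sketch of one way such contact-driven adaptation could look is below. All function names, the gains, and the angle convention are our assumptions, not the paper's implementation: the base translates toward the target while a feedback term on the tactile contact bearing steers the robot to keep the contact aligned with the target direction.

```python
import math


def reactive_push_cmd(contact_angle, target_bearing, v_push=0.1, k_align=1.0):
    """Hypothetical reactive pushing command for an omnidirectional base.

    contact_angle:  bearing (rad, robot frame) of the contact point sensed
                    by the tactile cover on the base.
    target_bearing: bearing (rad, robot frame) of the target location.
    Returns (vx, vy, omega): planar linear velocities (m/s) and yaw rate (rad/s).
    """
    # Wrapped angular error between where the object currently touches the
    # base and where it should touch to be pushed straight at the target.
    err = math.atan2(math.sin(target_bearing - contact_angle),
                     math.cos(target_bearing - contact_angle))
    # Translate toward the target; rotate to re-center the contact point.
    vx = v_push * math.cos(target_bearing)
    vy = v_push * math.sin(target_bearing)
    omega = k_align * err
    return vx, vy, omega
```

Because the command depends only on the sensed contact bearing, a sketch like this needs no object shape, mass, or friction model, which is the property the abstract emphasizes.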
Enhancing Human-Robot Collaboration Transportation through Obstacle-Aware Vibrotactile Feedback
Transporting large and heavy objects can benefit from Human-Robot
Collaboration (HRC), increasing the contribution of robots to our daily tasks
and reducing the risk of injuries to the human operator. This approach usually
posits the human collaborator as the leader, while the robot has the follower
role. Hence, it is essential for the leader to be aware of the environmental
situation. However, when transporting a large object, the operator's
situational awareness can be compromised as the object may occlude different
parts of the environment. This paper proposes a novel haptic-based
environmental awareness module for a collaborative transportation framework
that informs the human operator about surrounding obstacles. The robot uses two
LIDARs to detect the obstacles in the surroundings. The warning module alerts
the operator through a haptic belt with four vibrotactile devices that provide
feedback about the location and proximity of the obstacles. By enhancing the
operator's awareness of the surroundings, the proposed module improves the
safety of the human-robot team in co-carrying scenarios by preventing
collisions. Experiments with two non-expert subjects in two different
situations are conducted. The results show that the human partner can
successfully lead the co-transportation system in an unknown environment with
hidden obstacles thanks to the haptic feedback.
Comment: 6 pages, 5 figures; for associated video, see https://youtu.be/UABeGPIIrH
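The abstract describes a belt with four vibrotactile devices conveying obstacle location and proximity. A minimal sketch of one plausible mapping (motor layout, distance threshold, and linear intensity law are our assumptions, not the paper's design):

```python
import math

# Hypothetical belt layout: four motors at front, left, rear, right.
MOTOR_ANGLES = {"front": 0.0, "left": math.pi / 2,
                "rear": math.pi, "right": -math.pi / 2}


def belt_feedback(obstacle_angle, obstacle_dist, warn_dist=2.0):
    """Map a detected obstacle (bearing in rad, distance in m, e.g. from
    LIDAR clustering) to per-motor vibration intensities in [0, 1].

    Intensity grows linearly as the obstacle closes within warn_dist;
    only the motor nearest the obstacle bearing is driven, encoding both
    location (which motor) and proximity (how strongly it vibrates).
    """
    if obstacle_dist >= warn_dist:
        return {name: 0.0 for name in MOTOR_ANGLES}
    intensity = 1.0 - obstacle_dist / warn_dist

    def ang_diff(a, b):
        # Smallest absolute angular difference, wrapped to [0, pi].
        return abs(math.atan2(math.sin(a - b), math.cos(a - b)))

    nearest = min(MOTOR_ANGLES,
                  key=lambda m: ang_diff(MOTOR_ANGLES[m], obstacle_angle))
    return {name: (intensity if name == nearest else 0.0)
            for name in MOTOR_ANGLES}
```

Driving a single motor keeps the cue unambiguous; an alternative design could interpolate intensity between adjacent motors for finer directional resolution.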
Robot-Assisted Navigation for Visually Impaired through Adaptive Impedance and Path Planning
This paper presents a framework to navigate visually impaired people through
unfamiliar environments by means of a mobile manipulator. The Human-Robot
system consists of three key components: a mobile base, a robotic arm, and the
human subject, who is guided by the robotic arm through the physical coupling of
their hand with the cobot's end-effector. These components, receiving a goal from the
user, traverse a collision-free set of waypoints in a coordinated manner, while
avoiding static and dynamic obstacles through an obstacle avoidance unit and a
novel human guidance planner. To this end, we also present a leg-tracking
algorithm that utilizes 2D LiDAR sensors integrated into the mobile base to
monitor the human pose. Additionally, we introduce an adaptive pulling planner
responsible for guiding the individual back to the intended path if they veer
off course. This is achieved by establishing a target arm end-effector position
and dynamically adjusting the impedance parameters in real time through an
impedance tuning unit. To validate the framework, we present a set of
experiments both in laboratory settings with 12 healthy blindfolded subjects
and a proof-of-concept demonstration in a real-world scenario.
Comment: 7 pages, 7 figures, submitted to the IEEE International Conference on Robotics and Automation; for associated video, see https://youtu.be/B94n3QjdnJ
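The adaptive pulling planner adjusts impedance parameters as the person deviates from the path. A minimal sketch of one such scheme, assuming a simple spring-like Cartesian impedance whose stiffness scales with path deviation (the gains, saturation distance, and function names are illustrative, not the paper's values):

```python
def adaptive_pull_stiffness(deviation, k_min=50.0, k_max=400.0, d_max=0.5):
    """Scale Cartesian stiffness (N/m) with path deviation (m).

    Small deviations keep the arm compliant for comfortable guidance;
    larger ones stiffen the coupling so the end-effector pulls the
    person back toward the planned path. Saturates at d_max.
    """
    ratio = min(max(deviation / d_max, 0.0), 1.0)
    return k_min + ratio * (k_max - k_min)


def pulling_force(deviation_vec, stiffness):
    """Spring-like pull toward the reference path, per Cartesian axis."""
    return [-stiffness * d for d in deviation_vec]
```

In a full impedance controller the damping would typically be co-tuned with the stiffness (e.g. to stay critically damped), which this sketch omits.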
Step-Change in Friction under Electrovibration
Rendering tactile effects on a touch screen via electrovibration has many potential applications. However, our knowledge of the tactile perception of friction changes and of the underlying contact mechanics is very limited. In this study, we investigate the tactile perception and the contact mechanics for a step change in friction under electrovibration during relative sliding between a finger and the surface of a capacitive touchscreen. First, we conduct magnitude estimation experiments to investigate the role of normal force and sliding velocity on the perceived tactile intensity for a step increase and a step decrease in friction, called rising friction (RF) and falling friction (FF). To investigate the contact mechanics involved in RF and FF, we then measure the frictional force, the apparent contact area, and the strains acting on the fingerpad during sliding at a constant velocity under three different normal loads, using a custom-made experimental setup. The results show that the participants perceived RF as stronger than FF, and both the normal force and sliding velocity significantly influenced their perception. These results are supported by our mechanical measurements: the relative change in friction, the apparent contact area, and the strain in the sliding direction were all higher for RF than for FF, especially at low normal forces. Taken together, our results suggest that different contact mechanics take place during RF and FF due to the viscoelastic behavior of fingerpad skin, and those differences influence our tactile perception of a step change in friction.
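The analysis compares the relative change in friction across the step for RF and FF. A minimal sketch of how those quantities would be computed from the measured forces (function names are ours; the paper's exact metric may differ):

```python
def coefficient_of_friction(tangential_force, normal_force):
    """mu = F_t / F_n from the measured frictional and normal forces (N)."""
    return tangential_force / normal_force


def relative_friction_change(mu_before, mu_after):
    """Relative change in the friction coefficient across the step.

    Positive for rising friction (RF), negative for falling friction (FF);
    per the abstract, its magnitude was larger for RF, especially at low
    normal forces.
    """
    return (mu_after - mu_before) / mu_before
```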