Design of a Multimodal Fingertip Sensor for Dynamic Manipulation
We introduce a spherical fingertip sensor for dynamic manipulation. It is
based on barometric pressure and time-of-flight proximity sensors and is
low-latency, compact, and physically robust. The sensor uses a trained neural
network to estimate the contact location and three-axis contact forces based on
data from the pressure sensors, which are embedded within the sensor's sphere
of polyurethane rubber. The time-of-flight sensors face in three different
outward directions, and an integrated microcontroller samples each of the
individual sensors at up to 200 Hz. To quantify the effect of system latency on
dynamic manipulation performance, we develop and analyze a metric called the
collision impulse ratio and characterize the end-to-end latency of our new
sensor. We also present experimental demonstrations with the sensor, including
measuring contact transitions, performing coarse mapping, maintaining a contact
force with a moving object, and reacting to avoid collisions.Comment: 6 pages, 2 pages of references, supplementary video at
https://youtu.be/HGSdcW_aans. Submitted to ICRA 202
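The learned mapping described above, from embedded barometric pressure readings to a contact location and a three-axis contact force, can be sketched as a small regression network. This is a minimal illustration only: the taxel count, hidden size, and random (untrained) weights are assumptions, whereas the paper trains its network on labeled contact data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: N_TAXELS pressure readings in, 5 outputs out
# (2-D contact location on the sphere surface + 3-axis contact force).
N_TAXELS, HIDDEN, N_OUT = 8, 32, 5

# One-hidden-layer MLP; in practice the weights would be fit on data
# collected with a force/torque reference testbed.
W1 = rng.normal(0.0, 0.1, (N_TAXELS, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(0.0, 0.1, (HIDDEN, N_OUT))
b2 = np.zeros(N_OUT)

def estimate_contact(pressures):
    """Map raw pressure readings to (location, force) estimates."""
    h = np.tanh(pressures @ W1 + b1)  # nonlinearity captures rubber mechanics
    y = h @ W2 + b2
    return y[:2], y[2:]               # contact location, (fx, fy, fz)

loc, force = estimate_contact(rng.normal(size=N_TAXELS))
```

At 200 Hz per-sensor sampling, such a forward pass is cheap enough to run inside the control loop, which is what makes the low-latency claim plausible.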
Color-Coded Fiber-Optic Tactile Sensor for an Elastomeric Robot Skin
The sense of touch is essential for reliable mapping between the environment
and a robot which interacts physically with objects. Presumably, an artificial
tactile skin would facilitate safe interaction of the robots with the
environment. In this work, we present our color-coded tactile sensor,
incorporating plastic optical fibers (POF), transparent silicone rubber and an
off-the-shelf color camera. Processing electronics are placed away from the
sensing surface to make the sensor robust to harsh environments. Contact
localization is possible because the number of light sources is lower than the
number of POFs observed by the camera. Classical machine learning techniques
and a hierarchical classification scheme were used for contact localization.
Specifically, we generated the mapping from stimulation to sensation of a
robotic perception system using our sensor. We achieved a force sensing range
of up to 18 N, with a force resolution of around 3.6 N and a spatial resolution
of 8 mm. The color-coded tactile sensor is suitable for tactile exploration and
might enable further innovations in robust tactile sensing.
Comment: Presented at ICRA 2019, Montreal
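The hierarchical classification scheme for contact localization can be sketched as a two-stage classifier: a coarse stage picks a skin region, then a per-region stage refines to a cell at the reported 8 mm spatial resolution. The region counts, feature dimension, and nearest-centroid classifier below are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sizes: 4 coarse skin regions, 4 cells per region,
# 6-dimensional color features extracted from the camera image.
N_REGIONS, CELLS_PER_REGION, N_FEAT = 4, 4, 6
region_centroids = rng.normal(size=(N_REGIONS, N_FEAT))
cell_centroids = rng.normal(size=(N_REGIONS, CELLS_PER_REGION, N_FEAT))

def nearest_centroid(x, centroids):
    """Index of the centroid closest to feature vector x."""
    return int(np.argmin(np.linalg.norm(centroids - x, axis=1)))

def localize(color_features):
    """Two-stage localization: coarse region first, then cell in region."""
    region = nearest_centroid(color_features, region_centroids)
    cell = nearest_centroid(color_features, cell_centroids[region])
    return region, cell
```

The hierarchy keeps each stage's class count small, which is the usual reason to prefer it over one flat classifier across all cells.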
Proximity and Visuotactile Point Cloud Fusion for Contact Patches in Extreme Deformation
Equipping robots with the sense of touch is critical to emulating the
capabilities of humans in real world manipulation tasks. Visuotactile sensors
are a popular tactile sensing strategy due to data output compatible with
computer vision algorithms and accurate, high resolution estimates of local
object geometry. However, these sensors struggle to accommodate high
deformations of the sensing surface during object interactions, hindering more
informative contact with cm-scale objects frequently encountered in the real
world. The soft interfaces of visuotactile sensors are often made of
hyperelastic elastomers, whose behavior under extreme deformation is difficult
to simulate quickly and accurately enough to recover tactile information.
Additionally, many
visuotactile sensors that rely on strict internal light conditions or pattern
tracking will fail if the surface is highly deformed. In this work, we propose
an algorithm that fuses proximity and visuotactile point clouds for contact
patch segmentation that is entirely independent from membrane mechanics. This
algorithm exploits the synchronous, high-resolution proximity and visuotactile
modalities enabled by an extremely deformable, selectively transmissive soft
membrane, which uses visible light for visuotactile sensing and infrared light
for proximity depth. We present the hardware design, membrane fabrication, and
evaluation of our contact patch algorithm in low (10%), medium (60%), and high
(100%+) membrane strain states. We compare our algorithm against three
baselines: proximity-only, tactile-only, and a membrane mechanics model. Our
proposed algorithm outperforms all baselines, with an average RMSE on the
contact patch geometry of under 2.8 mm across all strain ranges. We demonstrate
our contact patch algorithm in four applications: varied stiffness membranes,
torque- and shear-induced wrinkling, closed-loop control for whole-body
manipulation, and pose estimation.
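The mechanics-independent fusion idea above can be sketched purely geometrically: a membrane (visuotactile) point is in contact wherever the object surface seen through the infrared proximity modality coincides with it. The toy point clouds and the tolerance value below are assumptions for illustration; the paper's actual segmentation operates on real sensor clouds.

```python
import numpy as np

def contact_patch(tactile_pts, proximity_pts, tol=1.5):
    """Label tactile (membrane) points as in-contact where the nearest
    proximity (object-surface) point lies within tol (mm). Purely
    geometric: no membrane mechanics model is involved."""
    # Brute-force pairwise distances; fine for small illustrative clouds.
    d = np.linalg.norm(
        tactile_pts[:, None, :] - proximity_pts[None, :, :], axis=2)
    return d.min(axis=1) < tol

# Toy scene (mm): a flat membrane line; the object touches only x < 10.
xs = np.arange(0.0, 30.0, 2.0)
tactile = np.stack([xs, np.zeros_like(xs), np.zeros_like(xs)], axis=1)
obj = np.stack([xs, np.zeros_like(xs),
                np.where(xs < 10, 0.0, 8.0)], axis=1)
mask = contact_patch(tactile, obj)  # True where membrane meets object
```

Because only the two point clouds are compared, the same rule applies unchanged at 10%, 60%, or 100%+ membrane strain, which is the point of decoupling segmentation from membrane mechanics.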
Optical Proximity Sensing for Pose Estimation During In-Hand Manipulation
During in-hand manipulation, robots must be able to continuously estimate the
pose of the object in order to generate appropriate control actions. The
performance of algorithms for pose estimation hinges on the robot's sensors
being able to detect discriminative geometric object features, but previous
sensing modalities are unable to make such measurements robustly. The robot's
fingers can occlude the view of environment- or robot-mounted image sensors,
and tactile sensors can only measure at the local areas of contact. Motivated
by fingertip-embedded proximity sensors' robustness to occlusion and ability to
measure beyond the local areas of contact, we present the first evaluation of
proximity sensor based pose estimation for in-hand manipulation. We develop a
novel two-fingered hand with fingertip-embedded optical time-of-flight
proximity sensors as a testbed for pose estimation during planar in-hand
manipulation. Here, the in-hand manipulation task consists of the robot moving
a cylindrical object from one end of its workspace to the other. We
demonstrate, with statistical significance, that proximity-sensor based pose
estimation via particle filtering during in-hand manipulation: a) exhibits 50%
lower average pose error than a tactile-sensor based baseline; b) empowers a
model predictive controller to achieve 30% lower final positioning error
compared to when using tactile-sensor based pose estimates.
Comment: 8 pages, 6 figures
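The particle-filter pose estimation from range measurements can be sketched for the planar cylinder case: each fingertip time-of-flight sensor predicts a range equal to its distance to the cylinder wall, particles are weighted by a Gaussian range likelihood, and resampled. Sensor positions, the cylinder radius, noise scales, and the y >= 0 prior are all assumptions for this sketch, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
RADIUS = 0.02                                      # cylinder radius (m)
SENSORS = np.array([[0.00, 0.00], [0.10, 0.00]])   # assumed ToF positions

def predicted_range(centers, sensor):
    """Expected ToF reading: distance from sensor to the cylinder wall."""
    return np.linalg.norm(centers - sensor, axis=1) - RADIUS

def pf_step(particles, z, sensor, sigma=0.005):
    """Weight particles by range likelihood, then resample."""
    err = predicted_range(particles, sensor) - z
    w = np.exp(-0.5 * (err / sigma) ** 2) + 1e-300  # avoid all-zero weights
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    # Small jitter keeps the resampled particle set diverse.
    return particles[idx] + rng.normal(0.0, 0.001, particles.shape)

# Assumed prior: the cylinder lies in front of the fingertips (y >= 0).
true_center = np.array([0.05, 0.03])
particles = rng.uniform([-0.05, 0.0], [0.15, 0.10], size=(2000, 2))
for _ in range(3):                     # a few measurement sweeps
    for s in SENSORS:
        z = np.linalg.norm(true_center - s) - RADIUS
        particles = pf_step(particles, z, s)
estimate = particles.mean(axis=0)      # posterior mean pose estimate
```

Because each range reading constrains the center to an arc around its sensor, two non-coincident sensors plus the prior are enough to collapse the particle cloud onto the true planar position.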
An Embedded, Multi-Modal Sensor System for Scalable Robotic and Prosthetic Hand Fingers
Grasping and manipulation with anthropomorphic robotic and prosthetic hands presents a scientific challenge regarding mechanical design, sensor system, and control. Apart from the mechanical design of such hands, embedding the sensors needed for closed-loop control of grasping tasks remains a hard problem due to limited space and the required high level of integration of different components. In this paper we present a scalable design model of artificial fingers, which combines mechanical design and embedded electronics with a sophisticated multi-modal sensor system consisting of sensors for sensing normal and shear force, distance, acceleration, temperature, and joint angles. The design is fully parametric, allowing automated scaling of the fingers to arbitrary dimensions in the human hand spectrum. To this end, the electronic parts are composed of interchangeable modules that facilitate the mechanical scaling of the fingers and are fully enclosed by the mechanical parts of the finger. The resulting design model allows deriving freely scalable and multimodally sensorised fingers for robotic and prosthetic hands. Four physical demonstrators are assembled and tested to evaluate the approach.
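The fully parametric scaling idea can be sketched as deriving all finger dimensions from a single target length. The baseline dimensions and the purely linear scaling rule below are hypothetical stand-ins; the paper's CAD model parameterizes many more quantities.

```python
# Hypothetical baseline finger dimensions (mm); all values scale with one
# factor derived from the requested overall finger length.
BASE = {"length_mm": 95.0, "width_mm": 18.0,
        "proximal_mm": 45.0, "middle_mm": 28.0, "distal_mm": 22.0}

def scale_finger(target_length_mm):
    """Derive a dimension set for a finger of the requested length."""
    s = target_length_mm / BASE["length_mm"]
    return {k: round(v * s, 2) for k, v in BASE.items()}

dims = scale_finger(80.0)  # e.g. a finger in the smaller human-hand range
```

Keeping every dimension proportional to one factor is what lets interchangeable electronics modules be matched to any finger size in the human-hand spectrum.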