Pictures in Your Mind: Using Interactive Gesture-Controlled Reliefs to Explore Art
Tactile reliefs offer many benefits over the more classic raised line drawings or tactile diagrams, as depth, 3D shape, and surface textures are directly perceivable. Although often created for blind and visually impaired (BVI) people, a wider range of people may benefit from such multimodal material. However, some reliefs are still difficult to understand without proper guidance or accompanying verbal descriptions, hindering autonomous exploration.
In this work, we present a gesture-controlled interactive audio guide (IAG) based on recent low-cost depth cameras that can be operated directly with the hands on relief surfaces during tactile exploration. The interactively explorable, location-dependent verbal and captioned descriptions promise rapid tactile accessibility to 2.5D spatial information in a home or education setting, through online resources, or as a kiosk installation in public places.
We present a working prototype, discuss design decisions, and present the results of two evaluation studies: the first with 13 BVI test users and the second, a follow-up study, with 14 test users across a wide range of people with differences and difficulties associated with perception, memory, cognition, and communication. The participant-led research method of this latter study prompted new, significant, and innovative developments.
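The guide's core behaviour — mapping a touched point on the relief to a location-dependent description — can be sketched as a simple region lookup. The region names, bounds, and captions below are invented for illustration; the actual system derives the fingertip position from a depth camera, which is omitted here.

```python
# Hypothetical sketch of the location-dependent description lookup: the
# depth camera yields a fingertip position on the relief (normalised to
# [0, 1] x [0, 1]), and the guide returns the caption of whichever
# annotated region contains that point. Regions and captions are made up.

REGIONS = {
    "sky": {"bounds": (0.0, 0.0, 1.0, 0.4), "caption": "A cloudy sky."},
    "tower": {"bounds": (0.4, 0.4, 0.6, 1.0), "caption": "A tall stone tower."},
}

def describe(x, y):
    """Return the caption for the first region containing point (x, y)."""
    for name, region in REGIONS.items():
        x0, y0, x1, y1 = region["bounds"]
        if x0 <= x <= x1 and y0 <= y <= y1:
            return region["caption"]
    return "No description for this area."
```

In the described setup, the returned caption would be spoken aloud (and shown as text), so the same lookup serves both the verbal and the captioned modality.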
Vision- and tactile-based continuous multimodal intention and attention recognition for safer physical human-robot interaction
Employing skin-like tactile sensors on robots enhances both the safety and
usability of collaborative robots by adding the capability to detect human
contact. Unfortunately, simple binary tactile sensors alone cannot determine
the context of the human contact -- whether it is a deliberate interaction or
an unintended collision that requires safety manoeuvres. Many published methods
classify discrete interactions using more advanced tactile sensors or by
analysing joint torques. Instead, we propose to augment the intention
recognition capabilities of simple binary tactile sensors by adding a
robot-mounted camera for human posture analysis. Different interaction
characteristics, including touch location, human pose, and gaze direction, are
used to train a supervised machine learning algorithm to classify whether a
touch is intentional or not with an F1-score of 86%. We demonstrate that
multimodal intention recognition is significantly more accurate than monomodal
analyses with the collaborative robot Baxter. Furthermore, our method can also
continuously monitor interactions that fluidly change between intentional or
unintentional by gauging the user's attention through gaze. If a user stops
paying attention mid-task, the proposed intention and attention recognition
algorithm can activate safety features to prevent unsafe interactions. We also
employ a feature reduction technique that reduces the number of inputs to five
to achieve a more generalized low-dimensional classifier. This simplification
both reduces the amount of training data required and improves real-world
classification accuracy. It also renders the method potentially agnostic to the
robot and touch sensor architectures while achieving a high degree of task
adaptability.Comment: 11 pages, 8 figures, preprint under revie
Tactile sensors for robot handling
First and second generation robots have been used cost effectively in high-volume "fixed" or "hard" automated manufacturing/assembly systems. They are "limited-ability" devices using simple logic elements or primitive sensory feedback. However, in the unstructured environment of most manufacturing plants it is often necessary to locate, identify, orientate and position randomly presented components.
Visual systems have been researched and developed to provide a coarse-resolution outline of objects. More detailed and precise definition of parts is usually obtained by high-resolution tactile sensing arrays. This paper reviews and discusses the current state of the art in tactile sensing.
Under Pressure: Learning to Detect Slip with Barometric Tactile Sensors
Despite the utility of tactile information, tactile sensors have yet to be
widely deployed in industrial robotics settings -- part of the challenge lies
in identifying slip and other key events from the tactile data stream. In this
paper, we present a learning-based method to detect slip using barometric
tactile sensors. Although these sensors have a low resolution, they have many
other desirable properties including high reliability and durability, a very
slim profile, and a low cost. We are able to achieve slip detection accuracies
of greater than 91% while being robust to the speed and direction of the slip
motion. Further, we test our detector on two robot manipulation tasks involving
common household objects and demonstrate successful generalization to
real-world scenarios not seen during training. We show that barometric tactile
sensing technology, combined with data-driven learning, is potentially suitable
for complex manipulation tasks such as slip compensation.
Comment: Submitted to the RoboTac Workshop at the IEEE/RSJ International
Conference on Intelligent Robots and Systems (IROS'21), Prague, Czech
Republic, Sept 27 - Oct 1, 2021
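The paper learns slip events from low-resolution barometric pressure streams. As a heavily simplified, assumption-laden stand-in, slip often shows up as high-frequency fluctuation in the pressure signal, so a windowed-variance threshold already illustrates the shape of the problem (the authors use a learned detector, not this fixed rule):

```python
# Illustrative sketch of slip detection on a barometric pressure stream:
# flag samples whose short-window variance exceeds a threshold. Window
# width, threshold, and the synthetic signal are invented for this sketch.

def window_variance(signal, i, width=8):
    """Variance of the trailing window ending at index i."""
    w = signal[max(0, i - width + 1): i + 1]
    mean = sum(w) / len(w)
    return sum((s - mean) ** 2 for s in w) / len(w)

def detect_slip(signal, threshold=0.5, width=8):
    """Return indices where the windowed variance exceeds the threshold."""
    return [i for i in range(len(signal))
            if window_variance(signal, i, width) > threshold]

# Synthetic stream: a stable grasp (flat) followed by slip (oscillation).
stream = [10.0] * 8 + [10.0 + (-1) ** k * 2.0 for k in range(8)]
events = detect_slip(stream)
```

A learned classifier replaces the hand-picked threshold with a decision boundary fit to labelled slip/no-slip windows, which is what lets the paper's detector stay robust to slip speed and direction.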
Sensory augmentation with distal touch: The tactile helmet project
The Tactile Helmet is designed to augment a wearer's senses with a long range sense of touch. Tactile specialist animals such as rats and mice are capable of rapidly acquiring detailed information about their environment from their whiskers by using task-sensitive strategies. Providing similar information about the nearby environment, in tactile form, to a human operator could prove invaluable for search and rescue operations, or for partially-sighted people. Two key aspects of the Tactile Helmet are sensory augmentation, and active sensing. A haptic display is used to provide the user with ultrasonic range information. This can be interpreted in addition to, rather than instead of, visual or auditory information. Active sensing systems "are purposive and information-seeking sensory systems, involving task specific control of the sensory apparatus" [1]. The integration of an accelerometer allows the device to actively gate the delivery of sensory information to the user, depending on their movement. Here we describe the hardware, sensory transduction and characterisation of the Tactile Helmet device, before outlining potential use cases and benefits of the system. © 2013 Springer-Verlag Berlin Heidelberg
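The abstract's two key mechanisms — mapping ultrasonic range to a haptic display and using an accelerometer to gate delivery by movement — can be sketched together. The intensity mapping and thresholds below are guesses for illustration, not the device's calibrated values:

```python
# Sketch of the Tactile Helmet's movement-gated display: nearer obstacles
# produce stronger vibration, but output is suppressed while the wearer's
# head is (nearly) still. Max range and gate threshold are assumptions.

def vibration_intensity(range_m, max_range=3.0):
    """Map an ultrasonic range reading to a 0..1 vibration intensity."""
    clipped = min(max(range_m, 0.0), max_range)
    return 1.0 - clipped / max_range  # nearer -> stronger

def gated_output(range_m, accel_magnitude, gate_threshold=0.3):
    """Deliver the haptic signal only while the head is moving."""
    if accel_magnitude < gate_threshold:
        return 0.0
    return vibration_intensity(range_m)
```

The gating step is what makes this an active-sensing system in the paper's sense: information delivery is tied to the wearer's own purposive movement rather than streamed continuously.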
The role of HG in the analysis of temporal iteration and interaural correlation