Study on Control Methodology of Compliant Manipulation Utilizing Additional Contact with Environment
Degree system: new; Report number: Kou 3297; Degree type: Doctor of Engineering; Date conferred: 2011/2/25; Waseda University degree number: Shin 560
Automation and robotics for the Space Exploration Initiative: Results from Project Outreach
A total of 52 submissions were received in the Automation and Robotics (A&R) area during Project Outreach. About half of the submissions (24) contained concepts that were judged to have high utility for the Space Exploration Initiative (SEI) and were examined further by the robotics panel; these 24 submissions are analyzed here. Three types of robots were proposed in the high-scoring submissions: structured task robots (STRs), teleoperated robots (TORs), and surface exploration robots. Several advanced TOR control interface technologies were proposed in the submissions. Many A&R concepts or potential standards were presented or alluded to by the submitters, but few specific technologies or systems were suggested.
Understanding of Object Manipulation Actions Using Human Multi-Modal Sensory Data
Object manipulation actions represent an important share of the Activities of Daily Living (ADLs). In this work, we study how to enable service robots to use human multi-modal data to understand object manipulation actions, and how they can recognize such actions when humans perform them during human-robot collaboration tasks. The multi-modal data in this study consist of videos, hand motion data, applied forces as represented by the pressure patterns on the hand, and measurements of finger bending, collected as human subjects performed manipulation actions. We investigate two different approaches. In the first, we show that the multi-modal signal (motion, finger bending, and hand pressure) generated by an action can be decomposed into a set of primitives that can be seen as its building blocks. These primitives are used to define 24 multi-modal primitive features, which can in turn serve as an abstract representation of the multi-modal signal and be employed for action recognition. In the second approach, visual features are extracted from the data using a pre-trained image-classification deep convolutional neural network and then used to train the classifier. We also investigate whether adding data from other modalities produces a statistically significant improvement in classifier performance. We show that both approaches achieve comparable performance, which implies that image-based methods can successfully recognize human actions during human-robot collaboration. On the other hand, for providing training data from which the robot can learn to perform object manipulation actions itself, multi-modal data offers a better alternative.
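As a rough illustration of the first approach above, the sketch below decomposes synchronized motion, finger-bending, and pressure signals into per-sample activity primitives and summarizes them as a histogram feature vector. The thresholding rule and histogram feature are invented for this sketch; they are not the paper's 24 primitive features.

```python
import numpy as np

def primitive_features(motion, bending, pressure, thresh=0.5):
    """Decompose synchronized multi-modal signals into binary activity
    primitives and summarize their co-occurrence as an abstract feature
    vector (hypothetical illustration, not the paper's definition)."""
    channels = np.vstack([motion, bending, pressure])
    active = (np.abs(channels) > thresh).astype(int)  # per-sample on/off primitives
    # Each time step maps to one of 2^3 = 8 primitive combinations;
    # the normalized histogram over these codes is the feature vector.
    codes = active[0] * 4 + active[1] * 2 + active[2]
    hist = np.bincount(codes, minlength=8).astype(float)
    return hist / hist.sum()

# Toy action: oscillating motion, slight finger bending, no pressure.
t = np.linspace(0.0, 1.0, 100)
feats = primitive_features(np.sin(8 * np.pi * t), 0.2 * t, np.zeros_like(t))
```

Fixed-length vectors like `feats` can then be fed to any standard classifier for action recognition, which is the role the primitive features play in the abstract.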
Getting the Ball Rolling: Learning a Dexterous Policy for a Biomimetic Tendon-Driven Hand with Rolling Contact Joints
Biomimetic, dexterous robotic hands have the potential to replicate many of the tasks a human can do and to serve as a general manipulation platform. Recent advances in reinforcement learning (RL) frameworks have achieved remarkable performance in quadrupedal locomotion and dexterous manipulation tasks. Combined with GPU-based, highly parallelized simulations capable of simulating thousands of robots at once, RL-based controllers have become more scalable and approachable. However, to bring RL-trained policies to the real world, we require training frameworks that output policies compatible with physical actuators and sensors, as well as a hardware platform that can be manufactured from accessible materials yet is robust enough to run interactive policies. This work introduces the biomimetic tendon-driven Faive Hand and its system architecture, which uses tendon-driven rolling contact joints to achieve a 3D-printable, robust, high-DoF hand design. We model each element of the hand and integrate it into a GPU simulation environment to train a policy with RL, achieving zero-shot transfer of a dexterous in-hand sphere rotation skill to the physical robot hand. Comment: project website: https://srl-ethz.github.io/get-ball-rolling/ ; video: https://youtu.be/YahsMhqNU8o . Submitted to the 2023 IEEE-RAS International Conference on Humanoid Robots.
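The scalability claim above rests on stepping thousands of environments with one batched call. The toy class below mimics that pattern in NumPy: every "hand" shares a single vectorized update, so the per-step cost barely grows with the number of environments. The dynamics are a placeholder, not the paper's physics simulation.

```python
import numpy as np

class BatchedSphereEnv:
    """Toy vectorized environment: N independent 'hands' each spin a sphere,
    and all N are advanced by one batched numpy call, mimicking how GPU
    simulators step thousands of robots in parallel. Illustrative only;
    the paper uses a full physics simulator, not this 1-DoF model."""

    def __init__(self, n_envs=1024, dt=0.02):
        self.n_envs, self.dt = n_envs, dt
        self.angle = np.zeros(n_envs)   # sphere rotation angle per env
        self.vel = np.zeros(n_envs)     # sphere angular velocity per env

    def step(self, torque):
        # One vectorized update advances every environment at once.
        self.vel += torque * self.dt
        self.angle += self.vel * self.dt
        reward = self.vel               # reward rotation speed (in-hand spinning)
        return self.angle.copy(), reward

env = BatchedSphereEnv(n_envs=4096)
obs, rew = env.step(np.ones(4096))      # apply unit torque in all 4096 envs
```

An RL library would collect `(obs, reward)` batches like these from all environments simultaneously, which is what makes massively parallel policy training tractable.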
Soft manipulators and grippers: A review
Soft robotics is a growing area of research that exploits the compliance and adaptability of soft structures to develop highly adaptive robots for soft interactions. One area in which soft robotics can make a significant impact is the development of soft grippers and manipulators. With an increased demand for automation, robotic systems are required to perform tasks in unstructured and poorly defined environments, conditions to which conventional rigid robots are not well suited. This requires a paradigm shift in the methods and materials used to develop robots so that they can adapt to, and work safely in, human environments. One solution is soft robotics, which enables soft interactions with the surroundings while maintaining the ability to apply significant force. This review paper assesses the current materials and fabrication methods, actuation methods, and sensors used in the development of soft manipulators. The achievements and shortcomings of recent technology in these key areas are evaluated, and the paper concludes with a discussion of the potential impact of soft manipulators on industry and society.
Sensors for Robotic Hands: A Survey of State of the Art
Recent decades have seen significant progress in the field of artificial hands. Most surveys that try to capture the latest developments in this field have focused on the actuation and control systems of these devices. In this paper, our goal is to provide a comprehensive survey of the sensors for artificial hands. To present the evolution of the field, we cover five-year periods starting at the turn of the millennium. For each period, we present the robot hands with a focus on their sensor systems, dividing them into categories such as prosthetics, research devices, and industrial end-effectors. We also cover the sensors developed for robot-hand usage in each era. Finally, the period between 2010 and 2015 introduces the reader to the state of the art and also hints at future directions in sensor development for artificial hands.
Dexterous hand-arm coordinated manipulation using active body-environment contact
Human-symbiotic humanoid robots that can perform tasks dexterously using their hands are needed in our homes, welfare facilities, and other settings. To improve their task performance, we propose a motion control scheme aimed at appropriately coordinated hand and arm motions. By observing human manual tasks, we identified active body-environment contact as a form of human manual skill and devised a motion control scheme based on it. We also analyzed the effectiveness of active body-environment contact in glass-placing and drawer-opening tasks. We validated our motion control scheme through tests on a prototype human-symbiotic humanoid robot.
A Soft Anthropomorphic & Tactile Fingertip for Low-Cost Prosthetic & Robotic Applications
Nowadays, prosthetic and robotic hands have reached remarkable dexterity and grasping capability. However, to provide a proper tactile 'experience', dexterity should be supported by proper sensing of the everyday objects such devices are supposed to manipulate. Here we propose a low-cost anthropomorphic solution for integrating a force sensor within a biologically inspired fingertip. A commercial force-sensing resistor is embedded within a human-like soft fingertip made of silicone: the housing of the sensor, a 3D-printed bay embedded within the fingertip, is analyzed via finite element analysis and optimized to enhance the sensor response. Experiments validate the design and the proposed solution.
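Force-sensing resistors like the one in this fingertip are typically read through a voltage divider, with conductance roughly proportional to applied force. The sketch below shows that readout chain; the divider layout, 10 kΩ pull-down, and calibration constant are assumptions for illustration, not values from the paper.

```python
def fsr_force_estimate(adc_counts, vcc=3.3, r_fixed=10_000.0, adc_max=1023):
    """Convert an ADC reading from a force-sensing resistor (FSR) in a
    voltage-divider circuit into a rough force estimate. Circuit values
    and the calibration constant k are hypothetical, not from the paper."""
    v_out = vcc * adc_counts / adc_max
    if v_out <= 0.0:
        return 0.0                          # no measurable load
    # Divider: v_out = vcc * r_fixed / (r_fsr + r_fixed)  ->  solve for r_fsr.
    r_fsr = max(r_fixed * (vcc - v_out) / v_out, 1.0)
    # FSR conductance (1/R) is approximately proportional to force;
    # k is a hypothetical calibration constant fitted per sensor.
    k = 80.0
    return k / r_fsr * 1_000.0              # approximate force in newtons
```

In a soft silicone fingertip, such a calibration would be refined empirically, since the elastomer redistributes contact pressure over the sensing area.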