    A Continuous Grasp Representation for the Imitation Learning of Grasps on Humanoid Robots

    Models and methods are presented which enable a humanoid robot to learn reusable, adaptive grasping skills. Mechanisms and principles in human grasp behavior are studied. The findings are used to develop a grasp representation capable of retaining specific motion characteristics and of adapting to different objects and tasks. Based on this representation, a framework is proposed which enables the robot to observe human grasping, learn grasp representations, and infer executable grasping actions.
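
    As a purely illustrative sketch rather than the thesis's actual models, the Python below arranges the three stages named in the abstract (observing human grasping, learning a reusable representation, adapting and executing it) as a hypothetical pipeline; all names and data structures are placeholders.

```python
# Hypothetical pipeline illustrating the three stages named in the abstract;
# the thesis's actual grasp representation and learning method are not shown.
from dataclasses import dataclass, field

@dataclass
class GraspRepresentation:
    """Reusable grasp description: retains motion characteristics, adapts later."""
    hand_trajectory: list = field(default_factory=list)
    contact_points: list = field(default_factory=list)

def learn_from_observation(demonstration_frames):
    """Observe a human grasp demonstration and learn a representation from it."""
    return GraspRepresentation(hand_trajectory=list(demonstration_frames))

def adapt(representation, object_points):
    """Adapt the learned representation to a new object or task."""
    return GraspRepresentation(
        hand_trajectory=representation.hand_trajectory,
        contact_points=list(object_points),
    )

def execute(robot, representation):
    """Turn the adapted representation into executable robot motion."""
    for pose in representation.hand_trajectory:
        robot.move_hand_to(pose)   # hypothetical robot interface
```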

    Human Inspired Multi-Modal Robot Touch

    Design of a 3D-printed soft robotic hand with distributed tactile sensing for multi-grasp object identification

    Tactile object identification is essential in environments where vision is occluded or when intrinsic object properties such as weight or stiffness must be discriminated. The robotic approach to this task has traditionally been to use rigid-bodied robots equipped with complex control schemes to explore different objects. However, whilst varying degrees of success have been demonstrated, these approaches are limited in their generalisability due to the complexity of the control schemes required to facilitate safe interactions with diverse objects. In this regard, Soft Robotics has garnered increased attention in the past decade due to the ability to exploit Morphological Computation through the agent's body, simplifying the task by conforming naturally to the geometry of the objects being explored. This represents a paradigm shift in the design of robots, since Soft Robotics seeks to take inspiration from biological solutions and embody adaptability in order to interact with the environment rather than relying on centralised computation. In this thesis, we formulate, simplify, and solve an object identification task using Soft Robotic principles. We design an anthropomorphic hand with a human-like range of motion and compliance in both actuation and sensing. The range of motion is validated through the Feix GRASP taxonomy and the Kapandji Thumb Opposition test. The hand is monolithically fabricated using multi-material 3D printing to enable the exploitation of different material properties within the same body and to limit variability between samples. The hand's compliance facilitates adaptable grasping of a wide range of objects, and the hand features integrated distributed tactile sensing. We emulate the human approach of integrating information from multiple contacts and grasps of objects to discriminate between them. Two bespoke neural networks are designed to extract patterns from both the tactile data and the relationships between grasps to facilitate high classification accuracy.
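
    The final sentences describe two bespoke networks, one operating on the tactile data of a single grasp and one on the relationships between grasps. The sketch below is a rough illustration under assumed parameters (a flat 64-taxel reading per grasp, mean-pooling across grasps, ten object classes); it is not the thesis's architecture.

```python
# Minimal two-stage sketch: a per-grasp tactile encoder plus a cross-grasp
# aggregator that pools several grasps of the same object before classifying.
# Taxel count, grasp count, and class count are assumptions, not the thesis's.
import torch
import torch.nn as nn

NUM_TAXELS = 64      # assumed number of distributed tactile elements
NUM_CLASSES = 10     # assumed number of object classes

class GraspEncoder(nn.Module):
    """Encodes the taxel readings of a single grasp into a feature vector."""
    def __init__(self, dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NUM_TAXELS, 256), nn.ReLU(),
            nn.Linear(256, dim), nn.ReLU(),
        )
    def forward(self, taxels):               # taxels: (batch, NUM_TAXELS)
        return self.net(taxels)

class MultiGraspClassifier(nn.Module):
    """Pools features from several grasps of one object and classifies it."""
    def __init__(self, dim=128):
        super().__init__()
        self.encoder = GraspEncoder(dim)
        self.head = nn.Linear(dim, NUM_CLASSES)
    def forward(self, grasps):               # grasps: (batch, n_grasps, NUM_TAXELS)
        feats = self.encoder(grasps)          # encode each grasp independently
        pooled = feats.mean(dim=1)            # combine information across grasps
        return self.head(pooled)

logits = MultiGraspClassifier()(torch.rand(4, 5, NUM_TAXELS))  # 4 objects, 5 grasps each
```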

    Sensitive manipulation

    Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2007. This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections. Includes bibliographical references (p. 161-172). This thesis presents an effective alternative to the traditional approach to robotic manipulation. In our approach, manipulation is mainly guided by tactile feedback as opposed to vision. The motivation comes from the fact that manipulating an object implies coming in contact with it; consequently, directly sensing physical contact seems more important than vision for controlling the interaction between the object and the robot. In this work, the traditional approach of a highly precise arm and vision system controlled by a model-based architecture is replaced by one that uses a low mechanical impedance arm with dense tactile sensing and exploration capabilities run by a behavior-based architecture. The robot OBRERO has been built to implement this approach. New tactile sensing technology has been developed and mounted on the robot's hand. These sensors are biologically inspired and present more adequate features for manipulation than state-of-the-art tactile sensors. The robot's limb was built with compliant actuators, which present low mechanical impedance, to make the interaction between the robot and the environment safer than that of a traditional high-stiffness arm. A new actuator was created to fit within the hand's size constraints. The reduced precision of OBRERO's limb is compensated by the capability of exploration given by the tactile sensors, actuators, and the software architecture. The success of this approach is shown by picking up objects in an unmodelled environment. This task, simple for humans, has been a challenge for robots. The robot can deal with new, unmodelled objects. OBRERO can come gently into contact, explore, lift, and place the object in a different location. It can also detect slippage and external forces acting on an object while it is held. Each one of these steps is done using tactile feedback. This task can be done with very light objects with no fixtures and on slippery surfaces. By Eduardo Rafael Torres Jara, Ph.D.
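
    As a loose illustration of the tactile-driven sequence described (gentle contact, grasp, lift, slip detection), the sketch below strings those behaviors together around hypothetical sensor and actuator calls; none of the interfaces or thresholds are OBRERO's.

```python
# Illustrative sketch only: a simplified tactile-guided pick-up sequence in the
# spirit of the behavior-based approach described above. The robot interface
# (read_taxels, lower_arm, close_fingers, lift, ...) is a hypothetical placeholder.
CONTACT_THRESHOLD = 0.05   # assumed normalized taxel reading indicating first contact
SLIP_THRESHOLD = 0.2       # assumed sample-to-sample taxel change indicating slip

def detect_contact(taxels):
    return max(taxels) > CONTACT_THRESHOLD

def detect_slip(prev_taxels, taxels):
    # Slip shows up as rapid change of the tactile image between samples.
    return max(abs(a - b) for a, b in zip(taxels, prev_taxels)) > SLIP_THRESHOLD

def pick_up(robot):
    # Approach until contact is felt rather than relying on a precise object model.
    while not detect_contact(robot.read_taxels()):
        robot.lower_arm(step=0.002)          # gentle, compliant descent
    robot.close_fingers(until_contact=True)  # grasp conforms to the unmodelled shape
    prev = robot.read_taxels()
    robot.lift(height=0.05)
    while robot.is_lifting():
        taxels = robot.read_taxels()
        if detect_slip(prev, taxels):
            robot.tighten_grip(step=0.01)    # react to slippage with more grip force
        prev = taxels
    robot.place_at(robot.target_location)
```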

    Design, Fabrication, and Control of an Upper Arm Exoskeleton Assistive Robot

    Stroke is the primary cause of permanent impairment and neurological damage in the United States and Europe. Annually, about fifteen million individuals worldwide suffer a stroke, which kills about one third of them. For many years, it was believed that major recovery could be achieved only in the first six months after a stroke. More recent research has demonstrated that even many years after a stroke, significant improvement is not out of reach. However, economic pressures, the aging population, and a lack of specialists and available human resources can interrupt therapy, which impedes full recovery of patients after being discharged from hospital following initial rehabilitation. Robotic devices, and in particular portable robots that provide rehabilitation therapy at home and in clinics, are a novel way not only to optimize the cost of therapy but also to let more patients benefit from rehabilitation for a longer time. Robots used for such purposes should be smaller, lighter, and more affordable than the robots currently used in clinics and hospitals. Common human-machine interaction design criteria such as work envelopes, safety, comfort, adaptability, space limitations, and weight-to-force ratio must still be taken into consideration. In this work a light, wearable, affordable assistive robot was designed and a controller to assist with an activity of daily life (ADL) was developed. The mechanical design targeted the group of society most vulnerable to stroke, based on the average size and age of the patients, with adjustability to accommodate a variety of individuals. The novel mechanical design avoids motion singularities and provides a large workspace for various ADLs. Unlike similar exoskeleton robots, the actuators are placed on the patient's torso and the force is transmitted through a Bowden cable mechanism. Since the actuators' mass does not affect the motion of the upper extremities, the robot can be more agile and more powerful. A compact, novel actuation method with a high power-to-weight ratio, the twisted string actuation method, was used. Part of the research involved the selection and testing of several string compositions and configurations to compare their suitability and to characterize their performance. Feedback sensor count and type have been carefully considered to keep the cost of the system as low as possible. A master-slave controller was designed and its performance in tracking the targeted ADL trajectory was evaluated for one degree of freedom (DOF). An outline of proposed future research is also presented.
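
    For reference, twisted string actuators are commonly modelled with a simple helix kinematic: a string of length L and radius r twisted by a motor angle theta contracts by L - sqrt(L^2 - (theta*r)^2). The sketch below implements that textbook relation with illustrative parameters; the thesis's own characterization of its string compositions may differ.

```python
# Minimal sketch of the helix-based kinematic model often used for twisted
# string actuators; parameter values below are illustrative assumptions.
import math

def twisted_string_contraction(theta, string_length, string_radius):
    """Contraction (m) of a twisted string for a motor rotation theta (rad).

    Model: the twisted string forms a helix, so the load-side length is
    p = sqrt(L^2 - (theta * r)^2) and the contraction is L - p.
    Valid while theta * r < L (before the string is fully wound).
    """
    L, r = string_length, string_radius
    if theta * r >= L:
        raise ValueError("rotation exceeds the model's valid range")
    return L - math.sqrt(L**2 - (theta * r)**2)

# Example: a 0.30 m string of 0.6 mm radius, twisted by 40 full motor turns.
delta = twisted_string_contraction(40 * 2 * math.pi, 0.30, 0.0006)
print(f"contraction ≈ {delta * 1000:.1f} mm")
```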

    Bringing the Physical to the Digital

    This dissertation describes an exploration of digital tabletop interaction styles, with the ultimate goal of informing the design of a new model for tabletop interaction. In the context of this thesis, the term digital tabletop refers to an emerging class of devices that afford many novel ways of interacting with the digital, allowing users to directly touch information presented on large, horizontal displays. As this is a relatively young field, many developments are in flux; hardware and software change at a fast pace, and many interesting alternative approaches are available at the same time. In our research we are especially interested in systems that are capable of sensing multiple contacts (e.g., fingers) and richer information such as the outline of whole hands or other physical objects. New sensor hardware enables new ways to interact with the digital. When embarking on the research for this thesis, the question of which interaction styles would be appropriate for this new class of devices was open, with many equally promising answers. Many everyday activities rely on our hands' ability to skillfully control and manipulate physical objects. We seek to open up different ways of exploiting our manual dexterity and to provide users with richer interaction possibilities. This could be achieved through the use of physical objects as input mediators or through virtual interfaces that behave in a more realistic fashion. In order to gain a better understanding of the underlying design space, we chose an approach organized into two phases. First, two different prototypes, each representing a specific interaction style – namely gesture-based interaction and tangible interaction – were implemented. The flexibility of use afforded by the interface and the level of physicality afforded by the interface elements are introduced as criteria for evaluation. Each approach's suitability to support the highly dynamic and often unstructured interactions typical of digital tabletops is analyzed based on these criteria. In a second phase, the lessons from these initial explorations are applied to inform the design of a novel model for digital tabletop interaction. This model is based on the combination of rich multi-touch sensing and a three-dimensional environment enriched by a gaming physics simulation. The proposed approach enables users to interact with the virtual through richer quantities such as collision and friction, enabling a variety of fine-grained interactions using multiple fingers, whole hands, and physical objects. Our model makes digital tabletop interaction even more “natural”. However, because the interaction – the sensed input and the displayed output – is still bound to the surface, there is a fundamental limitation in manipulating objects using the third dimension. To address this issue, we present a technique that allows users to – conceptually – pick objects off the surface and control their position in 3D. Our goal has been to define a technique that completes our model for on-surface interaction and allows for “as-direct-as-possible” interactions. We also present two hardware prototypes capable of sensing the users' interactions beyond the table's surface. Finally, we present visual feedback mechanisms to give users the sense that they are actually lifting the objects off the surface.
    This thesis contributes on various levels. We present several novel prototypes that we built and evaluated. We use these prototypes to systematically explore the design space of digital tabletop interaction. The flexibility of use afforded by the interaction style is introduced as a criterion alongside the physicality of the user interface elements. Each approach's suitability to support the highly dynamic and often unstructured interactions typical of digital tabletops is analyzed. We present a new model for tabletop interaction that increases the fidelity of interaction possible in such settings. Finally, we extend this model so as to enable as-direct-as-possible interactions with 3D data, interacting from above the table's surface.
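
    To make the proposed interaction model concrete, the toy sketch below treats each sensed touch contact as a kinematic proxy that drags virtual objects through a simple friction-like coupling; it is a conceptual illustration only, with assumed parameters, and not the thesis's physics-engine implementation.

```python
# Conceptual sketch: each sensed touch contact becomes a kinematic proxy that
# pushes virtual objects through simple collision and friction terms, instead
# of being mapped to abstract gestures. All values here are illustrative.
from dataclasses import dataclass

@dataclass
class Contact:              # one sensed touch point (finger, hand-outline sample, ...)
    x: float
    y: float
    vx: float
    vy: float

@dataclass
class VirtualObject:
    x: float
    y: float
    vx: float = 0.0
    vy: float = 0.0
    radius: float = 0.05
    friction: float = 0.8   # assumed coupling between proxy motion and the object

def step(obj, contacts, dt=1 / 60):
    """Advance one simulation step: contacts overlapping the object drag it along."""
    for c in contacts:
        if (c.x - obj.x) ** 2 + (c.y - obj.y) ** 2 < obj.radius ** 2:
            # Friction-like coupling: the object picks up part of the proxy velocity.
            obj.vx += obj.friction * (c.vx - obj.vx)
            obj.vy += obj.friction * (c.vy - obj.vy)
    obj.x += obj.vx * dt
    obj.y += obj.vy * dt
    obj.vx *= 0.95          # damping so the object settles once released
    obj.vy *= 0.95

obj = VirtualObject(x=0.0, y=0.0)
step(obj, [Contact(x=0.01, y=0.0, vx=0.3, vy=0.0)])   # one finger pushing the object
```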