
    Learning to Navigate Cloth using Haptics

    We present a controller that allows an arm-like manipulator to navigate deformable cloth garments in simulation through the use of haptic information. The main challenge for such a controller is to avoid getting tangled in, tearing, or punching through the deforming cloth. Our controller aggregates force information from a number of haptic-sensing spheres placed along the manipulator for guidance. Based on the haptic forces, each individual sphere updates its target location, and the conflicts that arise among this set of desired positions are resolved by solving an inverse kinematics problem with constraints. Reinforcement learning is used to train the controller for a single haptic-sensing sphere, where a training run is terminated (and thus penalized) when large forces are detected due to contact between the sphere and a simplified model of the cloth. In simulation, we demonstrate successful navigation of a robotic arm through a variety of garments, including an isolated sleeve, a jacket, a shirt, and shorts. Our controller outperforms two baseline controllers: one without haptics and another that was trained on large sphere-cloth contact forces but without early termination.
    Comment: Supplementary video available at https://youtu.be/iHqwZPKVd4A. Related publications: http://www.cc.gatech.edu/~karenliu/Robotic_dressing.htm
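    The per-sphere target update and early-termination signal described above can be sketched roughly as follows. The gain, force threshold, reward values, and function names are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

# Assumed constants; the abstract does not specify these values.
F_MAX = 20.0     # contact-force magnitude that triggers early termination
K_REPEL = 0.01   # gain mapping sensed force to a target-position offset

def update_sphere_target(target, haptic_force):
    """Move one sphere's desired position along its net sensed contact force."""
    return target + K_REPEL * haptic_force

def control_step(sphere_targets, haptic_forces):
    """One step: update every sphere's target, then check for early termination."""
    new_targets = [update_sphere_target(t, f)
                   for t, f in zip(sphere_targets, haptic_forces)]
    # During training, a run is terminated and penalized on large contact forces.
    done = any(np.linalg.norm(f) > F_MAX for f in haptic_forces)
    reward = -100.0 if done else 1.0  # assumed reward shaping
    # The conflicting per-sphere targets would then be reconciled by a
    # constrained inverse-kinematics solve (not shown).
    return new_targets, reward, done
```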

    Deep Haptic Model Predictive Control for Robot-Assisted Dressing

    Robot-assisted dressing offers an opportunity to benefit the lives of many people with disabilities, such as some older adults. However, robots currently lack common sense about the physical implications of their actions on people. The physical implications of dressing are complicated by non-rigid garments, which can result in a robot indirectly applying high forces to a person's body. We present a deep recurrent model that, when given a proposed action by the robot, predicts the forces a garment will apply to a person's body. We also show that a robot can provide better dressing assistance by using this model with model predictive control. The predictions made by our model use only haptic and kinematic observations from the robot's end effector, which are readily attainable. Collecting training data from real-world physical human-robot interaction can be time consuming, costly, and put people at risk. Instead, we train our predictive model using data collected in an entirely self-supervised fashion from a physics-based simulation. We evaluated our approach with a PR2 robot that attempted to pull a hospital gown onto the arms of 10 human participants. With a 0.2 s prediction horizon, our controller succeeded at high rates and lowered applied force while navigating the garment around a person's fist and elbow without getting caught. Shorter prediction horizons resulted in significantly reduced performance, with the sleeve catching on the participants' fists and elbows, demonstrating the value of our model's predictions. These catch-mitigating behaviors emerged from our deep predictive model and the controller objective function, which primarily penalizes high forces.
    Comment: 8 pages, 12 figures, 1 table, 2018 IEEE International Conference on Robotics and Automation (ICRA)
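    A rough sketch of the model-predictive control loop implied by the abstract, assuming a sampled set of candidate end-effector displacements, a hypothetical `force_model.predict` interface standing in for the deep recurrent model, and an illustrative cost that primarily penalizes high predicted force (the `dressing_dir` progress term and all weights are assumptions):

```python
import numpy as np

HORIZON = 0.2                       # prediction horizon in seconds (from the paper)
DT = 0.05                           # assumed control period
STEPS = int(round(HORIZON / DT))    # model rollout length

def mpc_select_action(force_model, history, candidate_actions,
                      dressing_dir, w_force=1.0, w_progress=0.1):
    """Pick the candidate end-effector displacement with the lowest predicted cost."""
    best_action, best_cost = None, np.inf
    for action in candidate_actions:
        # Roll the learned model forward over the horizon for this candidate.
        predicted = force_model.predict(history, action, steps=STEPS)  # (STEPS, 3)
        peak_force = np.max(np.linalg.norm(predicted, axis=-1))
        # Primarily penalize high predicted force; mildly reward progress
        # along the dressing direction (assumed cost structure).
        cost = w_force * peak_force - w_progress * np.dot(action, dressing_dir)
        if cost < best_cost:
            best_action, best_cost = action, cost
    return best_action
```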

    Application of Mimosa Pudica Mechanoreceptors to Electronic Skin Design

    Mechanoreceptor cells in Mimosa pudica – a plant known for its rapid leaf movements when touched – could serve as tactile sensors in a flexible, high-resolution electronic skin. To test the viability of this approach, mechanoreceptor cells were incorporated into a hydrogel to ensure their mechanosensitive properties are retained after isolation. A gelling agent is used to solidify a culture of the mechanosensitive cells onto a microelectrode array, which allows electrical responses to an applied mechanical stimulus to be monitored. We show that further experimentation is needed to prove that these cells retain their ability to transduce mechanical stimuli into electrical responses, and thus would serve as a viable component of future electronic skin designs. Moving forward, calcium imaging will be used to optically characterize the cells' responses in terms of action-potential firing upon mechanical stimulation, to confirm that the cells are in fact firing in response to the stimulus, and to determine which specific cells are responsible for the electrical response. Additional tests may examine how well the material localizes electrical responses to areas of stimulation by employing multiple channels of the microelectrode array. This simple bio-complex material has the potential to provide the basis for a larger-scale, complex electronic skin for use in tactile-sensing prosthetics, soft robotics, and smart materials.
    Funding: The Ohio State University Materials Research Seed Grant Program; The Center for Emergent Materials; NSF-MRSEC grant DMR-1420451; The Center for Exploration of Novel Complex Materials; The Institute for Materials Research; OSU Undergraduate Education Summer Research Fellowship; OSU Undergraduate Honors Research Scholarship.
    Academic Major: Electrical and Computer Engineering
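    The proposed multichannel localization test could look roughly like the sketch below, which assumes a simple per-channel threshold rule; the threshold value, array shapes, and function name are hypothetical:

```python
import numpy as np

# Assumed threshold rule; arrays have shape (n_channels, n_samples).
THRESHOLD_SD = 5.0  # spike threshold in standard deviations of baseline noise

def responding_channels(post_stimulus, baseline):
    """Return indices of channels whose post-stimulus peak crosses threshold."""
    noise_sd = baseline.std(axis=1)            # per-channel noise floor
    peaks = np.abs(post_stimulus).max(axis=1)  # per-channel peak amplitude
    return np.where(peaks > THRESHOLD_SD * noise_sd)[0]

# A well-localized response would cross threshold only on channels near the
# stimulation site, rather than across the whole array.
```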

    In-home and remote use of robotic body surrogates by people with profound motor deficits

    By controlling robots comparable to the human body, people with profound motor deficits could potentially perform a variety of physical tasks for themselves, improving their quality of life. The extent to which this is achievable has been unclear due to the lack of suitable interfaces by which to control robotic body surrogates and a dearth of studies involving substantial numbers of people with profound motor deficits. We developed a novel, web-based augmented reality interface that enables people with profound motor deficits to remotely control a PR2 mobile manipulator from Willow Garage, a human-scale, wheeled robot with two arms. We then conducted two studies to investigate the use of robotic body surrogates. In the first study, 15 novice users with profound motor deficits from across the United States controlled a PR2 in Atlanta, GA to perform a modified Action Research Arm Test (ARAT) and a simulated self-care task. Participants achieved clinically meaningful improvements on the ARAT, and 12 of 15 participants (80%) successfully completed the simulated self-care task. Participants agreed that the robotic system was easy to use, was useful, and would provide a meaningful improvement in their lives. In the second study, one expert user with profound motor deficits had free use of a PR2 in his home for seven days. He performed a variety of self-care and household tasks, and also used the robot in novel ways. Taking both studies together, our results suggest that people with profound motor deficits can improve their quality of life using robotic body surrogates, and that they can gain this benefit with only low-level robot autonomy and without invasive interfaces. However, methods to reduce the rate of errors and increase operational speed merit further investigation.
    Comment: 43 pages, 13 figures

    MEMS sensor-controlled haptic forefinger robotic aid

    Haptic touch is the ability to feel the world through the tools we hold. The idea of sensory elements transforming remote interactions with objects into a touch experience is both motivating and challenging. This paper deals with the design and implementation of a forefinger-direction-based robot for physically challenged people, which follows the direction of the forefinger. The robot's path may be either point-to-point or continuous. A MEMS sensor detects the direction of the forefinger, and its output is transmitted via an RF transmitter to the receiver unit. In the receiver section, the RF receiver picks up the corresponding signal and commands the microcontroller to move the robot in that particular direction. The system design comprises a microcontroller, a MEMS sensor, and RF technology. The robot receives commands from the MEMS sensor, which is placed on the forefinger at the transmitter section. A simple control mechanism for the robot is thus demonstrated, and experimental results for the forefinger-based directional robot are presented.
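    The transmitter-side logic described above can be sketched as below, assuming a tilt-based MEMS accelerometer, hypothetical `read_accel` and `rf_send` helpers, and illustrative thresholds (none of these are from the paper):

```python
import time

TILT_THRESHOLD = 0.3  # assumed normalized-g tilt needed for a deliberate command

def direction_from_tilt(ax, ay):
    """Map accelerometer x/y tilt to a discrete robot command."""
    if ay > TILT_THRESHOLD:
        return "FORWARD"
    if ay < -TILT_THRESHOLD:
        return "BACKWARD"
    if ax > TILT_THRESHOLD:
        return "RIGHT"
    if ax < -TILT_THRESHOLD:
        return "LEFT"
    return "STOP"

def transmit_loop(read_accel, rf_send, period_s=0.05):
    """Transmitter side: poll the finger-mounted sensor and send commands."""
    while True:
        ax, ay, _az = read_accel()            # hypothetical sensor read
        rf_send(direction_from_tilt(ax, ay))  # hypothetical RF helper
        time.sleep(period_s)
```

    On the receiver side, the microcontroller would simply decode each received command string and drive the corresponding motor outputs.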