
    In-home and remote use of robotic body surrogates by people with profound motor deficits

    By controlling robots comparable to the human body, people with profound motor deficits could potentially perform a variety of physical tasks for themselves, improving their quality of life. The extent to which this is achievable has been unclear due to the lack of suitable interfaces by which to control robotic body surrogates and a dearth of studies involving substantial numbers of people with profound motor deficits. We developed a novel, web-based augmented reality interface that enables people with profound motor deficits to remotely control a PR2 mobile manipulator from Willow Garage, which is a human-scale, wheeled robot with two arms. We then conducted two studies to investigate the use of robotic body surrogates. In the first study, 15 novice users with profound motor deficits from across the United States controlled a PR2 in Atlanta, GA to perform a modified Action Research Arm Test (ARAT) and a simulated self-care task. Participants achieved clinically meaningful improvements on the ARAT and 12 of 15 participants (80%) successfully completed the simulated self-care task. Participants agreed that the robotic system was easy to use, was useful, and would provide a meaningful improvement in their lives. In the second study, one expert user with profound motor deficits had free use of a PR2 in his home for seven days. He performed a variety of self-care and household tasks, and also used the robot in novel ways. Taking both studies together, our results suggest that people with profound motor deficits can improve their quality of life using robotic body surrogates, and that they can gain benefit with only low-level robot autonomy and without invasive interfaces. However, methods to reduce the rate of errors and increase operational speed merit further investigation. Comment: 43 pages, 13 figures.

    Design and development of robust hands for humanoid robots


    Inferring Object Properties from Incidental Contact with a Tactile-Sensing Forearm

    arXiv:1409.4972v1 [cs.RO]. Whole-arm tactile sensing enables a robot to sense properties of contact across its entire arm. By using this large sensing area, a robot has the potential to acquire useful information from incidental contact that occurs while performing a task. Within this paper, we demonstrate that data-driven methods can be used to infer mechanical properties of objects from incidental contact with a robot’s forearm. We collected data from a tactile-sensing forearm as it made contact with various objects during a simple reaching motion. We then used hidden Markov models (HMMs) to infer two object properties (rigid vs. soft and fixed vs. movable) based on low-dimensional features of time-varying tactile sensor data (maximum force, contact area, and contact motion). A key issue is the extent to which data-driven methods can generalize to robot actions that differ from those used during training. To investigate this issue, we developed an idealized mechanical model of a robot with a compliant joint making contact with an object. This model provides intuition for the classification problem. We also conducted tests in which we varied the robot arm’s velocity and joint stiffness. We found that, in contrast to our previous methods [1], multivariate HMMs achieved high cross-validation accuracy and successfully generalized what they had learned to new robot motions with distinct velocities and joint stiffnesses.
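As a rough illustration of the maximum-likelihood HMM classification the abstract describes, the sketch below scores a short tactile feature sequence (maximum force, contact area, contact motion) under one Gaussian-emission HMM per class and picks the class with the higher log-likelihood. All model parameters, state structure, and feature values here are invented for illustration; the paper's actual models were trained on real tactile data.

```python
import numpy as np

def gaussian_logpdf(x, mean, var):
    # Log-density of feature vector x under an independent (diagonal) Gaussian.
    return -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mean) ** 2 / var)

def hmm_loglik(obs, pi, A, means, variances):
    """Log-likelihood of an observation sequence under a Gaussian-emission
    HMM, computed with the forward algorithm in log space."""
    n_states = len(pi)
    log_alpha = np.log(pi) + np.array(
        [gaussian_logpdf(obs[0], means[s], variances[s]) for s in range(n_states)])
    log_A = np.log(A)
    for t in range(1, len(obs)):
        log_alpha = np.array(
            [gaussian_logpdf(obs[t], means[s], variances[s])
             + np.logaddexp.reduce(log_alpha + log_A[:, s])
             for s in range(n_states)])
    return np.logaddexp.reduce(log_alpha)

def classify(obs, class_models):
    # Maximum-likelihood classification: one HMM per class, highest score wins.
    return max(class_models, key=lambda c: hmm_loglik(obs, **class_models[c]))

# Invented two-state models (pre-contact / contact) over the features
# (max force, contact area, contact motion); every number is illustrative only.
models = {
    "rigid": dict(pi=np.array([0.9, 0.1]),
                  A=np.array([[0.8, 0.2], [0.1, 0.9]]),
                  means=np.array([[0.1, 0.1, 0.0], [5.0, 1.0, 0.1]]),
                  variances=np.full((2, 3), 0.5)),
    "soft":  dict(pi=np.array([0.9, 0.1]),
                  A=np.array([[0.8, 0.2], [0.1, 0.9]]),
                  means=np.array([[0.1, 0.1, 0.0], [1.5, 4.0, 0.5]]),
                  variances=np.full((2, 3), 0.5)),
}
```

Under these invented parameters, a sequence whose contact-phase features sit near (5.0, 1.0, 0.1) would be labeled "rigid", while one near (1.5, 4.0, 0.5) would be labeled "soft".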

    Human-Machine Interfaces using Distributed Sensing and Stimulation Systems

    As technology moves towards more natural human-machine interfaces (e.g. bionic limbs, teleoperation, virtual reality), it is necessary to develop a sensory feedback system in order to foster embodiment and achieve better immersion in the control system. Contemporary feedback interfaces presented in research use few sensors and stimulation units to feed back at most two discrete variables (e.g. grasping force and aperture), whereas the human sense of touch relies on a distributed network of mechanoreceptors providing a wide bandwidth of information. To provide this type of feedback, it is necessary to develop a distributed sensing system that can extract a wide range of information during the interaction between the robot and the environment. In addition, a distributed feedback interface is needed to deliver such information to the user. This thesis proposes the development of a distributed sensing system (e-skin) to acquire tactile sensation, a first integration of the distributed sensing system on a robotic hand, the development of a sensory feedback system that comprises the distributed sensing system and a distributed stimulation system, and finally the implementation of deep learning methods for the classification of tactile data. Its core focus addresses the development and testing of a sensory feedback system based on the latest distributed sensing and stimulation techniques. To this end, the thesis is comprised of two introductory chapters that describe the state of the art in the field, the objectives, and the methodology and contributions used, as well as six studies that tackled the development of human-machine interfaces.

    A force and thermal sensing skin for robots in human environments

    Working together, heated and unheated temperature sensors can recognize contact with different materials and contact with the human body. As such, distributing these sensors across a robot’s body could be beneficial for operation in human environments. We present a stretchable fabric-based skin with force and thermal sensors that is suitable for covering areas of a robot’s body, including curved surfaces. It also adds a layer of compliance that conforms to manipulated objects, improving thermal sensing. Our design addresses thermal sensing challenges, such as the time to heat the sensors, the efficiency of sensing, and the distribution of sensors across the skin. It incorporates small self-heated temperature sensors on the surface of the skin that directly make contact with objects, improving the sensors’ response times. Our approach seeks to fully cover the robot’s body with large force sensing taxels, but treats temperature sensors as small, point-like sensors sparsely distributed across the skin. We present a mathematical model to help predict how many of these point-like temperature sensors should be used in order to increase the likelihood of them making contact with an object. To evaluate our design, we conducted tests in which a robot arm used a cylindrical end effector covered with skin to slide objects and press on objects made from four different materials. After assessing the safety of our design, we also had the robot make contact with the forearms and clothed shoulders of 10 human participants. With 2.0 s of contact, the actively-heated temperature sensors enabled binary classification accuracy over 90% for the majority of material pairs. The system could more rapidly distinguish between materials with large differences in their thermal effusivities (e.g., 90% accuracy for pine wood vs. aluminum with 0.5 s of contact). For discrimination between humans vs. the four materials, the skin’s force and thermal sensing modalities achieved 93% classification accuracy with 0.5 s of contact. Overall, our results suggest that our skin design could enable robots to recognize contact with distinct task-relevant materials and humans while performing manipulation tasks in human environments.
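To make the effusivity-based discrimination concrete, here is a minimal, hypothetical sketch: a heated sensor's temperature drop over a short contact window serves as a one-dimensional feature, and the contacted material is chosen by nearest centroid. Higher-effusivity materials such as aluminum draw heat away faster than lower-effusivity ones such as pine wood, so the drop separates the two classes. The sampling rate, window length, and per-material mean drops below are all invented; the paper's system used classifiers trained on real sensor data.

```python
def temperature_drop(temps, window, dt=0.1):
    """Drop in sensor temperature over the first `window` seconds of contact,
    given temperature samples spaced `dt` seconds apart."""
    n = int(round(window / dt))
    return temps[0] - temps[n]

def classify_material(temps, centroids, window=0.5):
    """Nearest-centroid classification on the temperature-drop feature."""
    drop = temperature_drop(temps, window)
    return min(centroids, key=lambda m: abs(centroids[m] - drop))

# Invented per-material mean drops (degrees C over 0.5 s), for illustration only.
centroids = {"aluminum": 6.0, "pine": 1.5}
```

With these invented centroids, a sensor trace that cools by roughly 6 degrees C in the first half-second of contact would be labeled aluminum, and one that cools by only about 1.5 degrees C would be labeled pine.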