5 research outputs found

    Personalized Assistance for Dressing Users

    Abstract. In this paper, we present an approach for a robot to provide personalized dressing assistance to a user. Given a dressing task, our approach finds a solution involving both manipulator motions and user repositioning requests. The solution allows the robot and user to take turns moving in the same space and is cognizant of the user's limitations. To accomplish this, a vision module monitors the user's motion, determines whether the user is following the repositioning requests, and infers mobility limitations when they cannot. The learned constraints are used during future dressing episodes to personalize the repositioning requests. Our contributions include a turn-taking approach to human-robot coordination for the dressing problem and a vision module capable of learning user limitations. After presenting the technical details of our approach, we provide an evaluation with a Baxter manipulator.
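    The turn-taking idea above can be sketched as follows: each repositioning request is clipped to limits learned in earlier episodes, and when the user falls short of a request, the achieved pose is recorded as a new limit. The joint name, tolerance, and helper function below are illustrative assumptions, not the authors' implementation.

    ```python
    # Minimal sketch of personalized repositioning requests, assuming
    # an external vision module supplies the observed joint angle.

    TOLERANCE = 0.05  # rad; how close the user must get to "comply" (assumed)

    def request_reposition(joint, target, observed_angle, learned_limits):
        """Ask the user to move `joint` toward `target`, but never beyond a
        previously learned limit; update the limit if the user falls short."""
        limit = learned_limits.get(joint)
        if limit is not None:
            target = min(target, limit)          # personalize the request
        if observed_angle < target and abs(observed_angle - target) > TOLERANCE:
            # User could not reach the target: record the achieved angle
            # as a mobility limit for future dressing episodes.
            learned_limits[joint] = observed_angle
        return target

    limits = {}
    request_reposition("right_elbow", 1.6, 1.2, limits)   # user falls short
    print(limits)                                         # {'right_elbow': 1.2}
    print(request_reposition("right_elbow", 1.6, 1.2, limits))  # 1.2 (clipped)
    ```

    The second call shows the personalization: the request is capped at the limit inferred during the first episode, so the user is never asked to exceed a demonstrated inability.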

    Personalized Robot-assisted Dressing using User Modeling in Latent Spaces

    Robots have the potential to provide tremendous support to disabled and elderly people in their everyday tasks, such as dressing. Many recent studies on robotic dressing assistance treat dressing as a trajectory planning problem. However, user movements during the dressing process are rarely taken into account, which often leads to failure of the planned trajectory and may put the user at risk. The main difficulty in accounting for user movements is the severe occlusion created by the robot, the user, and the clothes during dressing, which prevents vision sensors from accurately detecting the user's posture in real time. In this paper, we address this problem by introducing an approach that allows the robot to automatically adapt its motion according to the force that user movements apply to the robot's gripper. This paper makes two main contributions: 1) a hierarchical multi-task control strategy that automatically adapts the robot motion and minimizes the force exchanged between the user and the robot due to user movements; 2) an online update of the dressing trajectory based on user movement limitations, modeled with a Gaussian Process Latent Variable Model in a latent space, together with density information extracted from that latent space. The combination of these two contributions yields personalized dressing assistance that can cope with unpredicted user movements during dressing while constantly minimizing the force that the robot may apply to the user. Experimental results demonstrate that the proposed method allows the Baxter humanoid robot to provide personalized dressing assistance to human users with simulated upper-body impairments.
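    The force-driven adaptation described above can be sketched as a simple admittance-style rule: displace the planned waypoint along the measured gripper force so the robot yields to the user. The gain, dead-band, and function names below are illustrative assumptions, not the paper's actual hierarchical controller.

    ```python
    import numpy as np

    GAIN = 0.002        # m/N; compliance of the adaptation (assumed value)
    DEADBAND = 2.0      # N;   ignore sensor noise below this magnitude (assumed)

    def adapt_waypoint(waypoint, wrench):
        """Shift the planned waypoint along the measured gripper force so the
        robot yields to unpredicted user movements, reducing contact force."""
        wp = np.asarray(waypoint, dtype=float)
        f = np.asarray(wrench, dtype=float)
        if np.linalg.norm(f) < DEADBAND:
            return wp                   # negligible force: follow the plan
        return wp + GAIN * f            # comply in the direction of the force

    adapt_waypoint([0.4, 0.0, 0.3], [0.5, 0.0, 0.0])    # below dead-band: unchanged
    adapt_waypoint([0.4, 0.0, 0.3], [10.0, 0.0, -5.0])  # shifted (≈[0.42, 0.0, 0.29])
    ```

    In the paper this compliance is one task in a hierarchical multi-task controller and the trajectory itself is updated from the GPLVM user model; the sketch only captures the "yield to measured force" behavior.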

    User posture recognition for robot-assisted shoe dressing task

    Degree in Industrial Technology Engineering. Institut de Robòtica i Informàtica Industrial (IRI), CSIC-UPC. Universitat Politècnica de Catalunya; Escola Tècnica Superior d'Enginyeria Industrial de Barcelona (ETSEIB). Assistive robotics is a fast-developing field in which substantial research effort is invested in healthcare applications. So far, the number of commercially available robots is low, and one reason is robots' limited ability to interact with users in a safe, natural, human-like manner. This work focuses on the development of a robot dressing assistant, more specifically its ability to track the user and recognize his/her intention to be dressed. The work is performed within the framework of the I-DRESS project, which aims to develop a robot able to provide proactive dressing assistance to users with reduced mobility. The proposed system consists of a Barrett WAM robot manipulator and a Microsoft Xbox One Kinect V2.0 camera sensor (popularly known as Kinect 2, and referred to as such in the rest of this document), which provides user tracking from depth images. The integration of hardware and algorithms was performed in the Robot Operating System (ROS). All developments and experiments were carried out in the laboratory of the Perception and Manipulation Group at the Institut de Robòtica i Informàtica Industrial (IRI), CSIC-UPC. Peer reviewed.
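    As a rough illustration of posture recognition from skeleton joints such as those published by a Kinect 2 tracker, the sketch below decides whether the user's leg is extended (foot offered for shoe dressing) from the knee angle. The joint coordinates, names, and angle threshold are assumptions for illustration, not the I-DRESS system's actual classifier.

    ```python
    import numpy as np

    EXTENDED_KNEE_DEG = 150.0   # knee straighter than this -> foot offered (assumed)

    def joint_angle(a, b, c):
        """Angle in degrees at joint b, formed by the segments b->a and b->c."""
        v1 = np.asarray(a, float) - np.asarray(b, float)
        v2 = np.asarray(c, float) - np.asarray(b, float)
        cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
        return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

    def ready_for_shoe(hip, knee, ankle):
        """True when the hip-knee-ankle angle indicates an extended leg."""
        return joint_angle(hip, knee, ankle) >= EXTENDED_KNEE_DEG

    # Leg held out roughly straight: ready for dressing
    print(ready_for_shoe([0, 1, 0], [0, 0.5, 0.1], [0, 0, 0.15]))   # True
    # Leg bent at about 90 degrees: not ready
    print(ready_for_shoe([0, 1, 0], [0, 0.5, 0], [0.5, 0.5, 0]))    # False
    ```

    A real pipeline would smooth the tracked joints over time and debounce the decision before triggering the robot, but the angle test captures the core intention-recognition cue.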

    Automation strategies for sample preparation in life science applications

    Automation is broadly applied in the life science field, with robots playing critical roles. In this dissertation, a platform based on a Yaskawa dual-arm industrial robot (CSDA10F) is presented, which automates sample preparation processes and integrates analytical instruments. A user-friendly interface is provided by integrating the platform with the SAMI Workstation EX software. To automate the sample preparation processes, the robot must use various commercial tools, including pipettes, syringes, microplates, vials, a thermo shaker, an ultrasonic machine, and so on.