
    Human-robot interaction for assistive robotics

    This dissertation presents an in-depth study of human-robot interaction (HRI) with application to assistive robotics, covering dexterous in-hand manipulation, robotic sit-to-stand (STS) assistance, and human intention estimation. Chapter 1 discusses the background and open issues of HRI. Chapter 2 reviews the recent state of the art in HRI, including physical human-robot interaction, robotic STS assistance, dexterous in-hand manipulation, and human intention estimation. Chapter 3 describes the models and control algorithms in detail. Chapter 4 introduces the research equipment. Chapter 5 presents theories and implementations of HRI in assistive robotics, including a general methodology of robotic assistance from the human perspective, novel hardware design, robotic STS assistance, human intention estimation, and control

    Leveraging Kernelized Synergies on Shared Subspace for Precision Grasping and Dexterous Manipulation

    Manipulation, in contrast to grasping, is a trajectory-level task that requires dexterous hands. Increasing the dexterity of robot hands increases controller complexity, which motivates the use of postural synergies. Inspired by postural synergies, this research proposes a new framework, called kernelized synergies, that focuses on reusing the same subspace for precision grasping and dexterous manipulation. In this work, the computed subspace of postural synergies is parameterized by kernelized movement primitives, which preserves its grasping and manipulation characteristics and allows it to be reused for new objects. The grasp stability of the proposed framework is assessed using the force-closure quality index as a cost function. For performance evaluation, the framework is first tested on two different simulated robot hand models using the SynGrasp toolbox; experimentally, four complex grasping and manipulation tasks are then performed and reported. The results confirm the hand-agnostic nature of the proposed framework and its generalization to distinct objects irrespective of their dimensions
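A postural-synergy subspace of the kind this abstract builds on is conventionally obtained by principal component analysis of recorded joint angles. Below is a minimal sketch under that assumption, using random placeholder data for a hypothetical 20-DoF hand; the paper's kernelized-movement-primitive parameterization is not reproduced here:

```python
import numpy as np

# Placeholder data standing in for real grasp recordings:
# rows = recorded grasps, columns = hand joint angles.
rng = np.random.default_rng(0)
grasp_postures = rng.normal(size=(50, 20))   # 50 grasps, 20-DoF hand

mean_posture = grasp_postures.mean(axis=0)
centered = grasp_postures - mean_posture

# Right singular vectors of the centered data are the synergy
# directions, ordered by explained variance (PCA via SVD).
_, _, vt = np.linalg.svd(centered, full_matrices=False)
synergies = vt[:3]                           # keep the first 3 synergies

# A hand posture in the subspace is a few activations plus the mean.
activations = np.array([0.5, -0.2, 0.1])
posture = mean_posture + activations @ synergies
```

The low-dimensional `activations` vector is what a controller would plan in, rather than the full 20-dimensional joint space.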

    Bio-Inspired Grasping Controller for Sensorized 2-DoF Grippers

    We present a holistic grasping controller, combining free-space position control and in-contact force control for reliable grasping given uncertain object pose estimates. Employing tactile fingertip sensors, undesired object displacement during grasping is minimized by pausing the finger closing motion for individual joints on first contact until force closure is established. While holding an object, the controller is compliant with external forces to avoid high internal object forces and prevent object damage. Gravity as an external force is explicitly considered and compensated for, thus preventing gravity-induced object drift. We evaluate the controller in two experiments on the TIAGo robot and its parallel-jaw gripper, demonstrating the effectiveness of the approach for robust grasping and for minimizing object displacement. In a series of ablation studies, we demonstrate the utility of the individual controller components
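The closing behavior described here, pausing individual joints on first contact until every finger is in contact, can be sketched as a simple loop. The discrete position model and function names below are illustrative assumptions, not the actual TIAGo controller interface:

```python
def close_gripper(contact_at, max_steps=100, step=1):
    """Toy model of contact-aware closing.

    contact_at: per-joint position at which that joint's tactile
    sensor would first report contact (hypothetical values).
    """
    positions = [0 for _ in contact_at]
    in_contact = [False for _ in contact_at]
    for _ in range(max_steps):
        if all(in_contact):      # proxy for force closure in this sketch
            break
        for j, target in enumerate(contact_at):
            if in_contact[j]:
                continue         # pause this joint on first contact
            positions[j] += step
            if positions[j] >= target:
                in_contact[j] = True
    return positions

# An off-center object: one jaw touches earlier than the other,
# so it stops and waits while the second jaw keeps closing.
print(close_gripper([3, 7]))    # -> [3, 7]
```

The key property is that the early-contacting jaw stops pushing, so the object is not displaced while the opposing jaw catches up.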

    Modelling Grip Point Selection in Human Precision Grip


    Optimization and Analysis of Underactuated Linkage Robotic Finger

    This study aims to maximize transmission performance, i.e., to increase the torque transmitted from the actuated joints to the underactuated joints through the transmission mechanism, thereby increasing the grasping forces at the finger phalanges. Studying the four-bar mechanism parameters of a specific configuration within defined limits led to the linkage transmission defect parameter, which plays a major role in linkage performance and is used as the objective function to be minimized. The optimization procedure is carried out using MATLAB's fminunc function and is formulated with Freudenstein's equations, applied to the Cassino Underactuated Multifinger Hand (Ca.U.M.Ha.) design with one finger and a thumb. A mathematical model of the grasping forces of the finger is introduced, taking the rigid links of the Ca.U.M.Ha. robotic finger into account. Keywords: linkage, underactuated, optimization
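As a rough analogue of the fminunc step described above, the sketch below minimizes a common transmission-quality measure for a four-bar linkage: the worst-case deviation of the transmission angle from 90° over a full crank revolution. The objective, link-length parameterization, and starting point are assumptions for illustration, not the paper's Freudenstein-based formulation:

```python
import numpy as np
from scipy.optimize import minimize

def transmission_defect(x, d=4.0, theta=np.linspace(0, 2 * np.pi, 90)):
    """Worst-case deviation of the transmission angle from 90 degrees.

    x = (a, b, c): crank, coupler, rocker lengths; d: fixed ground link.
    """
    a, b, c = x
    # Squared diagonal from the rocker's ground pivot to the crank tip,
    # for each crank angle theta (law of cosines on the ground triangle).
    diag2 = a**2 + d**2 - 2 * a * d * np.cos(theta)
    # Transmission angle mu between coupler and rocker (law of cosines
    # on the coupler-rocker triangle); clip guards non-assemblable poses.
    cos_mu = np.clip((b**2 + c**2 - diag2) / (2 * b * c), -1.0, 1.0)
    return np.max(np.abs(np.pi / 2 - np.arccos(cos_mu)))

# Unconstrained local minimization, analogous to MATLAB's fminunc.
res = minimize(transmission_defect, x0=[1.0, 3.0, 3.0], method="Nelder-Mead")
a, b, c = res.x
```

A smaller defect keeps the transmission angle near 90° throughout the motion, which is what lets more of the actuated-joint torque reach the underactuated joints as phalanx force.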

    A framework for compliant physical interaction : the grasp meets the task

    Although the grasp-task interplay in our daily lives is unquestionable, very little research has addressed this problem in robotics. To fill the gap between the grasp and the task, we adopt the most successful approaches to grasp and task specification and extend them with additional elements that make it possible to define a grasp-task link. We propose a global sensor-based framework for the specification and robust control of physical interaction tasks, in which the grasp and the task are jointly considered on the basis of the task frame formalism and the knowledge-based approach to grasping. A physical interaction task planner is also presented, based on the new concept of task-oriented hand pre-shapes. The planner focuses on the manipulation of articulated parts in home environments and can automatically specify all the elements of a physical interaction task required by the proposed framework. Finally, several applications are described, showing the versatility of the proposed approach and its suitability for the fast implementation of robust physical interaction tasks on very different robotic systems
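In the task frame formalism this abstract builds on, a task is specified per axis of a task frame, with each axis assigned either a motion or a force control mode and a setpoint. A minimal, hypothetical sketch of such a specification and its validation; all names and values are invented for illustration, not taken from the paper:

```python
def validate_task(task):
    """Check that every task-frame axis has a known mode and a
    numeric setpoint (the minimal well-formedness condition here)."""
    allowed = {"position", "velocity", "force"}
    for axis, spec in task["axes"].items():
        assert spec["mode"] in allowed, f"bad mode on axis {axis}"
        assert isinstance(spec["setpoint"], (int, float))
    return True

# Turning a door handle: comply with contact forces in translation,
# press down, and command a rotation about the handle's z axis.
turn_handle = {
    "frame": "handle",
    "axes": {
        "x":  {"mode": "force",    "setpoint": 0.0},
        "y":  {"mode": "force",    "setpoint": 0.0},
        "z":  {"mode": "force",    "setpoint": -5.0},  # press down (N)
        "rz": {"mode": "velocity", "setpoint": 0.5},   # turn (rad/s)
    },
}
assert validate_task(turn_handle)
```

Splitting axes between force and motion control is what lets the controller stay compliant in the directions constrained by the articulated part while still driving the task forward.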

    Influence of Gaze Position on Grasp Parameters for Reaches to Visible and Remembered Stimuli

    In order to pick up or manipulate a seen object, one must use visual signals to aim and transport the hand to the object’s location (reach) and to configure the digits to the shape of the object (grasp). It has been shown that reach and grasp are controlled by separate neural pathways. In real-world conditions, however, all of these signals (gaze, reach, grasp) must interact to provide accurate eye-hand coordination. The interactions between gaze, reach, and grasp parameters have not been comprehensively studied in humans. The purpose of this study was to investigate 1) the effect of gaze and target positions on grasp location, amplitude, and orientation, and 2) the influence of visual feedback of the hand and target on the final grasp components and on the spatial deviations associated with gaze direction and target position. Seven subjects reached to grasp a rectangular “virtual” target presented at three orientations, three locations, and with three gaze fixation positions during open- and closed-loop conditions. Participants showed gaze- and target-dependent deviations in grasp parameters that could not be predicted from previous studies. Our results showed that both reach- and grasp-related deviations were affected by stimulus position. The interaction effects of gaze and reach position revealed complex mechanisms, and their impacts differed for each grasp parameter. The impact of gaze direction on grasp deviation depended on target position in space, especially for grasp location and amplitude; gaze direction had little impact on grasp orientation. Visual feedback about the hand and target modulated the reach- and gaze-related impacts. The results suggest that the brain uses both control-signal interactions and sensorimotor strategies to plan and control reach-and-grasp movements

    A Continuous Grasp Representation for the Imitation Learning of Grasps on Humanoid Robots

    Models and methods are presented that enable a humanoid robot to learn reusable, adaptive grasping skills. Mechanisms and principles of human grasp behavior are studied, and the findings are used to develop a grasp representation capable of retaining specific motion characteristics and of adapting to different objects and tasks. Based on this representation, a framework is proposed that enables the robot to observe human grasping, learn grasp representations, and infer executable grasping actions