
    Visual estimation of articulated objects configuration during manipulation with a humanoid

    Robotic manipulation tasks require online knowledge of the manipulated objects' configuration. We therefore need to estimate, online, the state of (articulated) objects that are not equipped with positioning sensors. This estimated state, expressed with respect to the robot control frame, is required by our controller to update the model and close the loop. Indeed, the controller treats the models of the (articulated) objects as additional 'robots', so that it computes the motion and contact interaction forces of the augmented 'robots-objects' system that satisfy all limit constraints together with the physics. Because of the uncertainties induced by the floating-base nature of humanoids, we address the problem of estimating the configuration of articulated objects with a virtual visual servoing-based approach. Experimental results obtained with the humanoid robot HRP-4 manipulating the paper drawer of a printer show the effectiveness of the approach.
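    The core idea of virtual visual servoing, as described above, is to iteratively move a virtual model of the object until its projected features match the features observed in the camera image. A minimal sketch of that loop for a single-DOF sliding drawer follows; the axis, model points, unit-focal pinhole model, and Gauss-Newton update are illustrative assumptions, not the paper's actual implementation.

    ```python
    import numpy as np

    # Hypothetical setup: a printer drawer slides along a known axis in the
    # camera frame; its configuration is one scalar q (opening length).
    AXIS = np.array([0.0, 0.0, 1.0])           # assumed sliding axis
    P0 = np.array([[0.1, 0.0, 1.0],            # 3D model points at q = 0,
                   [-0.1, 0.0, 1.0],           # expressed in the camera frame
                   [0.0, 0.1, 1.2]])           # (illustrative values)

    def project(points):
        """Pinhole projection, unit focal length (normalized image coords)."""
        return points[:, :2] / points[:, 2:3]

    def features(q):
        """Stacked image features of the model points at configuration q."""
        return project(P0 + q * AXIS).ravel()

    def estimate_q(observed, q=0.0, gain=0.5, iters=50):
        """Virtual visual servoing on q: drive the feature error to zero."""
        for _ in range(iters):
            e = features(q) - observed                  # feature error
            # Jacobian of the features w.r.t. q, by finite differences
            J = (features(q + 1e-6) - features(q)) / 1e-6
            q -= gain * np.dot(J, e) / np.dot(J, J)     # pseudo-inverse step
        return q

    true_q = 0.15
    obs = features(true_q)            # synthetic, noise-free observation
    q_hat = estimate_q(obs)
    ```

    In the paper's setting the observation would come from the robot's camera and the state would include the full object pose, but the structure of the loop (project the virtual model, measure the feature error, apply a damped pseudo-inverse update) is the same.
    
    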