15 research outputs found

    Visual Tactile Sensor Based Force Estimation for Position-Force Teleoperation

    Vision-based tactile sensors have gained extensive attention in the robotics community. These sensors are expected to extract contact information, i.e., haptic information, during in-hand manipulation, which makes them a natural match for haptic feedback applications. In this paper, we propose a contact force estimation method using the vision-based tactile sensor DIGIT and apply it to a position-force teleoperation architecture for force feedback. Force estimation is performed by building a depth map that measures the deformation of the DIGIT gel surface, then fitting a regression model to the estimated depth data and ground-truth force data to obtain the depth-force relationship. The experiment is performed on a grasping force feedback system with a haptic device as the leader robot and a parallel robot gripper as the follower robot, where the DIGIT sensor is attached to the gripper fingertip to estimate the contact force. Preliminary results show that the low-cost vision-based sensor is capable of supporting force feedback applications. Comment: IEEE CBS 202
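    To make the depth-force regression step concrete, here is a minimal sketch in Python. Everything in it is an assumption layered on the abstract, not the paper's implementation: depth maps are taken to be HxW NumPy arrays, gel deformation is summarized as summed depth over the contact region, and a simple polynomial regression via scikit-learn stands in for whatever regression algorithm the paper actually uses.

```python
# Hypothetical sketch of a depth-to-force regression pipeline for a
# DIGIT-style vision-based tactile sensor. Feature choice and model
# family are assumptions, not the paper's method.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
from sklearn.pipeline import make_pipeline

def deformation_feature(depth_map: np.ndarray, threshold: float = 1e-4) -> float:
    """Summarize a gel-surface depth map as one scalar deformation measure."""
    contact = depth_map[depth_map > threshold]  # pixels judged in contact
    return float(contact.sum())

def fit_depth_force_model(depth_maps, forces, degree: int = 2):
    """Fit a polynomial regression from deformation features to force.

    depth_maps: iterable of HxW arrays; forces: ground-truth readings (N).
    """
    X = np.array([[deformation_feature(d)] for d in depth_maps])
    y = np.asarray(forces)
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(X, y)
    return model

def estimate_force(model, depth_map: np.ndarray) -> float:
    """Predict contact force for a new depth map."""
    return float(model.predict([[deformation_feature(depth_map)]])[0])
```

    At teleoperation time, the estimated force would then be fed back to the leader-side haptic device at each control step, with the regression model acting as a cheap calibrated force sensor.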

    Of Priors and Particles: Structured and Distributed Approaches to Robot Perception and Control

    Applications of robotic systems have expanded significantly in scope, moving beyond the caged predictability of industrial automation toward more open, unstructured environments. These agents must learn to perceive their surroundings reliably, integrate new information efficiently, and adapt quickly to dynamic perturbations. To accomplish this, we require solutions that can incorporate prior knowledge effectively while maintaining the generality of learned representations. These systems must also contend with uncertainty, both in their perception of the world and in predicting possible future outcomes. Efficient methods for probabilistic inference are therefore key to realizing robust, adaptive behavior. This thesis will first examine data-driven approaches for learning and combining perceptual models for visual and tactile sensing, two modalities common in robotics. Modern variational inference methods will then be examined in the context of online optimization and stochastic optimal control. Specifically, this thesis will contribute (1) data-driven visual and tactile perceptual models leveraging kinematic and dynamic priors, (2) a framework for joint inference with visuo-tactile sensing, (3) a family of particle-based, variational model predictive control and planning algorithms, and (4) a distributed inference scheme for online model adaptation. Ph.D. thesis.
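    A generic particle-based model predictive control loop, in the spirit of contribution (3), looks roughly like the sketch below. The dynamics model, cost function, and softmin (MPPI-style) weighting are all stand-in assumptions rather than the thesis's algorithms: candidate action sequences act as particles, each is rolled out and scored, and the particles are combined by cost-weighted averaging into an updated nominal plan.

```python
# Generic particle-based MPC sketch (MPPI-style weighting assumed).
# Not the thesis's algorithm; dynamics and cost are user-supplied callables.
import numpy as np

def particle_mpc(x0, dynamics, cost, horizon=10, n_particles=64,
                 n_iters=5, action_dim=2, noise_std=0.5, temperature=1.0,
                 rng=None):
    """Return the first action of an optimized action sequence.

    dynamics(x, a) -> next state; cost(x, a) -> scalar stage cost.
    """
    if rng is None:
        rng = np.random.default_rng()
    mean = np.zeros((horizon, action_dim))  # nominal action sequence
    for _ in range(n_iters):
        # Particles: noisy perturbations of the nominal plan.
        noise = rng.normal(0.0, noise_std, (n_particles, horizon, action_dim))
        particles = mean + noise
        # Roll each particle out through the (assumed) dynamics model.
        costs = np.zeros(n_particles)
        for i, actions in enumerate(particles):
            x = x0
            for a in actions:
                x = dynamics(x, a)
                costs[i] += cost(x, a)
        # Exponentiated-cost (softmin) weights, then a weighted average.
        w = np.exp(-(costs - costs.min()) / temperature)
        w /= w.sum()
        mean = np.einsum("i,ihd->hd", w, particles)
    return mean[0]  # execute the first action, then replan
```

    In a receding-horizon deployment, the controller executes only that first action, observes the new state, and replans, which is what lets such particle-based schemes absorb dynamic perturbations online.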