FPGA-Based Dynamic Visual Control of a 6-Degree-of-Freedom Robot Manipulator
[Abstract] This article describes the formulation, implementation, and experimental evaluation of a dynamic visual control system applied to a 6-degree-of-freedom robot. An FPGA-based hardware architecture is proposed for implementing the controllers. To limit controller latency, not only the controller but also the capture and processing of the visual information are implemented on the FPGA, exploiting the FPGA's parallel-processing capabilities to optimize the different components of the proposed visual control system. The article concludes with the experimental results obtained in positioning tasks with a 6-degree-of-freedom Mitsubishi PA10 robot. This work was funded by the Spanish Ministry of Economy and Competitiveness through project DPI2015-68087-R. https://doi.org/10.17979/spudc.978849749808
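The abstract above describes a visual servoing controller whose feedback loop runs on an FPGA. The core control law in image-based visual servoing, which this kind of system typically implements, maps the image-feature error to a commanded camera velocity through the interaction matrix. A minimal sketch, assuming the classic proportional law v = -λ L⁺ (s - s*) (the specific controller formulation in the paper is not given here):

```python
import numpy as np

def ibvs_velocity(s, s_star, L, lam=0.5):
    """Classic image-based visual servoing law: return the camera
    velocity that drives the feature error e = s - s* to zero at
    an exponential rate lam, using the pseudoinverse of the
    interaction matrix L."""
    e = s - s_star
    return -lam * np.linalg.pinv(L) @ e
```

On an FPGA, the pseudoinverse and the matrix-vector product would be pipelined fixed-point operations; the point of implementing feature extraction on-chip as well, as the paper does, is that the whole loop then runs at frame rate with deterministic latency.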
More Than a Feeling: Learning to Grasp and Regrasp using Vision and Touch
For humans, the process of grasping an object relies heavily on rich tactile feedback. Most recent robotic grasping work, however, has been based only on visual input, and thus cannot easily benefit from feedback after initiating contact. In this paper, we investigate how a robot can learn to use tactile information to iteratively and efficiently adjust its grasp. To this end, we propose an end-to-end action-conditional model that learns regrasping policies from raw visuo-tactile data. This model, a deep, multimodal convolutional network, predicts the outcome of a candidate grasp adjustment, and then executes a grasp by iteratively selecting the most promising actions. Our approach requires neither calibration of the tactile sensors nor any analytical modeling of contact forces, thus reducing the engineering effort required to obtain efficient grasping policies. We train our model with data from about 6,450 grasping trials on a two-finger gripper equipped with GelSight high-resolution tactile sensors on each finger. Across extensive experiments, our approach outperforms a variety of baselines at (i) estimating grasp adjustment outcomes, (ii) selecting efficient grasp adjustments for quick grasping, and (iii) reducing the amount of force applied at the fingers, while maintaining competitive performance. Finally, we study the choices made by our model and show that it has successfully acquired useful and interpretable grasping behaviors.
Comment: 8 pages. Published in IEEE Robotics and Automation Letters (RAL).
Website: https://sites.google.com/view/more-than-a-feelin
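The decision step described in the abstract, scoring each candidate grasp adjustment with the learned outcome predictor and greedily executing the best one, can be sketched as follows. The predictor below is a toy stand-in for the paper's visuo-tactile CNN, and the function names are hypothetical:

```python
import numpy as np

def predict_success(obs, action):
    """Toy stand-in for the learned action-conditional outcome model:
    scores a candidate adjustment given the current observation.
    (The real model is a deep multimodal CNN over vision + touch.)"""
    return float(1.0 / (1.0 + np.exp(-(obs + action).sum())))

def select_best_adjustment(obs, candidate_actions):
    """Greedy step of the regrasping scheme: score every candidate
    grasp adjustment and return the most promising one with its
    predicted success probability."""
    scores = [predict_success(obs, a) for a in candidate_actions]
    best = int(np.argmax(scores))
    return candidate_actions[best], scores[best]
```

In the paper's setting this step is repeated: after each executed adjustment, new tactile and visual observations are taken and the candidates are re-scored, which is what makes the policy iterative rather than open-loop.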
Manipulation of unknown objects to improve the grasp quality using tactile information
This work presents a novel and simple approach in the area of manipulation of unknown objects that considers both the geometric and mechanical constraints of the robotic hand. Starting with an initial blind grasp, our method improves the grasp quality through manipulation, addressing the three common goals of the manipulation process, improving the hand configuration, the grasp quality, and the object positioning, while at the same time preventing the object from falling. Tactile feedback is used to obtain local information about the contacts between the fingertips and the object, and no additional exteroceptive feedback sources are considered in the approach. The main novelty of this work lies in the fact that the grasp optimization is performed on-line as a reactive procedure using the tactile and kinematic information obtained during the manipulation. Experimental results are shown to illustrate the efficiency of the approach.
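The reactive, tactile-driven optimization described above can be illustrated with a deliberately simplified update rule. The sketch below is a hypothetical illustration, not the paper's actual quality measure: it nudges each fingertip's normal force toward the mean so that no finger over-squeezes, while preserving the total squeezing force that keeps the object from falling:

```python
import numpy as np

def reactive_rebalance(contact_forces, gain=0.1):
    """One reactive update step (illustrative only): move each
    fingertip normal force toward the mean force. The sum of the
    forces is preserved, so the object stays held while the load
    is redistributed across the fingers."""
    f = np.asarray(contact_forces, dtype=float)
    return f + gain * (f.mean() - f)
```

Run in a loop against fresh tactile readings, a rule of this shape is "reactive" in the paper's sense: each step uses only the current local contact information, with no global model of the unknown object.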
Tactile Sensing and Control of Robotic Manipulator Integrating Fiber Bragg Grating Strain-Sensor
Tactile sensing is an instrumental modality of robotic manipulation, as it provides information that is not accessible via remote sensors such as cameras or lidars. Touch is particularly crucial in unstructured environments, where the robot's internal representation of manipulated objects is uncertain. In this study we present the sensorization of an existing artificial hand, with the aim of achieving fine control of robotic limbs and perception of objects' physical properties. Tactile feedback is conveyed by means of a soft sensor integrated at the fingertip of a robotic hand. The sensor consists of an optical fiber, housing Fiber Bragg Grating (FBG) transducers, embedded into a soft polymeric material integrated on a rigid hand. Through several tasks involving grasps of different objects in various conditions, the ability of the system to acquire information is assessed. Results show that a classifier based on the sensor outputs of the robotic hand is capable of accurately detecting both the size and the rigidity of the operated objects (99.36 and 100% accuracy, respectively). Furthermore, the outputs provide evidence of the ability to grab fragile objects without breakage or slippage, and to perform dynamic manipulative tasks that involve adapting the finger positions based on the grasped object's condition.
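The size/rigidity classification reported above maps FBG strain features to object classes. As an illustration of the idea only (the paper does not specify the classifier here, and this minimal nearest-centroid model is an assumption), the pipeline can be sketched as:

```python
import numpy as np

class NearestCentroid:
    """Minimal stand-in classifier: assigns an object label
    (e.g. a size or rigidity class) to an FBG strain-feature
    vector by finding the nearest class centroid."""

    def fit(self, X, y):
        self.labels_ = sorted(set(y))
        self.centroids_ = {
            c: np.mean([x for x, t in zip(X, y) if t == c], axis=0)
            for c in self.labels_
        }
        return self

    def predict(self, x):
        x = np.asarray(x, dtype=float)
        return min(self.labels_,
                   key=lambda c: np.linalg.norm(x - self.centroids_[c]))
```

Each training sample would be a vector of Bragg-wavelength shifts recorded during a grasp; separability of soft vs. rigid objects in that feature space is what the reported 99.36% and 100% accuracies reflect.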
Control framework for dexterous manipulation using dynamic visual servoing and tactile sensors’ feedback
Tactile sensors play an important role in robotic manipulation for performing dexterous and complex tasks. This paper presents a novel control framework for dexterous manipulation with multi-fingered robotic hands using feedback data from tactile and visual sensors. The framework permits the definition of new visual controllers that allow path tracking of the object motion, taking into account both the dynamics model of the robot hand and the grasping force at the fingertips under a hybrid control scheme. In addition, the proposed general method employs optimal control to obtain the desired behaviour in the joint space of the fingers, based on a specified cost function that determines how the control effort is distributed over the joints of the robotic hand. Finally, the authors show experimental verifications on a real robotic manipulation system for some of the controllers derived from the control framework. This work was funded by the Spanish Ministry of Economy, the European FEDER funds, and the Valencia Regional Government, through the research projects DPI2012-32390 and PROMETEO/2013/085.
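The role of the cost function described above, deciding how control effort is spread over the finger joints, is well illustrated by weighted least-squares redundancy resolution. The sketch below is an illustrative instance of that idea, not the paper's exact control law: among all joint velocities that realize a task velocity ẋ = J q̇, it returns the one minimizing q̇ᵀW q̇, so the weight matrix W penalizes effort at chosen joints:

```python
import numpy as np

def weighted_joint_command(J, xdot, W):
    """Weighted least-squares resolution: return the joint velocity
    qdot satisfying J @ qdot = xdot that minimizes the quadratic
    cost qdot^T W qdot. Larger diagonal entries of W shift the
    control effort away from the corresponding joints."""
    Winv = np.linalg.inv(W)
    # Weighted pseudoinverse solution: qdot = W^-1 J^T (J W^-1 J^T)^-1 xdot
    return Winv @ J.T @ np.linalg.solve(J @ Winv @ J.T, xdot)
```

For example, with two joints and W = diag(1, 4), the same fingertip motion is achieved mostly by the cheaper first joint, which is exactly the kind of effort-distribution behaviour the abstract attributes to its cost function.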