181 research outputs found

    Visual Servoing in Robotics

    Get PDF
    Visual servoing is a well-established approach to guiding robots using visual information. Image processing, robotics, and control theory are combined to control the motion of a robot based on visual information extracted from images captured by one or several cameras. On the vision side, ongoing research addresses issues such as the use of different types of image features (or different types of cameras, such as RGB-D cameras), high-speed image processing, and convergence properties. As shown in this book, new control schemes allow systems to behave more robustly, efficiently, or compliantly, and with fewer delays. Related topics such as optimal and robust approaches, direct control, path tracking, and sensor fusion are also addressed. Visual servoing systems can now be found in a number of different application domains. This book considers various aspects of visual servoing systems, such as the design of new strategies for parallel robots, mobile manipulators, and teleoperation, and the application of this type of control in new areas.
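The classic image-based control law underlying most such schemes computes a camera velocity from the image-feature error, v = -λ L⁺(s − s*). A minimal sketch for point features follows; the gains, depths, and feature coordinates are illustrative, not taken from the book:

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Interaction (image Jacobian) matrix for one normalized image point
    (x, y) at depth Z, mapping the camera twist (vx,vy,vz,wx,wy,wz) to
    the point's image velocity."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def ibvs_velocity(points, desired, depths, lam=0.5):
    """Classic IBVS law: v = -lambda * pinv(L) @ (s - s*)."""
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(points, depths)])
    e = (np.asarray(points) - np.asarray(desired)).ravel()
    return -lam * np.linalg.pinv(L) @ e

# Example: four points, one slightly offset from its desired position
pts = [(0.11, 0.1), (-0.1, 0.1), (-0.1, -0.1), (0.1, -0.1)]
des = [(0.1, 0.1), (-0.1, 0.1), (-0.1, -0.1), (0.1, -0.1)]
v = ibvs_velocity(pts, des, depths=[1.0] * 4)  # 6-DoF camera twist
```

The pseudo-inverse makes the law work with any number of feature points; with four points the stacked interaction matrix is 8×6 and the solution is a least-squares camera twist.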

    Visual Servoing

    Get PDF
    The goal of this book is to introduce vision applications developed by leading researchers and to offer knowledge that can be applied widely in other fields. The book collects major current studies on machine vision from around the world and makes a persuasive case for its applications. Its contents show how machine vision theory is realized in different fields. Beginners will find it easy to follow developments in visual servoing, while engineers, professors, and researchers can study the chapters and then apply the methods in other settings.

    Deep Forward and Inverse Perceptual Models for Tracking and Prediction

    Full text link
    We consider the problems of learning forward models that map state to high-dimensional images and inverse models that map high-dimensional images to state in robotics. Specifically, we present a perceptual model for generating video frames from state with deep networks, and provide a framework for its use in tracking and prediction tasks. We show that our proposed model greatly outperforms standard deconvolutional methods and GANs for image generation, producing clear, photo-realistic images. We also develop a convolutional neural network model for state estimation and compare the result to an Extended Kalman Filter to estimate robot trajectories. We validate all models on a real robotic system. Comment: 8 pages, International Conference on Robotics and Automation (ICRA) 201
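The Extended Kalman Filter used here as a state-estimation baseline follows the standard predict/update cycle. A minimal sketch with a hypothetical 1-D constant-velocity model and position-only measurements (not the paper's robot model):

```python
import numpy as np

def ekf_step(x, P, z, f, F, h, H, Q, R):
    """One predict/update cycle of an Extended Kalman Filter (EKF)."""
    x_pred = f(x)                        # nonlinear state propagation
    P_pred = F @ P @ F.T + Q             # covariance propagation (F = Jacobian of f)
    y = z - h(x_pred)                    # measurement innovation
    S = H @ P_pred @ H.T + R             # innovation covariance (H = Jacobian of h)
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Hypothetical linear constant-velocity model: state = [position, velocity]
A = np.array([[1.0, 1.0], [0.0, 1.0]])   # unit time step
H = np.array([[1.0, 0.0]])                # we only measure position
Q, R = 0.01 * np.eye(2), np.array([[0.1]])

x, P = np.array([0.0, 1.0]), np.eye(2)
for z in ([1.05], [2.1], [2.9]):          # noisy position readings
    x, P = ekf_step(x, P, np.array(z), lambda s: A @ s, A,
                    lambda s: H @ s, H, Q, R)
```

For this linear model the EKF reduces to a plain Kalman filter; the same `ekf_step` accepts genuinely nonlinear `f` and `h` with their Jacobians.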

    High-precision grasping and placing for mobile robots

    Get PDF
    This work presents a manipulation system for multiple labware items in life science laboratories using H20 mobile robots. The H20 robot is equipped with a Kinect V2 sensor to identify the required labware on the workbench and estimate its position. Local-feature recognition based on the SURF algorithm is used; the recognition process is performed both for the labware to be grasped and for the workbench holder. Different grippers and labware containers are designed to manipulate labware of different weights and to realize safe transportation.
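The matching step that SURF-based recognition relies on is nearest-neighbour descriptor matching with Lowe's ratio test, sketched below. The 4-D descriptors are toy stand-ins for real 64-D SURF descriptors; the ratio threshold 0.75 is a common illustrative choice:

```python
import numpy as np

def match_descriptors(query, train, ratio=0.75):
    """Match feature descriptors with Lowe's ratio test: accept a match only
    if the nearest train descriptor is clearly closer than the second nearest."""
    matches = []
    for i, q in enumerate(query):
        d = np.linalg.norm(train - q, axis=1)  # distance to every train descriptor
        j, k = np.argsort(d)[:2]               # nearest and second-nearest
        if d[j] < ratio * d[k]:
            matches.append((i, j))
    return matches

# Toy descriptors: the query is a noisy copy of train descriptor 1
train = np.eye(4)
query = np.array([[0.05, 0.95, 0.0, 0.02]])
matches = match_descriptors(query, train)
```

The ratio test rejects ambiguous matches (two train descriptors almost equally close), which is what makes local-feature recognition robust against repetitive texture.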

    Design and modeling of a stair climber smart mobile robot (MSRox)

    Full text link

    Modified System Design and Implementation of an Intelligent Assistive Robotic Manipulator

    Get PDF
    This thesis presents three improvements to the current UCF MANUS systems. The first improvement modifies the existing fine motion controller into a PI controller that has been optimized to prevent the object from leaving the view of the cameras used for visual servoing. This is achieved by adding a weight matrix to the proportional part of the controller that is constrained by an artificial ROI. When the feature points being used approach the boundaries of the ROI, the optimized controller weights are calculated using quadratic programming and added to the nominal proportional gain of the controller. The second improvement is a compensatory gross motion method designed to ensure that the desired object can be identified. If the object cannot be identified after the initial gross motion, the end-effector is moved to one of three different locations around the object until the object is identified or all possible positions are checked. This framework combines the Kanade-Lucas-Tomasi local tracking method with the ferns global detector/tracker to create a method that utilizes the strengths of both systems to overcome their inherent weaknesses. The last improvement is a particle-filter based tracking algorithm that makes the visual servoing function of fine motion more robust. This method performs better than the current global detector/tracker by allowing the tracker to successfully track the object in complex environments under non-ideal conditions.
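The first improvement can be pictured as a PI law whose proportional action grows as a feature point nears the ROI boundary. The sketch below substitutes a simple distance-based weight for the thesis's QP-optimised weights, so it is illustrative only; all gains and the margin are made-up values:

```python
import numpy as np

def roi_weight(p, roi_min, roi_max, margin=0.1):
    """Extra proportional weight as point p nears the ROI boundary:
    zero well inside the ROI, ramping to 1 within `margin` of an edge.
    (Heuristic stand-in for the QP-optimised weights in the thesis.)"""
    d = np.minimum(p - roi_min, roi_max - p)   # distance to nearest edge, per axis
    return float(np.max(np.clip(1.0 - d / margin, 0.0, 1.0)))

def pi_control(error, integral, p, dt=0.02, kp=1.0, ki=0.1,
               roi_min=np.zeros(2), roi_max=np.ones(2)):
    """PI visual-servoing step whose proportional gain is boosted when the
    tracked image point p approaches the ROI boundary."""
    integral = integral + error * dt
    w = roi_weight(p, roi_min, roi_max)
    u = kp * (1.0 + w) * error + ki * integral
    return u, integral

err = np.array([0.1, 0.0])
u_center, _ = pi_control(err, np.zeros(2), p=np.array([0.5, 0.5]))
u_edge, _ = pi_control(err, np.zeros(2), p=np.array([0.98, 0.5]))
```

With the same feature error, the control effort is larger near the ROI edge, pulling the feature back toward the image centre before it leaves the camera's view.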

    High-Speed Vision and Force Feedback for Motion-Controlled Industrial Manipulators

    Get PDF
    Over the last decades, both force sensors and cameras have emerged as useful sensors for different applications in robotics. This thesis considers a number of dynamic visual tracking and control problems, as well as the integration of these techniques with contact force control. Different topics ranging from basic theory to system implementation and applications are treated. A new interface developed for external sensor control is presented, designed by making non-intrusive extensions to a standard industrial robot control system. The structure of these extensions is presented, the system properties are modeled and experimentally verified, and results from force-controlled stub grinding and deburring experiments are presented. A novel system for force-controlled drilling using a standard industrial robot is also demonstrated. The solution is based on the use of force feedback to control the contact forces and the sliding motions of the pressure foot, which would otherwise occur during the drilling phase. Basic methods for feature-based tracking and servoing are presented, together with an extension for constrained motion estimation based on a dual quaternion pose parametrization. A method for multi-camera real-time rigid body tracking with time constraints is also presented, based on an optimal selection of the measured features. The developed tracking methods are used as the basis for two different approaches to vision/force control, which are illustrated in experiments. Intensity-based techniques for tracking and vision-based control are also developed. A dynamic visual tracking technique based directly on the image intensity measurements is presented, together with new stability-based methods suitable for dynamic tracking and feedback problems. The stability-based methods outperform the previous methods in many situations, as shown in simulations and experiments.
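Intensity-based (direct) tracking of the kind described here minimises a photometric error rather than a feature-point error. A minimal 1-D sketch using Gauss-Newton to estimate a translation; the signals are synthetic, and real trackers work on 2-D images with richer warp models:

```python
import numpy as np

def estimate_shift(template, image, iters=10):
    """Estimate a sub-sample 1-D translation d such that image(x + d)
    matches template(x), by Gauss-Newton on the photometric error."""
    x = np.arange(len(template), dtype=float)
    d = 0.0
    for _ in range(iters):
        warped = np.interp(x + d, x, image)   # image resampled at shifted grid
        grad = np.gradient(warped)            # photometric Jacobian w.r.t. d
        err = template - warped
        denom = np.dot(grad, grad)
        if denom < 1e-12:
            break
        d += np.dot(grad, err) / denom        # Gauss-Newton update
    return d

x = np.arange(100, dtype=float)
template = np.exp(-0.5 * ((x - 50.0) / 5.0) ** 2)  # Gaussian blob at 50
image = np.exp(-0.5 * ((x - 53.0) / 5.0) ** 2)     # same blob shifted by +3
d = estimate_shift(template, image)                # recovers the shift of 3
```

Because every pixel contributes to the error, direct methods need no feature extraction, at the price of a smaller convergence basin; coarse-to-fine pyramids are the usual remedy.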

    Survey of Visual and Force/Tactile Control of Robots for Physical Interaction in Spain

    Get PDF
    Sensors provide robotic systems with the information required to perceive the changes that happen in unstructured environments and to modify their actions accordingly. The robotic controllers which process and analyze this sensory information are usually based on three types of sensors (visual, force/torque, and tactile), which correspond to the most widespread robotic control strategies: visual servoing control, force control, and tactile control. This paper presents a detailed review of the sensor architectures, algorithmic techniques, and applications developed by Spanish researchers to implement these mono-sensor controllers and the multi-sensor controllers which combine several sensors.

    Workshop on "Control issues in the micro/nano-world"

    No full text
    During the last decade, the need for systems with micro/nanometer accuracy and fast dynamics has been growing rapidly. Such systems occur in applications including 1) micromanipulation of biological cells, 2) microassembly of MEMS/MOEMS, 3) micro/nanosensors for environmental monitoring, and 4) nanometer-resolution imaging and metrology (AFM and SEM). The scale and requirements of such systems present a number of challenges to control system design that will be addressed in this workshop. Working in the micro/nano-world involves displacements from nanometers to tens of microns. Because of this precision requirement, environmental conditions such as temperature, humidity, and vibration can generate noise and disturbances in the same range as the displacements of interest. The so-called smart materials, e.g., piezoceramic, magnetostrictive, shape-memory, and electroactive-polymer materials, have been used for actuation or sensing in the micro/nano-world. They allow high-resolution positioning compared to hinge-based systems. However, these materials exhibit hysteresis nonlinearity and, in the case of piezoelectric materials, drift (called creep) in response to constant inputs. In the case of oscillating micro/nano-structures (cantilevers, tubes), these nonlinearities and vibrations strongly degrade performance. Many MEMS and NEMS applications involve gripping, feeding, or sorting operations, where sensor feedback is necessary for their execution. Sensors that are readily available, e.g., interferometers, triangulation lasers, and machine vision, are bulky and expensive. Sensors that are compact and convenient for packaging, e.g., strain gages and piezoceramic charge sensors, have limited performance or robustness.
To account for these difficulties, new control-oriented techniques are emerging, such as the combination of two or more 'packageable' sensors, the use of feedforward control techniques which do not require sensors, and the use of robust controllers which account for the sensor characteristics. The aim of this workshop is to provide a forum for specialists to present an overview of the different approaches to control system design for the micro/nano-world and to initiate collaborations and joint projects.
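The hysteresis of such smart materials is commonly modelled as a weighted superposition of play (backlash) operators, the Prandtl-Ishlinskii model, which a feedforward compensator then inverts. A minimal sketch with illustrative weights and thresholds (not fitted to any real actuator):

```python
def play_operator(u, r, y0=0.0):
    """Backlash (play) operator with threshold r: the output follows the
    input only once the input has moved more than r from the output."""
    y = y0
    out = []
    for ui in u:
        y = max(ui - r, min(ui + r, y))
        out.append(y)
    return out

def pi_hysteresis(u, weights, thresholds):
    """Prandtl-Ishlinskii model: weighted sum of play operators."""
    ops = [play_operator(u, r) for r in thresholds]
    return [sum(w * op[i] for w, op in zip(weights, ops))
            for i in range(len(u))]

# Triangular input sweep 0 -> 1 -> 0: the output traces a hysteresis loop
u = [i / 10 for i in range(11)] + [i / 10 for i in range(9, -1, -1)]
y = pi_hysteresis(u, weights=[0.6, 0.4], thresholds=[0.0, 0.2])
# y does not return to 0 when u does: the residual is the hysteresis memory
```

Because each play operator is invertible, the whole model can be inverted analytically, which is what makes Prandtl-Ishlinskii popular for the sensorless feedforward compensation mentioned above.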