51 research outputs found

    Survey of Visual and Force/Tactile Control of Robots for Physical Interaction in Spain

    Sensors provide robotic systems with the information required to perceive changes in unstructured environments and to modify their actions accordingly. The robotic controllers that process and analyze this sensory information are usually based on three types of sensors (visual, force/torque and tactile), which identify the most widespread robotic control strategies: visual servoing control, force control and tactile control. This paper presents a detailed review of the sensor architectures, algorithmic techniques and applications developed by Spanish researchers to implement these controllers, both mono-sensor and multi-sensor controllers that combine several sensors.
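    For context on the first of these strategies, the sketch below shows the classical image-based visual servoing (IBVS) law v = -λ L⁺(s - s*) for point features, which underlies many of the visual controllers such a survey reviews. This is a generic textbook formulation, not code from the survey; the feature coordinates, depths and gain are illustrative.

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Standard interaction matrix of a normalized image point (x, y) at depth Z."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def ibvs_velocity(points, desired, depths, gain=0.5):
    """Camera velocity screw (vx, vy, vz, wx, wy, wz) from the classical
    IBVS law v = -gain * L^+ * (s - s*)."""
    L = np.vstack([interaction_matrix(x, y, Z) for (x, y), Z in zip(points, depths)])
    error = (np.asarray(points) - np.asarray(desired)).ravel()
    return -gain * np.linalg.pinv(L) @ error

# Example: four point features slightly off their desired positions.
current = [(0.12, 0.05), (-0.10, 0.07), (0.09, -0.11), (-0.08, -0.09)]
target = [(0.10, 0.05), (-0.10, 0.05), (0.10, -0.10), (-0.10, -0.10)]
print(ibvs_velocity(current, target, depths=[1.0] * 4))
```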

    Preliminary variation on multiview geometry for vision-guided laser surgery.

    This paper proposes to use multiview geometry to control an orientable laser beam for surgery. Two methods are proposed, based on the analogy between a scanning laser beam and a camera: the first uses one camera and the laser scanner as a virtual camera to form a virtual stereoscopic system, while the second uses two cameras to form a virtual trifocal system. Using the associated epipolar or trifocal geometry, two control laws are derived without any matrix inversion or estimation of the 3D scene. It is shown that the more geometry is used, the simpler the control becomes. As expected, these control laws show exponential convergence in simulation validation.
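    The control laws themselves are not reproduced in the abstract. As a rough illustration of the epipolar ingredient only, the sketch below computes the signed distance of an observed laser spot to the epipolar line induced by a target point through a fundamental matrix F relating the camera and the scanner treated as a virtual camera. The function names and the simple proportional coupling are hypothetical and are not the paper's actual control law.

```python
import numpy as np

def epipolar_error(F, x_ref, x_obs):
    """Signed distance (pixels) of the observed laser spot x_obs to the
    epipolar line induced by the target point x_ref, under the convention
    x_obs^T F x_ref = 0 (line in the observation image is F @ x_ref)."""
    x_ref_h = np.array([x_ref[0], x_ref[1], 1.0])
    x_obs_h = np.array([x_obs[0], x_obs[1], 1.0])
    line = F @ x_ref_h                                  # a*u + b*v + c = 0
    return (x_obs_h @ line) / np.hypot(line[0], line[1])

def scanner_command(F, target_px, spot_px, gain=0.8):
    """Hypothetical 1-DOF proportional command driving the epipolar error to zero."""
    return -gain * epipolar_error(F, target_px, spot_px)
```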

    A visual servoing path-planning strategy for cameras obeying the unified model

    Recently, a unified camera model has been introduced in visual control systems in order to describe conventional perspective cameras, fisheye cameras, and catadioptric systems through a single mathematical model. In this paper, a path-planning strategy for visual servoing is proposed for any camera obeying this unified model. The proposed strategy is based on the projection of the available image projections onto a virtual plane. This has two benefits. First, it allows one to perform camera pose estimation and 3D object reconstruction using methods for conventional cameras that are not valid for other cameras. Second, it allows one to perform image path-planning for multi-constraint satisfaction using a simplified but equivalent projection model, which in this paper is addressed by introducing polynomial parametrizations of the rotation and translation. The planned image trajectory is then tracked by an IBVS controller. The proposed strategy is validated through simulations with image noise and calibration errors typical of real experiments. It is worth remarking that visual servoing path-planning for non-conventional cameras has not yet been proposed in the literature. © 2010 IEEE. Part of the 2010 IEEE Multi-Conference on Systems and Control; published in Proceedings of the 2010 IEEE International Symposium on Computer-Aided Control System Design (CACSD), Yokohama, Japan, 8-10 September 2010, p. 1795-180.
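    For reference, here is a minimal sketch of the unified (sphere) projection model the strategy relies on: a 3D point is first projected onto the unit sphere and then perspectively from a point at distance ξ above the sphere centre; ξ = 0 recovers a conventional perspective camera. The intrinsic matrix and example values are illustrative.

```python
import numpy as np

def unified_projection(P, xi, K):
    """Project a 3-D point P (camera frame) with the unified camera model:
    onto the unit sphere, then perspectively from a point at distance xi
    above the sphere centre, then through the intrinsic matrix K."""
    Ps = P / np.linalg.norm(P)                   # point on the unit sphere
    m = np.array([Ps[0], Ps[1], Ps[2] + xi])     # shifted perspective projection
    m = m / m[2]
    return (K @ m)[:2]

# Illustrative intrinsics and mirror parameter.
K = np.array([[400.0, 0.0, 320.0],
              [0.0, 400.0, 240.0],
              [0.0, 0.0, 1.0]])
print(unified_projection(np.array([0.2, -0.1, 1.5]), xi=0.8, K=K))
```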

    2 1/2 D Visual servoing with respect to unknown objects through a new estimation scheme of camera displacement

    Classical visual servoing techniques need strong a priori knowledge of the shape and dimensions of the observed objects. In this paper, we present how the 2 1/2 D visual servoing scheme we have recently developed can be used with unknown objects characterized by a set of points. Our scheme is based on the estimation of the camera displacement from two views, given by the current and desired images. Since vision-based robotics tasks generally need to be performed at video rate, we focus only on linear algorithms. Classical linear methods are based on the computation of the essential matrix. In this paper, we propose a different method, based on the estimation of the homography matrix related to a virtual plane attached to the object. We show that our method provides a more stable estimation when the epipolar geometry degenerates. This is particularly important in visual servoing to obtain a stable control law, especially near the convergence of the system. Finally, experimental results confirm the improvement in the stability, robustness, and behaviour of our scheme with respect to classical methods. Keywords: visual servoing, projective geometry, homography
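    A minimal sketch of the homography-estimation step on which such a scheme can rest: a direct linear transform (DLT) from at least four point correspondences related by the virtual plane. This is the textbook DLT, not the authors' estimator; in practice, Hartley normalization of the points would improve conditioning.

```python
import numpy as np

def homography_dlt(src, dst):
    """Estimate the homography H (dst ~ H @ src, homogeneous) from >= 4
    point correspondences with the direct linear transform."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    H = Vt[-1].reshape(3, 3)                     # right singular vector of smallest value
    return H / H[2, 2]

# Example with four illustrative correspondences.
src = [(0, 0), (1, 0), (1, 1), (0, 1)]
dst = [(10, 12), (110, 15), (108, 118), (8, 115)]
print(homography_dlt(src, dst))
```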

    Simulation of Visual Servoing in Grasping Objects Moving by Newtonian Dynamics

    Robot control systems and other manufacturing equipment are traditionally closed systems. This circumstance has hampered system integration of manipulators, sensors and other equipment, and such integration has often been made at an unsuitably high hierarchical level. With the aid of vision, visual feedback is used to guide the robot manipulator to the target. This hand-to-target task is fairly easy if the target is static in Cartesian space. However, if the target is moving, a model of its dynamic behaviour is required for the robot to track and intercept it. The purpose of this project is to show, through simulation in a virtual environment, how to organise robot control systems with sensor integration. The simulation involves catching a thrown virtual ball using a six degree-of-freedom virtual robot and two virtual digital cameras. Tasks executed in this project include placement of the virtual digital cameras, segmentation and tracking of the moving virtual ball, and model-based prediction of the ball's trajectory. Consideration has to be given to the placement of the virtual digital cameras so that the whole trajectory of the ball can be captured by both cameras simultaneously. In order to track the trajectory of the virtual ball, the image of the ball captured by the cameras has to be segmented from its background. A model is then developed to predict the trajectory of the virtual ball so that the virtual robot can be controlled to align itself to grasp it.
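    A minimal sketch of the model-based prediction step, assuming the triangulated ball positions are fitted with a constant-acceleration (ballistic) model by least squares; the variable names and sample values are illustrative and not taken from the project.

```python
import numpy as np

def fit_ballistic(times, positions):
    """Least-squares fit of p(t) = p0 + v0*t + 0.5*a*t^2 to observed
    3-D ball positions (N x 3); returns (p0, v0, a)."""
    t = np.asarray(times, float)
    A = np.column_stack([np.ones_like(t), t, 0.5 * t * t])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(positions, float), rcond=None)
    return coeffs[0], coeffs[1], coeffs[2]

def predict(p0, v0, a, t):
    """Predicted ball position at time t under the fitted model."""
    return p0 + v0 * t + 0.5 * a * t * t

# Example: noisy samples of a throw, then predict the position at t = 0.8 s.
t = np.linspace(0.0, 0.4, 9)
truth = np.array([0.0, 2.0, 1.0]) + np.outer(t, [1.5, -3.0, 4.0]) \
        + 0.5 * np.outer(t * t, [0.0, 0.0, -9.81])
p0, v0, a = fit_ballistic(t, truth + 0.002 * np.random.randn(*truth.shape))
print(predict(p0, v0, a, 0.8))
```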

    Development of Sensory-Motor Fusion-Based Manipulation and Grasping Control for a Robotic Hand-Eye System

