5 research outputs found

    Hand-Gesture Based Programming of Industrial Robot Manipulators

    Nowadays, industrial robot manipulators and manufacturing processes are associated as never before. Robot manipulators execute repetitive tasks with increased accuracy and speed, features necessary for industries that need to manufacture products in large quantities while reducing production time. Although robot manipulators play a significant role in enhancing productivity within industries, programming them remains an important drawback. Traditional programming methodologies require robot programming experts and are time consuming. This thesis work aims to develop an application for programming industrial robot manipulators that removes the need for traditional programming methodologies by exploiting the intuitiveness of human hand gestures. The development of input devices for intuitive Human-Machine Interaction provides the possibility to capture such gestures. Hence, robot manipulator programming experts can be replaced by task experts. In addition, the integration of intuitive means of interaction can also reduce the time required for programming. The components used to capture the operators’ hand gestures are a data glove and a precise hand-tracking device. The robot manipulator imitates, in terms of position, the motion that the human operator performs with the hand. Inverse kinematics is applied so that robot manipulators can be programmed independently of their structure and manufacturer, and the possibility of optimizing the programmed robot paths is investigated. Finally, a Human-Machine Interface contributes to the programming process by presenting important information about the process and the status of the integrated components.
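
    The abstract describes driving the robot from tracked hand positions through inverse kinematics. As a purely illustrative, hedged sketch (not the thesis implementation; link lengths, gains and the tracked hand position are made-up placeholders), the following Python snippet shows the general idea of steering a planar two-link arm towards a tracked hand position with damped least-squares inverse kinematics:

    ```python
    # Minimal sketch (not the thesis implementation): map a tracked hand position
    # to joint angles of a planar 2-link arm using damped least-squares IK.
    import numpy as np

    L1, L2 = 0.4, 0.3  # assumed link lengths [m]

    def forward(q):
        """End-effector position for joint angles q = [q1, q2]."""
        return np.array([L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
                         L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])])

    def jacobian(q):
        """2x2 position Jacobian of the planar arm."""
        s1, c1 = np.sin(q[0]), np.cos(q[0])
        s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
        return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                         [ L1 * c1 + L2 * c12,  L2 * c12]])

    def solve_ik(target, q0, iters=100, damping=0.01):
        """Iterate damped least-squares updates towards the tracked position."""
        q = np.array(q0, dtype=float)
        for _ in range(iters):
            err = target - forward(q)
            if np.linalg.norm(err) < 1e-4:
                break
            J = jacobian(q)
            q += J.T @ np.linalg.solve(J @ J.T + damping * np.eye(2), err)
        return q

    # Hypothetical hand position from the tracking device, already expressed
    # in the robot base frame.
    hand_xy = np.array([0.35, 0.25])
    print(solve_ik(hand_xy, q0=[0.1, 0.1]))
    ```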

    Multi-modal interface for offline robot programming

    This thesis presents an approach for improving robot offline programming using input methods based on natural human skills. The approach focuses on teaching basic assembly and manipulation operations using a pair of industrial robots in an existing simulation environment, and it is meant to be improved in future works, which are also proposed in this thesis. To develop this approach, and considering the available resources, an Add-In for the simulation and offline programming software RobotStudio was developed. This Add-In combines human pose, a graphical user interface and, optionally, speech to teach the robot a sequence of targets and, together with the simulation environment, to automatically generate instructions. Two different kinds of sensors, the Kinect and the Leap Motion sensor, have been evaluated based on references in order to select the most suitable one for the implementation of this work. The execution of the programmed instructions has been evaluated in simulation. Máster Universitario en Ingeniería Industrial (M141
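
    The abstract describes teaching a sequence of targets and then automatically generating robot instructions from them. As a hedged, generic sketch (it does not use the actual RobotStudio SDK or the thesis Add-In; the target names, poses, speed/zone data and tool are invented placeholders), the following Python snippet turns a list of taught Cartesian targets into RAPID-like MoveL statements:

    ```python
    # Illustrative sketch only: emit RAPID-like MoveL statements from taught
    # targets. Not the thesis Add-In; all values below are placeholders.
    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class Target:
        name: str
        xyz: Tuple[float, float, float]          # position in mm
        quat: Tuple[float, float, float, float]  # orientation quaternion

    def to_robtarget(t: Target) -> str:
        """Format a taught pose as a RAPID-style robtarget declaration."""
        x, y, z = t.xyz
        q1, q2, q3, q4 = t.quat
        return (f"CONST robtarget {t.name} := "
                f"[[{x:.1f},{y:.1f},{z:.1f}],[{q1},{q2},{q3},{q4}],"
                f"[0,0,0,0],[9E9,9E9,9E9,9E9,9E9,9E9]];")

    def generate_program(targets: List[Target]) -> str:
        """Declare all taught targets and move through them in sequence."""
        decls = "\n".join(to_robtarget(t) for t in targets)
        moves = "\n".join(f"    MoveL {t.name}, v200, z10, tool0;" for t in targets)
        return f"{decls}\n\nPROC TaughtPath()\n{moves}\nENDPROC"

    taught = [Target("pPick",  (400.0, 0.0, 300.0),   (0, 0, 1, 0)),
              Target("pPlace", (400.0, 250.0, 300.0), (0, 0, 1, 0))]
    print(generate_program(taught))
    ```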

    Mobile Robots Navigation

    Mobile robot navigation includes different interrelated activities: (i) perception, as obtaining and interpreting sensory information; (ii) exploration, as the strategy that guides the robot to select the next direction to go; (iii) mapping, involving the construction of a spatial representation by using the sensory information perceived; (iv) localization, as the strategy to estimate the robot position within the spatial map; (v) path planning, as the strategy to find a path towards a goal location, optimal or not; and (vi) path execution, where motor actions are determined and adapted to environmental changes. The book addresses those activities by integrating results from the research work of several authors all over the world. Research cases are documented in 32 chapters organized into 7 categories, described next.
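
    As a tiny, generic illustration of activity (v) above (not taken from the book; the occupancy grid, start and goal are arbitrary), the sketch below finds a shortest 4-connected path on a grid map with breadth-first search:

    ```python
    # Generic illustration of grid-based path planning: breadth-first search
    # on a small occupancy grid (0 = free, 1 = occupied). Map is arbitrary.
    from collections import deque

    def plan_path(grid, start, goal):
        """Return a shortest 4-connected path from start to goal, or None."""
        rows, cols = len(grid), len(grid[0])
        parent = {start: None}
        queue = deque([start])
        while queue:
            cell = queue.popleft()
            if cell == goal:
                path = []
                while cell is not None:
                    path.append(cell)
                    cell = parent[cell]
                return path[::-1]
            r, c = cell
            for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if (0 <= nr < rows and 0 <= nc < cols
                        and grid[nr][nc] == 0 and (nr, nc) not in parent):
                    parent[(nr, nc)] = cell
                    queue.append((nr, nc))
        return None  # goal unreachable

    occupancy = [[0, 0, 0, 0],
                 [1, 1, 0, 1],
                 [0, 0, 0, 0]]
    print(plan_path(occupancy, start=(0, 0), goal=(2, 0)))
    ```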

    Robot Competition Using Gesture Based Interface

    No full text