
    Spatial Programming for Industrial Robots through Task Demonstration

    We present an intuitive system for programming industrial robots by demonstration, using markerless gesture recognition and mobile augmented reality. The approach covers gesture-based task definition and adaptation by human demonstration, as well as task evaluation through augmented reality. A 3D motion tracking system and a handheld device form the basis of the presented spatial programming system. In this publication, we present a prototype for programming an assembly sequence consisting of several pick-and-place tasks. A scene reconstruction provides pose estimation of known objects using the handheld's 2D camera, so the programmer can define the program through natural bare-hand manipulation of these objects, with direct visual feedback in the augmented reality application. The program can be adapted by gestures and subsequently transmitted to an arbitrary industrial robot controller through a unified interface. Finally, we discuss an application of the presented spatial programming approach to robot-based welding tasks.
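    The abstract does not specify the unified controller interface; as one illustration of the idea, a demonstrated assembly sequence could be captured as controller-agnostic pick-and-place records and flattened into generic motion commands that a backend for a specific robot controller would translate. All names and the command vocabulary below are our assumptions, not the paper's:

    ```python
    from dataclasses import dataclass
    from typing import List, Tuple

    # Hypothetical pose convention: x, y, z in metres, then roll, pitch, yaw in radians.
    Pose = Tuple[float, float, float, float, float, float]

    @dataclass
    class PickPlaceTask:
        """One demonstrated step: grasp an object at pick_pose, release it at place_pose."""
        object_id: str
        pick_pose: Pose
        place_pose: Pose

    def to_robot_program(tasks: List[PickPlaceTask]) -> List[str]:
        """Flatten demonstrated tasks into generic commands that a
        controller-specific backend could translate."""
        program = []
        for t in tasks:
            program.append(f"MOVE {t.pick_pose}")
            program.append(f"GRASP {t.object_id}")
            program.append(f"MOVE {t.place_pose}")
            program.append("RELEASE")
        return program

    # A one-step demonstration: move a bolt between two poses on the table.
    demo = [PickPlaceTask("bolt_1",
                          (0.4, 0.1, 0.02, 0.0, 3.14, 0.0),
                          (0.6, -0.2, 0.05, 0.0, 3.14, 0.0))]
    print(to_robot_program(demo))
    ```

    Keeping the demonstrated tasks in a neutral intermediate form is what would let the same program be retargeted to different controllers, as the abstract's "unified interface" suggests.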

    Organizational concepts and interaction between humans and robots in industrial environments

    This paper discusses intuitive interaction with robotic systems and its conceptualisation in relation to known organisational problems. In particular, the focus is on the manufacturing industry with respect to its social dimension. One aim is to identify relevant research questions about developing safer robot systems for closer, intuitive human-machine interaction at the manufacturing shop-floor level. We aim to help minimise the cognitive and perceptual workload of robot operators in complex working systems. This will be especially relevant as robots with different roles, produced by different companies or designers, come into wider use in the manufacturing industry. The social sciences approach to such technology assessment is highly relevant for understanding the dimensions of the intuitive interaction concept.

    Augmented Reality and Robotics: A Survey and Taxonomy for AR-enhanced Human-Robot Interaction and Robotic Interfaces

    This paper contributes a taxonomy of augmented reality and robotics based on a survey of 460 research papers. Augmented and mixed reality (AR/MR) have emerged as a new way to enhance human-robot interaction (HRI) and robotic interfaces (e.g., actuated and shape-changing interfaces). Recently, an increasing number of studies in HCI, HRI, and robotics have demonstrated how AR enables better interactions between people and robots. However, research often remains focused on individual explorations, and key design strategies and research questions are rarely analyzed systematically. In this paper, we synthesize and categorize this research field along the following dimensions: 1) approaches to augmenting reality; 2) characteristics of robots; 3) purposes and benefits; 4) classification of presented information; 5) design components and strategies for visual augmentation; 6) interaction techniques and modalities; 7) application domains; and 8) evaluation strategies. We formulate key challenges and opportunities to guide and inform future research in AR and robotics.

    A System for Human-Robot Teaming through End-User Programming and Shared Autonomy

    Many industrial tasks, such as sanding, installing fasteners, and wire harnessing, are difficult to automate due to task complexity and variability. We instead investigate deploying robots in an assistive role for these tasks, where the robot assumes the physical task burden and the skilled worker provides both the high-level task planning and the low-level feedback necessary to effectively complete the task. In this article, we describe the development of a system for flexible human-robot teaming that combines state-of-the-art methods in end-user programming and shared autonomy, and its implementation in sanding applications. We demonstrate the use of the system in two types of sanding tasks, situated in aircraft manufacturing, that highlight two potential workflows within the human-robot teaming setup. We conclude by discussing challenges and opportunities in human-robot teaming identified during the development, application, and demonstration of our system. (Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction (HRI '24), March 11-14, 2024, Boulder, CO, US.)

    Increasing robot autonomy via motion planning and an augmented reality interface

    Recently, there has been growing interest in robotic systems that are able to share workspaces and collaborate with humans. Such collaborative scenarios require efficient mechanisms to communicate human requests to a robot, as well as to transmit robot interpretations and intents to humans. Recent advances in augmented reality (AR) technologies have provided an alternative for such communication. Nonetheless, most existing work on human-robot interaction with AR devices is still limited to robot motion programming or teleoperation. In this paper, we present an alternative approach to commanding and collaborating with robots. Our approach uses an AR interface that allows a user to specify high-level requests to a robot and to preview, approve, or modify the computed robot motions. The proposed approach exploits the robot's decision-making capabilities instead of requiring low-level motion specifications from the user. The latter is achieved by using a motion planner that can handle high-level goals corresponding to regions in the robot configuration space. We present a proof of concept to validate our approach in different test scenarios, and we discuss its applicability in collaborative environments.
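    The abstract does not detail the planner, but the idea of planning toward a *region* of configuration space, rather than a single goal configuration, can be illustrated with a minimal RRT sketch. Everything below is our assumption for illustration: an obstacle-free 2-DOF space, a goal region given as a predicate, and arbitrary step and iteration parameters.

    ```python
    import math
    import random

    def rrt_to_region(start, in_goal_region, sample, step=0.1, max_iters=5000):
        """Minimal RRT: grow a tree from `start` until some node satisfies the
        goal-region predicate, then return the path from start to that node."""
        nodes = [start]
        parents = {0: None}
        for _ in range(max_iters):
            q_rand = sample()
            # Nearest tree node to the random sample.
            i_near = min(range(len(nodes)),
                         key=lambda i: math.dist(nodes[i], q_rand))
            q_near = nodes[i_near]
            d = math.dist(q_near, q_rand)
            if d == 0.0:
                continue
            # Steer a bounded step toward the sample.
            s = min(step, d)
            q_new = tuple(a + s * (b - a) / d for a, b in zip(q_near, q_rand))
            parents[len(nodes)] = i_near
            nodes.append(q_new)
            if in_goal_region(q_new):
                # Backtrack parent pointers to recover the path.
                path, i = [], len(nodes) - 1
                while i is not None:
                    path.append(nodes[i])
                    i = parents[i]
                return list(reversed(path))
        return None

    # Goal region: any configuration within 0.2 of (1, 1) in a 2-DOF space.
    random.seed(0)
    path = rrt_to_region(
        start=(0.0, 0.0),
        in_goal_region=lambda q: math.dist(q, (1.0, 1.0)) < 0.2,
        sample=lambda: (random.uniform(-2, 2), random.uniform(-2, 2)),
    )
    if path:
        print(f"found path with {len(path)} waypoints")
    ```

    Accepting a predicate rather than a point is what lets a user's high-level AR request ("place it anywhere on that shelf") map to many acceptable final configurations, with the planner free to choose among them.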

    Multiple Robot Simulation in a Virtual Reality Environment

    Robotics is becoming increasingly important in people's daily lives, yet learning and training in robotics is not always easy. In most cases, proper training requires direct interaction with robots, which the vast majority of people cannot access. Thanks to the emergence of technologies such as Virtual Reality (VR), however, previously impractical forms of training are now possible. This project therefore combines both fields, creating an alternative way of interacting with robots to understand how they behave and thus flattening the robotics learning curve. To this end, software has been developed that allows the simulation and control of various robots in VR.

    Energy-based control approaches in human-robot collaborative disassembly
