2,317 research outputs found

    Dynamic Speed and Separation Monitoring with On-Robot Ranging Sensor Arrays for Human and Industrial Robot Collaboration

    This research presents a flexible and dynamic implementation of the Speed and Separation Monitoring (SSM) safety measure that optimizes the productivity of a task while ensuring human safety during Human-Robot Collaboration (HRC). Unlike the standard static/fixed 2D safety zones demarcated by 2D scanning LiDARs, this research presents a dynamic sensor setup that changes the safety zones based on the robot's pose and motion. The focus of this research is the implementation of a dynamic SSM safety configuration using Time-of-Flight (ToF) laser-ranging sensor arrays placed around the centers of the links of a robot arm, and it investigates the viability of on-robot exteroceptive sensors for implementing SSM as a safety measure. Varying dynamic SSM safety configurations are implemented, each measuring human-robot separation distance and relative speed with a different sensor modality: ToF sensor arrays, a motion-capture system, or a 2D LiDAR. This study presents a comparative analysis of these dynamic SSM safety configurations in terms of safety, performance, and productivity. A system-of-systems (cyber-physical system) architecture for conducting and analyzing the HRC experiments was proposed and implemented. The robots, objects, and human operators sharing the workspace are represented virtually as part of the system using a digital-twin setup. This system was capable of controlling the robot motion, monitoring human physiological response, and tracking the progress of the collaborative task. Experiments were conducted with human subjects performing a task while sharing the robot workspace under the proposed dynamic SSM safety configurations. The results showed a preference for the ToF sensors and motion capture over the 2D LiDAR currently used in industry. The human subjects felt safe and comfortable with the proposed dynamic SSM safety configuration using ToF sensor arrays. For a standard pick-and-place task, the results showed up to a 40% increase in productivity compared to a 2D LiDAR.

    Working together: a review on safe human-robot collaboration in industrial environments

    After many years of rigid, conventional production procedures, industrial manufacturing is going through a process of change toward flexible and intelligent manufacturing, the so-called Industry 4.0. In this context, human-robot collaboration has an important role in smart factories, since it contributes to higher productivity and greater efficiency. However, this evolution means breaking with established safety procedures, as the separation of workspaces between robot and human is removed. These changes have been reflected in safety standards for industrial robotics over the last decade and have led to a wide field of research focused on preventing human-robot impacts and/or minimizing the associated risks and their consequences. This paper presents a review of the main safety systems that have been proposed and applied in industrial robotic environments to achieve safe collaborative human-robot work. Additionally, a review of the current regulations is provided, along with the new concepts introduced in them. The discussion includes multidisciplinary approaches, such as techniques for estimating and evaluating injuries in human-robot collisions, mechanical and software devices designed to minimize the consequences of human-robot impact, impact detection systems, and strategies to prevent collisions or minimize their consequences when they occur.

    An Integrated Camera and Radar on-Robot System for Human Robot Collaboration

    The increased demand for collaborative tasks between humans and robots has driven an upsurge in newer sensor technologies to detect, locate, track, and monitor workers in a robot workspace. The challenge is to balance the accuracy, cost, and responsiveness of the system to maximize the safety of the worker. This work presents a sensor system that combines six 60 GHz radar modules and six cameras to accurately track the location and speed of workers through all 360 degrees around the robot. While the radar is tuned to identify moving targets, the cameras perform pose detection to evaluate the humans in the workspace; when fused, the two provide 4D pose estimates: 3D location plus velocity. A custom PCB and enclosure were designed for the system, which is mounted to the end-effector of a UR-10 robot. The system performs all of its computation on an Nvidia AGX Xavier for offline processing, which allows it to be mounted on a mobile robot for outdoor use. Lastly, the system was evaluated for accuracy in human detection and in velocity measurement across numerous static and dynamic scenarios for the robot, the human, and both combined.
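    One plausible way to turn successive fused position fixes into the "4D" estimate (3D location plus velocity) described above is a simple finite difference between frames. This is a generic sketch with hypothetical helper names, not the fusion method used in the paper.

```python
import math

def velocity_estimate(p_prev, p_curr, dt):
    """Finite-difference velocity (m/s) from two successive
    3-D worker positions p_prev, p_curr (m), dt seconds apart."""
    return tuple((c - p) / dt for p, c in zip(p_prev, p_curr))

def speed(v):
    """Scalar speed (m/s): Euclidean norm of the velocity vector."""
    return math.sqrt(sum(c * c for c in v))
```

    In practice such raw differences are noisy, so a tracker would typically smooth them (e.g., with a constant-velocity Kalman filter) before feeding a safety controller.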

    Human robot collaboration in the MTA SZTAKI learning factory facility at Gyor

    In recent years, interest has grown in environments where humans and robots collaborate, complementing the respective strengths and advantages of humans and machines. The design, construction, and adjustment of such environments, as well as the training of operating personnel, require a thorough understanding of the nature of human-robot collaboration, which previous automation expertise does not necessarily provide. The learning factory currently being constructed by MTA SZTAKI in Gyor aims to provide hands-on experience in the design and operation of facilities supporting human-robot collaboration, mainly in assembly tasks. This work-in-progress paper presents the design principles, functionalities, and structure of the facility, and outlines deployment plans for education, training, research, and development in the academic and industrial sectors. (C) 2018 The Authors. Published by Elsevier B.V.

    Experimental analysis of using radar as an extrinsic sensor for human-robot collaboration

    Collaborative robots are expected to be an integral part of driving the coming fourth industrial revolution. In human-robot collaboration, the robot and a human share a common workspace and work on a common task, so the safety of the human working with the robot is of utmost importance. A collaborative robot usually relies on various sensors to ensure the safety of a human working alongside it. This research focuses on establishing a safe environment for a human working alongside a robot by mounting an FMCW radar as an extrinsic sensor through which the robot's workspace is monitored. A customized tracking algorithm is developed for the sensor used in this study, incorporating a dynamically varying gating threshold and information about consecutive missed detections to track and localize the human around the robot's workspace. The performance of the proposed system in establishing safe human-robot collaboration is examined across several scenarios that arise when a single human operator works alongside the robot, with the radar operating in different modes. An OptiTrack motion-capture system is used as ground truth to validate the efficacy of the proposed system.
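    The idea of a dynamically varying gating threshold that accounts for consecutive missed detections can be sketched as follows. This is a minimal illustration under assumed parameters (growth rate, gate cap); the paper's actual algorithm and tuning are not given in the abstract.

```python
import math

def gate_radius(base_gate, missed, growth=0.25, max_misses=5):
    """Association gate that widens with each consecutive missed
    detection, so a briefly occluded target can still be matched
    to its track when it reappears. Parameters are illustrative."""
    return base_gate * (1.0 + growth * min(missed, max_misses))

def associate(track_pos, detections, base_gate, missed):
    """Return the nearest radar detection inside the (widened)
    gate around the predicted track position, or None."""
    best, best_d = None, gate_radius(base_gate, missed)
    for d in detections:
        dist = math.dist(track_pos, d)
        if dist <= best_d:
            best, best_d = d, dist
    return best
```

    When `associate` returns None, the tracker would increment the miss counter (widening the gate next frame) and drop the track after `max_misses` consecutive misses.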

    Human-robot coexistence and interaction in open industrial cells

    Recent research results on human-robot interaction and collaborative robotics are leaving behind the traditional paradigm of robots living in a separate space inside safety cages, allowing humans and robots to work together to complete an increasing number of complex industrial tasks. In this context, the safety of the human operator is a main concern. In this paper, we present a framework for ensuring human safety in a robotic cell that allows human-robot coexistence and dependable interaction. The framework is based on a layered control architecture that exploits an effective algorithm for online monitoring of the relative human-robot distance using depth sensors. This method allows the robot behavior to be modified in real time depending on the user's position, without limiting the operative robot workspace in an overly conservative way. To guarantee redundancy and diversity at the safety level, additional certified laser scanners monitor human-robot proximity in the cell, and safe communication protocols and logical units are used for smooth integration with industrial software for safe low-level robot control. The implemented concept includes a smart human-machine interface to support in-process collaborative activities and contactless interaction through gesture recognition of operator commands. Coexistence and interaction are illustrated and tested in an industrial cell, in which a robot moves a tool that measures the quality of a polished metallic part while the operator performs a close evaluation of the same workpiece.

    A Time of Flight on-Robot Proximity Sensing System for Collaborative Robotics

    The sensor system presented in this work demonstrates the results of designing an industrial-grade exteroceptive sensing device for proximity sensing on collaborative robots. The intended application of this design is an on-robot, small-footprint proximity sensing device that prevents safety protective stops from halting a robot during a manufacturing process. Additionally, the system was designed to be modular and to fit on any size or shape of robotic link, vastly expanding its use cases. The design was assembled and put through a number of benchmark tests to validate the performance of the time-of-flight (ToF) sensor system when used for proximity sensing: single-sensor characterization, sensor-overlap characterization, and sensor ranging under motion. Through these tests, the ToF sensor ring achieved real-time data throughput while minimizing blind spots. Lastly, the sensor system was tested at a maximum load of 32 ToF sensors and maintained a stable, real-time throughput of data from all sensors.
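    The core decision such a proximity ring feeds into can be sketched as mapping the nearest valid range reading to a speed-scaling command. The thresholds and command names below are assumptions for illustration, not values from this work.

```python
def safety_action(ranges_m, stop_dist=0.3, slow_dist=0.8, max_range=4.0):
    """Map one scan of a ToF sensor ring to a robot speed command.
    Readings at or beyond max_range are treated as 'nothing detected'.
    Threshold values are illustrative placeholders."""
    valid = [r for r in ranges_m if r < max_range]
    if not valid:
        return "full_speed"
    nearest = min(valid)
    if nearest <= stop_dist:
        return "protective_stop"
    if nearest <= slow_dist:
        return "reduced_speed"
    return "full_speed"
```

    Slowing before the stop threshold is reached is what lets an on-robot ring avoid the protective stops the design aims to prevent.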

    Human-Robot Collaborations in Industrial Automation

    Technology is changing the manufacturing world. For example, sensors are being used to track inventories from the manufacturing floor up to a retail shelf or a customer’s door. These types of interconnected systems have been called the fourth industrial revolution, also known as Industry 4.0, and are projected to lower manufacturing costs. As industry moves toward these integrated technologies and lower costs, engineers will need to connect these systems via the Internet of Things (IoT). They will also need to design how these connected systems interact with humans. The focus of this Special Issue is the smart sensors used in these human–robot collaborations.

    Perception Methods For Speed And Separation Monitoring Using Time-of-Flight Sensor Arrays

    This work presents the development of a perception pipeline that passively tracks the partial ground pose of a human in the context of human-robot collaboration. The main motivation behind this work is to provide a speed-and-separation-monitoring-based safety controller with an estimate of the human's position on the factory floor. Three time-of-flight sensing rings affixed to the major links of an industrial manipulator are used to implement this. Together with a convolutional-neural-network-based unknown-obstacle detection strategy, the ground position of the human operator is estimated and tracked from sparse 3-D point inputs. Experiments analyzing the viability of our approach, involving both real-world and synthetic datasets, are presented in depth in the following sections. Ultimately, it is shown that the sensing system can provide reliable information intermittently and can be used by higher-level perception schemes.