1,736 research outputs found

    Safety analysis on human-robot collaboration in heavy assembly task

    The manufacturing assembly industry has traditionally relied on human labor to perform assembly tasks manually. With the introduction of industrial robots, fully automated solutions have made it possible to perform complex and repetitive tasks and to assist in the assembly of heavy components. In recent years, improvements in robot technology and changes in safety legislation have enabled new human-robot collaboration (HRC) concepts, which have drawn the attention of manufacturers. HRC combines the dexterity and flexibility of humans with the repeatability and precision of robots to increase system flexibility, decrease labor costs in production, and improve the ergonomics of the shared workspace. Operator safety is one of the main challenges in an HRC environment, and the safety concerns change with the level of physical interaction between robot and human. This thesis aimed to develop a solution for analyzing safety functions at different human-robot interaction (HRI) levels. The approach started with the classification of tasks between human and robot. Assembly sequences were then designed to fulfill the requirements of each HRI level, and these experiments provided evaluation tables for analyzing the safety functions at each level. The primary objective of this thesis is to design an HRC system with suitable safety functions. The safety of the workstation was developed using a combination of hardware and software: laser scanners were employed to detect the presence of a human in hazard areas, and the ABB SafeMove add-on was configured to send safety signals to the robot controller to trigger safety functions such as safety-rated monitored stop and speed and separation monitoring. A work-time study demonstrated that implementing HRC decreases operator fatigue and injury risk and improves operator ergonomics. The study of safety functions across the HRI levels showed that, as physical interaction increases, multiple safety functions must be employed to prevent collisions between robot and human.
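    As a rough illustration of the speed and separation monitoring idea described above, the sketch below maps the distance of the nearest detected human (e.g., reported by a laser scanner zone) to a safety function. The zone thresholds and names are illustrative assumptions, not the thesis' actual SafeMove configuration.

```python
# Hypothetical sketch of a speed-and-separation-monitoring style check; zone
# distances and function names are assumptions for illustration only.

from dataclasses import dataclass
from enum import Enum


class SafetyFunction(Enum):
    FULL_SPEED = "full speed"
    REDUCED_SPEED = "speed and separation monitoring"
    MONITORED_STOP = "safety-rated monitored stop"


@dataclass
class SafetyZones:
    warning_distance_m: float = 2.0   # outer laser-scanner zone (assumed)
    stop_distance_m: float = 0.5      # inner laser-scanner zone (assumed)


def select_safety_function(human_distance_m: float,
                           zones: SafetyZones) -> SafetyFunction:
    """Map the nearest detected human distance to a safety function."""
    if human_distance_m <= zones.stop_distance_m:
        return SafetyFunction.MONITORED_STOP   # robot halts, drives stay live
    if human_distance_m <= zones.warning_distance_m:
        return SafetyFunction.REDUCED_SPEED    # robot slows, keeps separation
    return SafetyFunction.FULL_SPEED


if __name__ == "__main__":
    zones = SafetyZones()
    for d in (3.0, 1.2, 0.3):
        print(f"human at {d:.1f} m -> {select_safety_function(d, zones).value}")
```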

    Autonomous Pick-and-Place Procedure with an Industrial Robot Using Multiple 3D Sensors for Object Detection and Obstacle Avoidance

    Master's thesis in Mechatronics (MAS500). This thesis proposes a full-pipeline autonomous pick-and-place procedure, integrating perception, planning, grasping and control for the execution of tasks toward long-term industrial automation. Within perception, we demonstrate the detection of a large object (target), including position and orientation (pose) estimation, in the 3D world. Obstacles in the work area are then mapped with the proposed filtering prior to motion planning and navigation of an industrial robot to the target's pose. The target is then picked using a custom-built, motorized, 3D-printed gripper and placed at a desired location in the robot's reachable environment. Point-cloud-based, model-free obstacle avoidance is performed throughout the whole process. The complete pipeline targets typical tasks in various industries, including the offshore, logistics and warehouse domains, with scanning of the scene and picking and placing of a bulky object from one position to another with minimal or no human intervention.
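    The sketch below illustrates, under assumed workspace bounds and voxel size, the kind of point-cloud pre-filtering such a pipeline might apply before obstacle mapping and motion planning: cropping to the workspace and voxel downsampling. The function names and parameters are hypothetical, not the thesis' implementation.

```python
# Minimal numpy sketch of point-cloud pre-filtering (workspace crop plus voxel
# downsampling); bounds, voxel size and function names are assumptions.

import numpy as np


def crop_to_workspace(points: np.ndarray,
                      lower=(-1.0, -1.0, 0.0),
                      upper=(1.0, 1.0, 1.5)) -> np.ndarray:
    """Keep only points inside an axis-aligned workspace box (Nx3 array)."""
    lower, upper = np.asarray(lower), np.asarray(upper)
    mask = np.all((points >= lower) & (points <= upper), axis=1)
    return points[mask]


def voxel_downsample(points: np.ndarray, voxel_size: float = 0.02) -> np.ndarray:
    """Reduce density by keeping one representative point per voxel."""
    keys = np.floor(points / voxel_size).astype(np.int64)
    _, idx = np.unique(keys, axis=0, return_index=True)
    return points[np.sort(idx)]


if __name__ == "__main__":
    cloud = np.random.uniform(-2.0, 2.0, size=(50_000, 3))  # stand-in sensor data
    obstacles = voxel_downsample(crop_to_workspace(cloud))
    print(f"{len(cloud)} raw points -> {len(obstacles)} obstacle points")
```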

    Smart Technologies for Precision Assembly

    This open access book constitutes the refereed post-conference proceedings of the 9th IFIP WG 5.5 International Precision Assembly Seminar, IPAS 2020, held virtually in December 2020. The 16 revised full papers and 10 revised short papers presented together with 1 keynote paper were carefully reviewed and selected from numerous submissions. The papers address topics such as assembly design and planning; assembly operations; assembly cells and systems; human centred assembly; and assistance methods in assembly

    Collaborative human-machine interfaces for mobile manipulators.

    The use of mobile manipulators in service industries, both as agents in physical Human Robot Interaction (pHRI) and for social interaction, has been increasing in recent times, driven by needs such as compensating for workforce shortages and enabling safer, more efficient operations. Collaborative robots, or co-bots, are robots developed for human interaction through direct contact or close proximity in a shared space. The work presented in this dissertation focuses on the design, implementation and analysis of components for the next-generation collaborative human machine interfaces (CHMIs) needed for mobile manipulator co-bots in various service industries. The particular components considered in this dissertation are: robot control, through a Neuroadaptive Controller (NAC)-based admittance control strategy for pHRI applications with a co-bot; robot state estimation, through a novel methodology and placement strategy for using arrays of IMUs that can be embedded in robot skin for pose estimation in complex robot mechanisms; and user perception of co-bot CHMIs, through an evaluation of perceived usefulness and ease of use of a mobile manipulator co-bot in a nursing assistant application scenario. To facilitate advanced control of the Adaptive Robotic Nursing Assistant (ARNA) mobile manipulator co-bot designed and developed in our lab, we describe and evaluate an admittance control strategy featuring a Neuroadaptive Controller (NAC). The NAC has been specifically formulated for pHRI applications such as patient walking. The controller continuously tunes the weights of a neural network to cancel robot non-linearities, including drivetrain backlash, kinematic or dynamic coupling, variable patient pushing effort, and sloped surfaces with unknown inclines. The advantages of our control strategy are Lyapunov stability guarantees during interaction, less need for parameter tuning, and better performance across a variety of users and operating conditions. We conduct simulations and experiments with 10 users to confirm that the NAC outperforms a classic Proportional-Derivative (PD) joint controller in terms of resulting interaction jerk, user effort, and trajectory tracking error during patient walking. To tackle complex mechanisms of these next-generation robots, where encoders or other classic pose-measuring devices are not feasible, we present a study of the effects of design parameters on methods that use data from Inertial Measurement Units (IMUs) in robot skins to provide robot state estimates. These parameters include the number of sensors, their placement on the robot, and their noise properties, and we analyze their effect on the quality of robot pose estimation and its signal-to-noise ratio (SNR). The results of that study inform the design of robot skins, and to enable their use in complex robots we propose a novel pose estimation method, the Generalized Common Mode Rejection (GCMR) algorithm, for estimating joint angles in robot chains containing composite joints. The placement study and GCMR are demonstrated using both Gazebo simulation and experiments with a 3-DoF robotic arm containing 2 non-zero link lengths, 1 revolute joint and a 2-DoF composite joint. In addition to yielding insights on the predicted usage of co-bots, the design of control and sensing mechanisms in their CHMIs benefits from evaluating the perceptions of the eventual users of these robots. With co-bots increasingly being developed and used, these user perceptions need to be studied with established models that have been used to predict the adoption of comparable technology. To this end, we use the Technology Acceptance Model (TAM) to evaluate the CHMI of the ARNA robot in this scenario through analysis of quantitative and questionnaire data collected during experiments with eventual users. The results of the work conducted in this dissertation contribute to the realization of control and sensing systems for the CHMIs of next-generation co-bots.
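    As a loose, single-axis illustration of the neuroadaptive admittance control idea summarized above, the sketch below lets an online-adapted RBF network cancel an unknown friction term so that the robot velocity tracks an ideal admittance (reference) model under a constant user push. The gains, basis functions, friction model and update rule are assumptions for illustration, not the dissertation's NAC formulation.

```python
# Illustrative single-axis admittance law with a simple online-adapted neural
# compensation term; all parameters here are assumed, not the actual NAC.

import numpy as np


def rbf(v, centers, width=0.02):
    """Radial-basis-function features of the current velocity."""
    return np.exp(-((v - centers) ** 2) / width)


def simulate(steps=2000, dt=0.005):
    m, d = 10.0, 25.0                      # virtual admittance mass and damping
    centers = np.linspace(-1.0, 1.0, 21)   # RBF centers over the velocity range
    w = np.zeros_like(centers)             # network weights, adapted online
    gamma = 40.0                           # adaptation gain (assumed)
    v_ref = v = 0.0                        # reference-model and robot velocities
    f_human = 20.0                         # constant user push, in newtons
    for _ in range(steps):
        # Ideal admittance model the robot should mimic: m*dv/dt + d*v = f_human
        v_ref += dt * (f_human - d * v_ref) / m
        # "Real" robot with an unknown friction nonlinearity to be cancelled
        friction = 8.0 * np.tanh(v / 0.02)
        f_comp = w @ rbf(v, centers)       # learned compensation term
        v += dt * (f_human + f_comp - friction - d * v) / m
        # Error-driven weight update so the robot tracks the reference model
        w += dt * gamma * rbf(v, centers) * (v_ref - v)
    return v_ref, v


if __name__ == "__main__":
    v_ref, v = simulate()
    print(f"reference velocity {v_ref:.3f} m/s, compensated robot {v:.3f} m/s")
```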

    Vision Experts: “Capturing the Holy Grail” Business Plan

    This project investigates the potential viability of commercializing robotics software developed by a UBC engineer. The aim is to provide the inventor with a business plan that will serve as a tool for obtaining funding to commercialize this software. The research concludes that the software could form the basis of a successful company, and a business plan is presented to help the developer achieve that goal.

    Human-aware space sharing and navigation for an interactive robot

    Robot motion planning methods have developed at an accelerated pace in recent years. The emphasis has mainly been on making robots more efficient, safer, and faster to react to unpredictable situations. As a result, we are witnessing more and more service robots introduced into our everyday lives, especially in public places such as museums, shopping malls and airports. While a mobile service robot moves in a human environment, it is important to consider the effect of its behavior on the people it passes or interacts with. We do not see such robots as mere machines but as social agents, and we expect them to behave in a human-like way by following societal norms and rules. This has created new challenges and opened new research avenues for designing robot control algorithms that deliver human-acceptable, legible and proactive robot behaviors. This thesis proposes an optimization-based cooperative method for trajectory planning and navigation with built-in social constraints that keep robot motions safe, human-aware and predictable. The robot trajectory is dynamically and continuously adjusted to satisfy these social constraints. To do so, we treat the robot trajectory as an elastic band (a mathematical construct representing the robot path as a series of poses and the time differences between those poses) which can be deformed, both in space and time, by the optimization process to respect the given constraints. Moreover, we also predict plausible human trajectories in the same operating area by treating human paths as elastic bands as well. This scheme allows us to optimize the robot trajectories not only for the current moment but for the entire interaction that happens when humans and the robot cross each other's paths. We carried out a set of experiments with canonical human-robot interactive situations that occur in everyday life, such as crossing a hallway, passing through a door and intersecting paths in wide open spaces. The proposed cooperative planning method compares favorably against other state-of-the-art human-aware navigation planning schemes. We have augmented the robot's navigation behavior with synchronized and responsive movements of its head, making the robot look where it is going and occasionally divert its gaze toward nearby people to acknowledge that it will avoid any possible collision with them. At any given moment the robot weighs multiple criteria according to the social context and decides where it should turn its gaze. Through an online user study we have shown that this gazing mechanism effectively complements the navigation behavior and improves the legibility of the robot's actions. Finally, we have integrated our navigation scheme with a broader supervision system that can jointly generate normative robot behaviors, such as approaching a person and adapting the robot's speed to the group of people the robot guides in airport or museum scenarios.
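    A heavily simplified sketch of the elastic-band idea described above: path waypoints are iteratively pulled smooth by internal forces and pushed away from a predicted human position by an external repulsive force. The gains, clearance radius and human position are illustrative assumptions; the thesis' planner additionally deforms the band in time and optimizes the human and robot bands jointly.

```python
# Toy 2D elastic-band deformation around one predicted human position;
# parameters and structure are assumptions for illustration only.

import numpy as np


def deform_elastic_band(path, human_pos, clearance=0.8,
                        k_internal=0.4, k_repulse=0.6, iterations=100):
    """Deform intermediate waypoints of an Nx2 path around a human position."""
    band = path.copy()
    for _ in range(iterations):
        for i in range(1, len(band) - 1):        # endpoints stay fixed
            # Internal force: pull the waypoint toward its neighbours (smoothness)
            internal = 0.5 * (band[i - 1] + band[i + 1]) - band[i]
            # External force: push the waypoint away from the human within clearance
            offset = band[i] - human_pos
            dist = np.linalg.norm(offset)
            repulse = np.zeros(2)
            if 1e-6 < dist < clearance:
                repulse = (clearance - dist) * offset / dist
            band[i] += k_internal * internal + k_repulse * repulse
    return band


if __name__ == "__main__":
    straight = np.linspace([0.0, 0.0], [5.0, 0.0], 11)   # straight corridor path
    human = np.array([2.5, 0.1])                         # predicted human position
    avoided = deform_elastic_band(straight, human)
    print("closest approach to human:",
          round(float(np.min(np.linalg.norm(avoided - human, axis=1))), 2), "m")
```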

    Adaptive physical human-robot interaction (PHRI) with a robotic nursing assistant.

    Recently, more and more robots are being investigated for future applications in healthcare. In nursing assistance, for instance, seamless Human-Robot Interaction (HRI) is very important for sharing workspaces and workloads between medical staff, patients, and robots. In this thesis we introduce a novel robot, the Adaptive Robot Nursing Assistant (ARNA), and its underlying components. ARNA has been designed specifically to assist nurses with day-to-day tasks such as walking patients, pick-and-place item retrieval, and routine patient health monitoring. Adaptive HRI in nursing applications creates a positive user experience and increases nurse productivity and task completion rates, as shown by experiments with human subjects. ARNA includes interface devices such as tablets, force sensors, pressure-sensitive robot skins, LIDAR and an RGB-D camera. These interfaces are combined with adaptive controllers and estimators within a proposed framework that contains multiple innovations. A research study was conducted on methods of deploying an ideal Human-Machine Interface (HMI), in this case a tablet-based interface. The initial study indicates that a traded-control level of autonomy is ideal for tele-operation of ARNA by a patient, and the proposed use of HMI devices makes robot performance similar for both skilled and unskilled workers. A neuro-adaptive controller (NAC), which contains several neural networks to estimate and compensate for system non-linearities, was implemented on the ARNA robot. By linearizing the system, a cross-over usability condition is met, making it more intuitive for humans to learn to use the robot anywhere in its workspace. A novel Base-Sensor Assisted Physical Interaction (BAPI) controller is introduced in this thesis, which uses a force-torque sensor at the base of the ARNA manipulator to detect full-body collisions and make interaction safer. Finally, a human-intent estimator (HIE) is proposed to estimate human intent while the robot and user physically collaborate on tasks such as adaptive walking. The NAC with the HIE module was validated on a PR2 robot through user studies; its implementation on the ARNA platform is straightforward because the controller is model-free and learns the robot dynamics online. A new framework, Directive Observer and Lead Assistant (DOLA), is proposed for ARNA, enabling the user to interact with the robot in two modes: physically, by direct push-guiding, and remotely, through a tablet interface. In both cases the human is “observed” by the robot and then guided and/or advised during the interaction; if the user has trouble completing a given task, the robot adapts its repertoire to lead the user toward the goal. The proposed framework incorporates interface devices as well as adaptive control systems to facilitate a higher-performance interaction between user and robot than was previously possible. The ARNA robot was deployed and tested in a hospital environment at the School of Nursing of the University of Louisville. User-experience tests were conducted with the help of healthcare professionals, and several metrics, including completion time, completion rate, and level of user satisfaction, were collected to shed light on the performance of the various components of the proposed framework. The results indicate an overall positive response toward the use of such an assistive robot in the healthcare environment; the analysis of the gathered data is included in this document. To summarize, this research study makes the following contributions: conducting user-experience studies with the ARNA robot in patient-sitter and walker scenarios to evaluate both physical and non-physical human-machine interfaces; evaluating and validating the Human Intent Estimator (HIE) and Neuro-Adaptive Controller (NAC); proposing the novel Base-Sensor Assisted Physical Interaction (BAPI) controller; building simulation models for packaged tactile sensors and validating them with experimental data; and describing the Directive Observer and Lead Assistant (DOLA) framework for ARNA using adaptive interfaces.
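    A hedged sketch of a base-wrench collision check in the spirit of the BAPI controller mentioned above: the measured base force/torque is compared against the wrench expected from the arm's motion, and a collision is flagged when the residual exceeds a threshold. The expected-wrench source and thresholds are assumptions, not the thesis' actual controller.

```python
# Base force/torque residual check; thresholds and wrench values are assumed.

import numpy as np


def collision_detected(measured_wrench, expected_wrench,
                       force_threshold=15.0, torque_threshold=5.0):
    """Return True if the base F/T residual suggests an unmodeled contact.

    Wrenches are 6-vectors [fx, fy, fz, tx, ty, tz] in N and N*m.
    """
    residual = np.asarray(measured_wrench) - np.asarray(expected_wrench)
    force_excess = np.linalg.norm(residual[:3]) > force_threshold
    torque_excess = np.linalg.norm(residual[3:]) > torque_threshold
    return force_excess or torque_excess


if __name__ == "__main__":
    expected = [2.0, 0.0, 40.0, 0.5, 0.2, 0.0]   # e.g., from the arm's dynamics
    nominal = [3.0, 1.0, 41.0, 0.6, 0.1, 0.1]    # free motion: no collision
    bumped = [25.0, 1.0, 41.0, 0.6, 0.1, 0.1]    # someone leans on the arm
    print("nominal:", collision_detected(nominal, expected))
    print("bumped:", collision_detected(bumped, expected))
```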

    Human-Robot Collaborations in Industrial Automation

    Technology is changing the manufacturing world. For example, sensors are being used to track inventories from the manufacturing floor up to a retail shelf or a customer’s door. These types of interconnected systems have been called the fourth industrial revolution, also known as Industry 4.0, and are projected to lower manufacturing costs. As industry moves toward these integrated technologies and lower costs, engineers will need to connect these systems via the Internet of Things (IoT). These engineers will also need to design how these connected systems interact with humans. The focus of this Special Issue is the smart sensors used in these human–robot collaborations