
    Monitoring companion for industrial robotic processes

    For system integrators, optimizing complex industrial robotic applications (e.g. robotised welding) is a difficult and time-consuming task. It becomes tedious, and often very hard, when the operator cannot access the robotic system once it is in operation, perhaps because the installation is far away or because of the operational environment. In these circumstances, as an alternative to physically visiting the installation site, the system integrator may rely on additional nearby sensors to remotely acquire the necessary process information. While it is hard to completely replace this trial-and-error approach, it is possible to provide a more effective way to gather process information that can be used across several robotic installations. This thesis investigates the use of a "monitoring robot" in addition to the task robot(s) that belong to the industrial process to be optimized. The monitoring robot can be equipped with several different sensors and can be moved into close proximity of any installed task robot, so that it can collect information from that process during and/or after the operation without interfering. The thesis reviews related work in industry and in the field of teleoperation to identify the most important challenges in remote monitoring and teleoperation. From the background investigation it is clear that two very important issues are: i) the nature of the teleoperator's interface; and ii) the efficiency of the shared control between the human operator and the monitoring system. In order to investigate these two issues efficiently, it was necessary to create experimental scenarios that operate independently of any application scenario, so an abstract problem domain is created. This way the monitoring system's control and interface can be evaluated in a context that presents challenges typical of a remote monitoring task without being specific to any application domain.
Therefore the validity of the proposed approach can be assessed from a generic and, therefore, more powerful and widely applicable perspective. The monitoring framework developed in this thesis is described, both in its shared control design choices based on virtual fixtures (VF) and in its implementation in a 3D visualization environment. The monitoring system is evaluated in a usability study with user participants. The study assesses the system's performance along with its acceptance and ease of use in a static monitoring task, accompanied by user-filled TLX questionnaires. Since future work will apply this system in real robotic welding scenarios, the thesis finally reports some preliminary work in such an application.
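As a hypothetical illustration of the virtual-fixture shared control mentioned above (a sketch, not the thesis's actual implementation), a guidance fixture can be modelled as attenuating the component of the operator's commanded velocity that deviates from a preferred motion direction:

```python
def apply_guidance_fixture(v_cmd, direction, stiffness=0.8):
    """Attenuate the velocity component orthogonal to a guidance fixture.

    v_cmd:     operator's commanded velocity (vx, vy, vz)
    direction: unit vector of the fixture's preferred motion direction
    stiffness: 0 = no guidance, 1 = motion fully constrained to the direction
    (All names and the attenuation law are illustrative assumptions.)
    """
    dot = sum(c * d for c, d in zip(v_cmd, direction))
    along = tuple(dot * d for d in direction)           # component along the fixture
    ortho = tuple(c - a for c, a in zip(v_cmd, along))  # off-axis component
    return tuple(a + (1.0 - stiffness) * o for a, o in zip(along, ortho))
```

With stiffness 1.0 the operator's motion is projected entirely onto the fixture direction; intermediate values leave some freedom to deviate, which is the usual trade-off in shared control.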

    The IFMIF-DONES remote handling control system: Experimental setup for OPC UA integration

    The devices used to carry out Remote Handling (RH) manipulation tasks in radiation environments address requirements that differ significantly from those of common robotic and industrial systems, due to the lack of repetitive operations and incompletely specified control actions. This imposes the need for control with human-in-the-loop operations. These RH systems are used in facilities such as PRIDE, CERN, ESS, ITER or IFMIF-DONES, the reference used for this work. For the RH system it is crucial to provide high availability, robustness against radiation, haptic devices for teleoperation and dexterous operation, and smooth coordination and integration with the centralized control room. To achieve this, it is necessary to find the best approach towards a standard control framework capable of providing a standard set of functionalities, tools, interfaces, communications, and data formats for the different types of mechatronic devices usually considered for Remote Handling tasks. This prior homogenization phase is not considered in most facilities, which leads to a costly integration process during the commissioning phase of the facility. In this paper, an approach to the IFMIF-DONES RH control framework with strong standards support, based on protocols such as OPC UA, is described and validated through an experimental setup. This test bench includes a set of physical devices (PLC, conveyor belt and computers) and a set of OPC UA compatible software tools, configured and operable from any node of the University of Granada network. This proof-of-concept mockup provides the flexibility to modify the dimension and complexity of the setup by connecting new virtual or physical devices to a unique backbone.
Besides, it will be used to test different aspects such as control schemes, failure injection, network modeling, predictive maintenance studies, operator training on simulated/real scenarios, and the usability and ergonomics of the user interfaces before deployment. In this contribution, the results are described and illustrated using a conveyor belt set-up, a small but representative reference used to validate the RH control concepts proposed here. Funded by the European Union via the Euratom Research and Training Programme 101052200 - EUROfusion.

    Digital Twin of a Teaching and Learning Robotics Lab

    The advancing technologies of Industry 4.0, which include digital twins, are gaining ground and becoming more popular in many industrial sectors. In the manufacturing industry, digital twins are used for tasks ranging from simulation to product optimisation. This work focuses on using LiDAR data, SLAM algorithms and a basic measuring tape to develop a digital twin environment in the open-source platform Gazebo, backed by ROS, which scientists, engineers, and students can use to streamline the development process, for educational purposes and more. The results show a digital replica of specific areas of the Institute of Technology in which multiple robots can be integrated and controlled. Such a platform creates a foundation for improving distance learning and safe initial system testing.
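The LiDAR-to-model step described above can be illustrated, in much simplified form, by marking range returns as occupied cells in a 2D grid (a hypothetical sketch; the actual work relies on established SLAM packages and Gazebo, and all names here are illustrative):

```python
import math

def mark_scan_on_grid(grid, pose, ranges, angle_min, angle_inc, resolution):
    """Mark LiDAR range returns as occupied cells in a sparse 2D grid.

    grid:       dict mapping (ix, iy) cell indices -> occupancy flag
    pose:       (x, y, theta) of the sensor in the map frame
    ranges:     range readings in metres (None = no return for that beam)
    angle_min:  angle of the first beam relative to the sensor frame
    angle_inc:  angular step between consecutive beams
    resolution: cell size in metres
    """
    x, y, theta = pose
    for i, r in enumerate(ranges):
        if r is None:
            continue
        a = theta + angle_min + i * angle_inc
        hx, hy = x + r * math.cos(a), y + r * math.sin(a)  # beam endpoint
        grid[(int(hx // resolution), int(hy // resolution))] = 1
    return grid
```

A full SLAM pipeline additionally estimates the pose itself and clears free space along each beam; this sketch only shows how scan endpoints become map geometry.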

    A Common Digital Twin Platform for Education, Training and Collaboration

    The world is in transition driven by digitalization; industrial companies and educational institutions are adopting Industry 4.0 and Education 4.0 technologies enabled by digitalization. Furthermore, digitalization and the availability of smart devices and virtual environments have evolved to produce a generation of digital natives. These digital natives, whose smart devices have surrounded them since birth, have developed a new way to process information; instead of reading literature and writing essays, the digital native generation uses search engines, discussion forums, and online video content to study and learn. The evolved learning process of the digital native generation challenges the educational and industrial sectors to create natural training, learning, and collaboration environments for digital natives. Digitalization provides the tools to overcome this challenge; extended reality and digital twins enable high-level user interfaces that are natural for digital natives and their interaction with physical devices. Simulated training and education environments enable a risk-free way of training safety aspects, programming, and controlling robots. To create a more realistic training environment, digital twins enable interfacing virtual and physical robots to train and learn on real devices utilizing the virtual environment. This thesis proposes a common digital twin platform for education, training, and collaboration. The proposed solution enables the teleoperation of physical robots from distant locations, enabling location- and time-independent training and collaboration in robotics. In addition to teleoperation, the proposed platform supports social communication, video streaming, and resource sharing for efficient collaboration and education. The proposed solution enables research collaboration in robotics by allowing collaborators to utilize each other's equipment independent of the distance between the physical locations.
Sharing of resources saves time and travel costs. Social communication provides the possibility to exchange ideas and discuss research. Students and trainees can utilize the platform to learn new skills in robot programming, control, and safety aspects. Cybersecurity is considered from the planning phase to the implementation phase. Only cybersecure methods, protocols, services, and components are used to implement the presented platform. Securing the low-level communication layer of the digital twins is essential for the safe teleoperation of the robots. Cybersecurity is the key enabler of the proposed platform, and after implementation, periodic vulnerability scans and updates maintain cybersecurity. This thesis discusses solutions and methods for cybersecuring an online digital twin platform. In conclusion, the thesis presents a common digital twin platform for education, training, and collaboration. The presented solution is cybersecure and accessible using mobile devices. The proposed platform, digital twin, and extended reality user interfaces contribute to the transition to Education 4.0 and Industry 4.0.

    Study of Augmented Reality based manufacturing for further integration of quality control 4.0: a systematic literature review

    Augmented Reality (AR) has gradually become a mainstream technology enabling Industry 4.0, and its maturity has grown over time. AR has been applied to support different processes at the shop-floor level, such as assembly, maintenance, etc. As various processes in manufacturing require high quality and near-zero error rates to meet the demands and safety of end-users, AR can also equip operators with immersive interfaces to enhance productivity, accuracy and autonomy in the quality sector. However, there is currently no systematic review of AR technology for the quality sector. The purpose of this paper is to conduct a systematic literature review (SLR) to assess the emerging interest in using AR as an assisting technology for the quality sector in an Industry 4.0 context. Five research questions (RQs), with a set of selection criteria, are predefined to support the objectives of this SLR. In addition, different research databases are searched during the paper identification phase, following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) methodology, to find answers to the predefined RQs. It is found that, despite lagging behind the assembly and maintenance sectors in terms of AR-based solutions, there is a growing interest in developing and implementing AR-assisted quality applications. Current AR-based solutions for the quality sector fall into three main categories: AR-based apps as a virtual Lean tool, AR-assisted metrology, and AR-based solutions for in-line quality control. In this SLR, an AR architecture layer framework has been refined to classify articles into different layers, which are finally integrated into a systematic design and development methodology for building long-term AR-based solutions for the quality sector in the future.

    iviz: A ROS Visualization App for Mobile Devices

    In this work, we introduce iviz, a mobile application for visualizing ROS data. In the last few years, the popularity of ROS has grown enormously, making it the standard platform for open-source robotic programming. A key reason for this success is the availability of polished, general-purpose modules for many tasks, such as localization, mapping, path planning, and, quite importantly, data visualization. However, the availability of the latter is generally restricted to PCs with the Linux operating system. Thus, users who want to see what is happening in the system with a smartphone or a tablet are stuck with solutions such as screen mirroring or web browser versions of rviz, which are difficult to interact with from a mobile interface. More importantly, this makes newer visualization modalities such as Augmented Reality impossible. Our application iviz, based on the Unity engine, addresses these issues by providing a visualization platform designed from scratch to be usable on mobile platforms, such as iOS, Android, and UWP, including native support for Augmented Reality on all three platforms. If desired, it can also be used on a PC with Linux, Windows, or macOS without any changes.
    Comment: This work has 7 pages and 7 figures. The repository of the project can be found at https://github.com/KIT-ISAS/iviz/tree/deve

    Augmented reality (AR) for surgical robotic and autonomous systems: State of the art, challenges, and solutions

    Despite the substantial progress achieved in the development and integration of augmented reality (AR) in surgical robotic and autonomous systems (RAS), the focus in most devices remains on improving end-effector dexterity and precision, as well as access to minimally invasive surgeries. This paper aims to provide a systematic review of different types of state-of-the-art surgical robotic platforms while identifying areas for technological improvement. We associate specific control features, such as haptic feedback, sensory stimuli, and human-robot collaboration, with AR technology to perform complex surgical interventions with increased user perception of the augmented world. Researchers in the field have long faced innumerable issues with low accuracy in tool placement around complex trajectories, pose estimation, and difficulty in depth perception during two-dimensional medical imaging. A number of robots described in this review, such as Novarad and SpineAssist, are analyzed in terms of their hardware features, computer vision systems (such as deep learning algorithms), and the clinical relevance of the literature. We attempt to outline the shortcomings in current optimization algorithms for surgical robots (such as YOLO and LSTM) while providing mitigating solutions for internal tool-to-organ collision detection and image reconstruction. The accuracy of results in robot end-effector collisions and reduced occlusion remains promising within the scope of our research, validating the propositions made for the surgical clearance of ever-expanding AR technology in the future.

    Augmented reality for safety zones in human-robot collaboration

    Worker productivity in manufacturing could be increased by reducing the distance between robots and humans in human-robot collaboration (HRC). However, physical cages generally limit this interaction. We use Augmented Reality (AR) to visualise virtual safety zones on a real robot arm, thereby replacing the physical cages and bringing humans and robots closer together. We demonstrate this with a collaborative pick and place application that makes use of a Universal Robots 10 (UR10) robot arm and a Microsoft HoloLens 2 for control and visualisation. This mimics a real task in an industrial robot cell. The virtual safety zone sizes are based on ISO standards for HRC. However, we are the first to also consider hardware and network latencies in the calculations of the virtual safety zone sizes
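The zone-size reasoning above follows the general ISO 13855 form S = K·T + C, where K is the assumed human approach speed, T the total time until the robot is stopped, and C an intrusion allowance. A hedged sketch of how added hardware and network latencies enlarge the zone (function name and the numeric inputs are illustrative assumptions, not the paper's values):

```python
def protective_separation_mm(t_react_s, t_stop_s, t_latency_s,
                             approach_speed_mm_s=1600.0, intrusion_mm=850.0):
    """ISO 13855-style protective separation distance S = K * T + C.

    approach_speed_mm_s: K, assumed human approach speed (1600 mm/s walking)
    t_react_s:           sensing and control reaction time
    t_stop_s:            robot stopping time
    t_latency_s:         extra hardware/network latency (e.g. headset delay)
    intrusion_mm:        C, intrusion distance allowance
    """
    total_time = t_react_s + t_stop_s + t_latency_s  # T in the standard's formula
    return approach_speed_mm_s * total_time + intrusion_mm
```

Because S grows linearly with T, every millisecond of measured latency translates directly into millimetres of extra safety zone, which is why accounting for latencies matters for the virtual zone sizes.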

    ROBOFERT: Human - Robot Advanced Interface for Robotic Fertilization Process

    Interfaces for human-robot interaction in fields such as precision agriculture (PA) have made it possible to improve production processes by applying specialized treatments that require a high degree of precision at the plant level. Current fertilization processes are generalized over vast cultivation areas without considering each plant's specific needs, generating collateral effects on the environment. The SureVeg CORE Organic Cofund ERA-Net project seeks to evaluate the benefits of growing vegetables in rows with the support of robotic systems. A robotic platform equipped with sensory, actuation, and communication systems and a robotic arm has been implemented to develop this proof of concept. The proposed method focuses on the development of a human-machine interface (HMI) that integrates information coming from the different systems of the robotized platform in the field and suggests to an operator (in a remote station) a fertilization action based on specific vegetative needs, to improve vegetable production. The interface was implemented using the Robot Operating System (ROS) and allows visualizing the state of the robot within the crop in a highly realistic environment developed in Unity3D; it shows plant-specific vegetative data and fertilization needs and suggests actions to the user. The tests to validate the method were carried out in the fields of the ETSIAAB-UPM. According to multi-spectral data taken two weeks after planting and after three months of growth, the main results show that mean NDVI values in the row-crop vegetables stayed at normal levels around 0.4 relative to the initial NDVI values, and that growth was homogeneous, validating the influence of ROBOFERT.
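The NDVI figure quoted above is the standard normalized difference of near-infrared and red reflectance; as a small sketch (the band values used here are illustrative, not the paper's measurements):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Returns a value in [-1, 1]; healthy vegetation typically falls above ~0.3.
    """
    return (nir - red) / (nir + red)
```

For example, reflectances of 0.7 (NIR) and 0.3 (red) yield an NDVI of 0.4, the same order as the level reported for the row crops.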