13 research outputs found

    Adaptive inspection cell for HMI consoles

    Get PDF
    Current quality control standards require manufacturers to extend the inspection process: instead of sampling, every item should be inspected, and because the devices passing through the inspection cell differ in their characteristics, the quality control cell needs an adaptive system. The presented work describes a self-adaptable robotized inspection cell for HMI consoles, which comprises an image acquisition system with controlled illumination and a force feedback sensor manipulated by a collaborative robot. The developed robotized cell is capable of detecting different HMI consoles and adapting the inspection routines of the manipulator robot to the specific console. Moreover, the flexibility of the collaborative robot allows the camera positioning, lighting, and distance to be adapted so that future HMI consoles can be inspected based on learning strategies. The work reported in this paper was supported by the Norte2020 program under the project I40@TMAD - Promoção da Indústria 4.0 na Região de Trás-os-Montes e Alto Douro.
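
    As a rough illustration of the adaptation described above, the sketch below dispatches a per-console inspection routine (camera poses, illumination level, stand-off distance) once the vision system has identified the console model. All class names, model identifiers and the robot/camera/lamp APIs are hypothetical placeholders, not taken from the paper.

    from dataclasses import dataclass

    @dataclass
    class ViewPoint:
        camera_pose: tuple   # TCP pose of the collaborative robot, e.g. (x, y, z, rx, ry, rz)
        light_level: float   # controlled illumination intensity, 0.0-1.0
        standoff_m: float    # camera-to-console distance in metres

    # One inspection routine per detected console model (model IDs are hypothetical).
    ROUTINES = {
        "console_A": [ViewPoint((0.40, 0.00, 0.30, 0.0, 3.14, 0.0), 0.6, 0.25),
                      ViewPoint((0.40, 0.10, 0.30, 0.0, 3.14, 0.0), 0.8, 0.20)],
        "console_B": [ViewPoint((0.50, -0.10, 0.35, 0.0, 3.14, 0.0), 0.5, 0.30)],
    }

    def inspect(detected_model, robot, camera, lamp):
        """Adapt the manipulator routine to the console the vision system detected."""
        for vp in ROUTINES.get(detected_model, []):
            robot.move_to(vp.camera_pose)        # assumed robot API
            lamp.set_intensity(vp.light_level)   # assumed lighting API
            yield camera.grab()                  # one image per viewpoint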

    A model-based residual approach for human-robot collaboration during manual polishing operations

    Get PDF
    A fully robotized polishing of metallic surfaces may be insufficient in the case of parts with complex geometric shapes, where a manual intervention is still preferable. Within the EU SYMPLEXITY project, we are considering tasks where manual polishing operations are performed in strict physical Human-Robot Collaboration (HRC) between a robot holding the part and a human operator equipped with an abrasive tool. During the polishing task, the robot should firmly keep the workpiece in a prescribed sequence of poses, by monitoring and resisting the external forces applied by the operator. However, the user may also wish to change the orientation of the part mounted on the robot, simply by pushing or pulling the robot body and thus changing its configuration. We propose a control algorithm that is able to separate the external torques acting at the robot joints into two components, one due to the polishing forces applied at the end-effector level, the other due to the intentional physical interaction engaged by the human. The latter component is used to reconfigure the manipulator arm and, accordingly, its end-effector orientation. The workpiece position is instead kept fixed by exploiting the intrinsic redundancy of this subtask. The controller uses an F/T sensor mounted at the robot wrist, together with our recently developed model-based technique (the residual method), which is able to estimate online the joint torques due to contact forces/torques applied at any place along the robot structure. In order to obtain a reliable residual, which is necessary to implement the control algorithm, an accurate robot dynamic model (also including friction effects at the joints and drive gains) needs to be identified first. The complete dynamic identification and the proposed control method for the human-robot collaborative polishing task are illustrated on a 6R UR10 lightweight manipulator mounting an ATI 6D sensor.
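
    The residual method mentioned above is commonly implemented as a generalized-momentum observer. The sketch below shows a minimal discrete-time version and the decomposition of the estimated external joint torques into a wrist-wrench component and a human-interaction component; the matrix names (M, C, g, J) and the observer gain are assumptions about an identified dynamic model, not code from the paper.

    import numpy as np

    class MomentumResidual:
        def __init__(self, n_joints, gain, dt):
            self.K = gain * np.eye(n_joints)   # observer gain
            self.dt = dt
            self.integral = np.zeros(n_joints)
            self.r = np.zeros(n_joints)
            self.p0 = None

        def update(self, M, C, g, qdot, tau_motor):
            p = M @ qdot                       # generalized momentum
            if self.p0 is None:
                self.p0 = p.copy()
            self.integral += (tau_motor + C.T @ qdot - g + self.r) * self.dt
            self.r = self.K @ (p - self.p0 - self.integral)   # estimate of external joint torque
            return self.r

    def decompose(r, J, wrench_sensor):
        """Split the estimated external joint torques into the component caused by the
        polishing wrench measured at the wrist and the component attributed to
        intentional human pushes on the robot body."""
        tau_polish = J.T @ wrench_sensor       # 6D F/T sensor reading mapped to the joints
        tau_human = r - tau_polish
        return tau_polish, tau_human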

    On the use of a temperature based friction model for a virtual force sensor in industrial robot manipulators

    Get PDF
    In this paper we propose the use of a dynamic model in which the effects of temperature on friction are considered, in order to develop a virtual force sensor for industrial robot manipulators. The estimation of the inertial parameters and of the friction model is explained. The effectiveness of the virtual force sensor has been proven in a polishing task: the interaction forces between the robot and the environment have been measured with both the virtual force sensor and a conventional load cell. Moreover, the advantages provided by considering the temperature dependency are highlighted.
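
    A virtual force sensor of this kind can be sketched as an inverse-dynamics comparison: whatever joint torque the identified model (including friction) cannot explain is attributed to the interaction and mapped to a Cartesian wrench. The temperature-dependent viscous term below is only an illustrative parameterization, not the friction model identified in the paper.

    import numpy as np

    def friction_torque(qdot, temp, f_c, f_v0, alpha, temp_ref=25.0):
        """Coulomb + viscous friction, with viscosity decreasing as the joint warms up
        (illustrative assumption)."""
        f_v = f_v0 * np.exp(-alpha * (temp - temp_ref))
        return f_c * np.sign(qdot) + f_v * qdot

    def external_wrench(tau_measured, M, C, g, qdot, qddot, temp, J, fric_params):
        """Estimate the wrench exchanged with the environment without a load cell."""
        tau_model = M @ qddot + C @ qdot + g + friction_torque(qdot, temp, *fric_params)
        tau_ext = tau_measured - tau_model        # joint torque the model cannot explain
        return np.linalg.pinv(J.T) @ tau_ext      # map to a Cartesian wrench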

    Calibration and External Force Sensing for Soft Robots using an RGB-D Camera

    Get PDF
    Benefiting from the deformability of soft robots, calibration and force sensing for soft robots are possible using an external vision-based system instead of embedded mechatronic force sensors. In this paper, we first propose a calibration method that calibrates both the sensor-robot coordinate system and the actuator inputs. This task is addressed through a sequential optimization problem over both sets of variables. We also introduce an external force sensing system based on a real-time Finite Element (FE) model under the assumption of static configurations, which consists of two steps: force location detection and force intensity computation. The algorithm that estimates the force location relies on the segmentation of the point cloud acquired by an RGB-D camera. The force intensities are then computed by solving an inverse quasi-static problem based on matching the FE model with the point cloud of the soft robot. For validation, the proposed strategies for calibration and force sensing have been tested on a parallel soft robot driven by four cables.
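
    The two-step force sensing idea can be illustrated as follows: first locate the contact as the region where the observed point cloud deviates most from the force-free FE prediction, then search for the force intensity whose quasi-static FE solution best matches the cloud. The fe_model object and its solve_static call are placeholders for a real-time FE solver, and the simple grid search stands in for the inverse quasi-static optimization used in the paper.

    import numpy as np

    def locate_contact(point_cloud, fe_surface_points):
        """Return the index of the FE node closest to the largest observed deviation."""
        d = np.linalg.norm(point_cloud[:, None, :] - fe_surface_points[None, :, :], axis=2)
        nearest = d.min(axis=1)                # distance of each observed point to the model
        worst_obs = np.argmax(nearest)         # most displaced observed point
        return int(np.argmin(d[worst_obs]))    # matching FE node index

    def fit_intensity(point_cloud, fe_model, node, direction, f_grid):
        """Pick the force magnitude whose quasi-static FE solution best matches the cloud."""
        errors = []
        for f in f_grid:                                            # 1-D search over magnitudes
            predicted = fe_model.solve_static(node, f * direction)  # placeholder FE API
            d = np.linalg.norm(point_cloud[:, None, :] - predicted[None, :, :], axis=2)
            errors.append(d.min(axis=1).mean())                     # cloud-to-model matching error
        return f_grid[int(np.argmin(errors))]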

    Proprioception for collision detection

    Get PDF
    Nowadays a lot of research is done in the field of human-robot interaction to allow robots to work in human environments. Haptic perception is crucial for human and robot safety since it allows contacts to be detected. Being able to detect contact between a robot and its environment is also important for surface exploration. This thesis presents a probabilistic representation of contacts in order to complement or, in some cases, replace computer vision systems. Torque measurements were used to determine probable collisions (i.e. their positions and forces) on the robot links. The suggested algorithms were tested in simulation, and experiments were afterwards carried out with the Kuka LWR robot arm.
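
    A minimal sketch of the idea, under the assumption that model-based torque residuals are available for each joint: the residuals are squashed into a per-link contact pseudo-probability, and the furthest significantly loaded joint points at the colliding link. The thresholds and the logistic mapping are illustrative, not the probabilistic representation developed in the thesis.

    import numpy as np

    def collision_probability(tau_residual, noise_std, sensitivity=3.0):
        """Map each joint's unexplained torque to a pseudo-probability of contact
        at or beyond that joint's link."""
        z = (np.abs(tau_residual) - sensitivity * noise_std) / noise_std
        return 1.0 / (1.0 + np.exp(-z))        # logistic squashing

    def most_likely_link(tau_residual, noise_std):
        """Contacts only load the joints that precede them, so the furthest joint
        with a significant residual indicates the colliding link."""
        p = collision_probability(tau_residual, noise_std)
        significant = np.where(p > 0.5)[0]
        return int(significant.max()) if significant.size else None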

    Object Handovers: a Review for Robotics

    Full text link
    This article surveys the literature on human-robot object handovers. A handover is a collaborative joint action where an agent, the giver, gives an object to another agent, the receiver. The physical exchange starts when the receiver first contacts the object held by the giver and ends when the giver fully releases the object to the receiver. However, important cognitive and physical processes begin before the physical exchange, including initiating implicit agreement with respect to the location and timing of the exchange. From this perspective, we structure our review into the two main phases delimited by the aforementioned events: 1) a pre-handover phase, and 2) the physical exchange. We focus our analysis on the two actors (giver and receiver) and report the state of the art of robotic givers (robot-to-human handovers) and of robotic receivers (human-to-robot handovers). We report a comprehensive list of qualitative and quantitative metrics commonly used to assess the interaction. While focusing our review on the cognitive level (e.g., prediction, perception, motion planning, learning) and the physical level (e.g., motion, grasping, grip release) of the handover, we also briefly discuss the concepts of safety, social context, and ergonomics. We compare the behaviours displayed during human-to-human handovers to the state of the art of robotic assistants, and identify the major areas of improvement for robotic assistants to reach performance comparable to human interactions. Finally, we propose a minimal set of metrics that should be used in order to enable a fair comparison among the approaches.
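
    One concrete policy often discussed for the grip-release part of the physical exchange is grip-force modulation by the giver in proportion to the load already taken by the receiver. The sketch below is an illustrative stand-in for that family of approaches, not a method proposed by this survey; the sensing of the transferred load and the force limits are assumed.

    def giver_grip_command(object_weight, load_taken_by_receiver,
                           min_grip=0.0, max_grip=20.0):
        """Return a grip-force set-point that decreases as the receiver takes over the load."""
        remaining = max(0.0, object_weight - load_taken_by_receiver)
        fraction = remaining / object_weight if object_weight > 0 else 0.0
        return min_grip + fraction * (max_grip - min_grip)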

    Physical Interaction of Autonomous Robots in Complex Environments

    Get PDF
    Recent breakthroughs in the fields of computer vision and robotics are firmly changing people's perception of robots. The idea of robots that substitute humans is now turning into robots that collaborate with them. Service robotics considers robots as personal assistants: it safely places robots in domestic environments in order to facilitate humans' daily life. Industrial robotics is now reconsidering its basic idea of the robot as a worker. Currently, the primary method to guarantee the personnel's safety in industrial environments is the installation of physical barriers around the working area of robots. The development of new technologies and new algorithms in the sensing and robotics fields has led to a new generation of lightweight and collaborative robots. Therefore, industrial robotics has leveraged the intrinsic properties of this kind of robot to generate a robot co-worker that is able to safely coexist, collaborate and interact with both personnel and objects inside its workspace. This Ph.D. dissertation focuses on the generation of a pipeline for fast object pose estimation and distance computation of moving objects, in both structured and unstructured environments, using RGB-D images. This pipeline outputs the command actions which let the robot complete its main task while at the same time fulfilling a safe human-robot coexistence behaviour. The proposed pipeline is divided into an object segmentation part, a 6 D.o.F. object pose estimation part and a real-time collision avoidance part for safe human-robot coexistence. Firstly, the segmentation module finds candidate object clusters in RGB-D images of cluttered scenes using a graph-based image segmentation technique, which generates a cluster of pixels for each object found in the image. The candidate object clusters are then fed as input to the 6 D.o.F. object pose estimation module, which is in charge of estimating both the translation and the orientation in 3D space of each candidate object cluster. The object pose is then employed by the robotic arm to compute a suitable grasping policy. The last module generates a force vector field of the environment surrounding the robot, the objects and the humans. This force vector field drives the robot toward its goal while any potential collision with objects and/or humans is safely avoided. This work has been carried out at Politecnico di Torino, in collaboration with Telecom Italia S.p.A.
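
    The force-vector-field module can be pictured as an artificial-potential-field controller: an attractive term pulls the end effector toward its goal while repulsive terms push it away from obstacle points extracted from the RGB-D data. The sketch below is a simplified, Khatib-style stand-in with illustrative gains, not the exact field used in the dissertation.

    import numpy as np

    def command_velocity(ee_pos, goal, obstacle_points,
                         k_att=1.0, k_rep=0.05, influence=0.30, v_max=0.25):
        v = k_att * (goal - ee_pos)                    # attractive component toward the goal
        for p in obstacle_points:                      # points belonging to humans/objects
            diff = ee_pos - p
            d = np.linalg.norm(diff)
            if 1e-6 < d < influence:
                # classic repulsive potential gradient, active only inside the influence radius
                v += k_rep * (1.0 / d - 1.0 / influence) / d**2 * (diff / d)
            elif d <= 1e-6:
                v += k_rep * np.random.randn(3)        # escape a degenerate overlap
        speed = np.linalg.norm(v)
        return v if speed <= v_max else v * (v_max / speed)   # saturate the command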

    Progress and Prospects of the Human-Robot Collaboration

    Get PDF
    International audienceRecent technological advances in hardware designof the robotic platforms enabled the implementationof various control modalities for improved interactions withhumans and unstructured environments. An important applicationarea for the integration of robots with such advancedinteraction capabilities is human-robot collaboration. Thisaspect represents high socio-economic impacts and maintainsthe sense of purpose of the involved people, as the robotsdo not completely replace the humans from the workprocess. The research community’s recent surge of interestin this area has been devoted to the implementation of variousmethodologies to achieve intuitive and seamless humanrobot-environment interactions by incorporating the collaborativepartners’ superior capabilities, e.g. human’s cognitiveand robot’s physical power generation capacity. In fact,the main purpose of this paper is to review the state-of-thearton intermediate human-robot interfaces (bi-directional),robot control modalities, system stability, benchmarking andrelevant use cases, and to extend views on the required futuredevelopments in the realm of human-robot collaboration

    Motion control of mobile manipulators for human-robot interaction using Cartesian model predictive control

    Get PDF
    For human-robot interaction, this work presents a method for monitoring the robot's complex, dynamic environment. The robot motion is controlled based on the concept of model predictive control, taking into account the detected obstacles and the contacts occurring between the robot and its environment, in order to avoid collisions and to react appropriately to contacts. The approaches are validated on a mobile manipulator.
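
    A simplified, kinematic stand-in for the approach: at every control cycle a short sequence of Cartesian velocities is optimized so that the predicted positions approach the goal while keeping a safety distance to the detected obstacles, and only the first command is applied. The horizon length, weights and the SciPy-based solver are assumptions for illustration, not the formulation used in the thesis.

    import numpy as np
    from scipy.optimize import minimize

    def mpc_step(x0, goal, obstacles, horizon=5, dt=0.1, d_safe=0.20, v_max=0.3):
        n = 3 * horizon                                 # stacked Cartesian velocity sequence

        def rollout(v_flat):
            v = v_flat.reshape(horizon, 3)
            return x0 + np.cumsum(v * dt, axis=0)       # predicted positions over the horizon

        def cost(v_flat):
            x = rollout(v_flat)
            v = v_flat.reshape(horizon, 3)
            return np.sum((x - goal) ** 2) + 0.1 * np.sum(v ** 2)

        def clearance(v_flat):                          # must stay >= 0 (collision avoidance)
            x = rollout(v_flat)
            d = np.linalg.norm(x[:, None, :] - obstacles[None, :, :], axis=2)
            return d.min() - d_safe

        res = minimize(cost, np.zeros(n), method="SLSQP",
                       bounds=[(-v_max, v_max)] * n,
                       constraints=[{"type": "ineq", "fun": clearance}])
        return res.x.reshape(horizon, 3)[0]             # apply only the first command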