Flexible multi-sensorial system for automatic disassembly using cooperative robots
Flexible multi-sensorial systems play an important role in today's industry when disassembly and recycling tasks must be performed. These tasks can be carried out by a human operator or by a robotic system. This paper presents a robotic system for performing the required tasks. The system considers how to distribute the tasks needed to disassemble a component among several robots working in parallel or cooperatively. The proposed task-distribution algorithm takes into account the characteristics of each task and the sequence that must be followed to disassemble the product.
Furthermore, this paper presents a disassembly system based on a sensorized cooperative-robot interaction framework for motion planning and object detection during disassembly tasks. To determine the disassembly sequence of a given product, a new strategy for distributing a set of tasks among robots is presented. Subsequently, the visual detection system used to detect targets and features is described. This detection process applies several well-known strategies, such as template matching, polygonal approximation, and edge detection. Finally, a visual-force control system has been implemented in order to track disassembly trajectories. An important aspect of this system is the processing of the sensory information to guarantee coherence, which allows the visual and force sensors to be applied to disassembly tasks in a coordinated way. The proposed system is validated by experiments on several types of components, such as battery covers and electronic circuits from toys, and drives and screws from PCs. This work was funded by the Spanish MCYT project ‘Diseño, implementación y experimentación de escenarios de manipulación inteligentes para aplicaciones de ensamblado y desensamblado automático’ (DPI2005-06222) and by the project ‘DESAURO: Desensamblado Automático Selectivo para Reciclado mediante Robots Cooperativos y Sistema Multisensorial’ (DPI2002-02103).
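The task-distribution step described above can be illustrated with a small greedy list-scheduling sketch. This is not the paper's actual algorithm; the task names, durations, and precedence constraints below are illustrative assumptions.

```python
def distribute_tasks(tasks, n_robots):
    """Greedy list scheduling of precedence-constrained disassembly tasks.

    tasks: {name: (duration, set_of_prerequisite_names)}; the dependency
    graph must be acyclic. Returns {name: finish_time}."""
    finish = {}                      # task -> scheduled finish time
    robot_free = [0.0] * n_robots    # when each robot becomes idle
    unscheduled = set(tasks)
    while unscheduled:
        # Tasks whose prerequisites have all been scheduled already.
        ready = [t for t in sorted(unscheduled) if tasks[t][1] <= finish.keys()]
        # Pick the ready task that can start earliest...
        t = min(ready, key=lambda t: max((finish[p] for p in tasks[t][1]), default=0.0))
        # ...and assign it to the robot that becomes idle first.
        r = min(range(n_robots), key=robot_free.__getitem__)
        start = max(robot_free[r],
                    max((finish[p] for p in tasks[t][1]), default=0.0))
        finish[t] = start + tasks[t][0]
        robot_free[r] = finish[t]
        unscheduled.remove(t)
    return finish

# Toy PC-disassembly example with two cooperating robots
# (task names and durations are made up for illustration).
tasks = {
    "remove_cover":  (2.0, set()),
    "unscrew_drive": (3.0, {"remove_cover"}),
    "unscrew_board": (4.0, {"remove_cover"}),
    "extract_drive": (1.0, {"unscrew_drive"}),
}
plan = distribute_tasks(tasks, n_robots=2)
print(max(plan.values()))  # makespan: 6.0
```

Because the two unscrewing tasks share only one prerequisite, the two robots handle them in parallel, which is the kind of distribution the abstract describes.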
Robotic disassembly of waste electrical and electronic equipment
Waste electrical and electronic equipment (WEEE) is the world’s fastest-growing form of waste. Inappropriate disposal of WEEE damages ecosystems and local communities because of the hazardous materials and toxic chemicals present in electronic products. In the shredding and crushing treatments of WEEE, high-value metals present in small quantities are dissipated and the embodied energy from manufacturing is lost. On the other hand, manual disassembly is costly and presents safety concerns for human workers. Robotic disassembly is therefore an ideal approach to the treatment of WEEE. Despite extensive research in the field, large variations and uncertainties in product structures, models, and conditions remain a major limitation to the adoption of automation and robotics in the waste industry. The ability of a robotic disassembly system to learn new product structures and to reason about existing knowledge of product structure is vital to addressing this challenge.
This thesis explores robotic disassembly for WEEE by building upon an existing research disassembly rig for LCD monitors and expanding it to address other product families. The updated disassembly system uses a modular framework consisting of a Cognition module, a Perception module, and an Operation module in order to address the uncertainties present in end-of-life (EoL) products. A novel disassembly ontology is designed and developed with an upper and a lower ontology structure, representing generic disassembly knowledge and product-family-specific knowledge respectively. Furthermore, a Learning framework enables automated expansion of the ontology using past disassembly experiences and user demonstration. These methodologies form the main function of the Cognition module, which aids the Perception module and instructs the Operation module. The disassembly ontology and Learning framework are verified independently of the rest of the system before being integrated and validated with real disassembly runs of LCD monitors and keyboards. In this way, the disassembly system’s ability to handle both known and unknown EoL product types, as well as to learn new product types, is demonstrated.
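The upper/lower ontology split can be sketched in a few lines. This is a hypothetical illustration rather than the thesis's actual ontology: the concept names, properties, and the plain-dictionary representation are assumptions made for the example.

```python
# Upper ontology: generic disassembly knowledge shared by all product families.
# (All concept names and links here are illustrative assumptions.)
upper_ontology = {
    "Component":         {"subclass_of": None},
    "Fastener":          {"subclass_of": "Component"},
    "DisassemblyAction": {"subclass_of": None},
    "Unscrew":           {"subclass_of": "DisassemblyAction", "removes": "Fastener"},
}

# Lower ontology: product-family-specific knowledge, e.g. for LCD monitors.
lcd_monitor_ontology = {
    "BackCover":      {"subclass_of": "Component", "fastened_by": "CrossheadScrew"},
    "CrossheadScrew": {"subclass_of": "Fastener"},
}

def is_a(concept, ancestor, ontology):
    """Follow subclass_of links upward through the merged ontology."""
    while concept is not None:
        if concept == ancestor:
            return True
        concept = ontology.get(concept, {}).get("subclass_of")
    return False

# Merging the two layers lets generic rules (e.g. "Unscrew removes Fasteners")
# apply to concepts learned for a specific product family.
merged = {**upper_ontology, **lcd_monitor_ontology}
print(is_a("CrossheadScrew", "Fastener", merged))   # True
print(is_a("CrossheadScrew", "Component", merged))  # True
```

Under this split, learning a new product family means adding a lower-ontology layer while the upper layer, and any reasoning written against it, stays unchanged.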