Unifying Skill-Based Programming and Programming by Demonstration through Ontologies

Abstract

Smart manufacturing requires easily reconfigurable robotic systems that increase flexibility in the presence of market uncertainties by reducing set-up times for new tasks. One enabler of fast reconfigurability is intuitive robot programming. On the one hand, offline skill-based programming (OSP) allows new tasks to be defined by sequencing pre-defined, parameterizable building blocks, termed skills, in a graphical user interface. On the other hand, programming by demonstration (PbD) is a well-known technique that uses kinesthetic teaching for intuitive robot programming. This work presents an approach that automatically recognizes skills from a human demonstration and parameterizes them using the recorded data. The approach further unifies the two programming modes, OSP and PbD, with the help of an ontological knowledge base and empowers the end user to choose the preferred mode for each phase of the task. In the experiments, we evaluate two scenarios in which the user selects different sequences of programming modes to define a task. In each scenario, skills are recognized by a data-driven classifier and automatically parameterized from the recorded data. The fully defined tasks consist of both manually added and automatically recognized skills and are executed in a realistic industrial assembly environment.
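To make the recognize-then-parameterize idea concrete, the following is a minimal sketch, not the paper's actual classifier: a toy rule (standing in for the data-driven classifier) labels a recorded demonstration segment as a skill, and the skill's parameters are then filled from the recorded data. The `Segment` structure, the gripper-state rule, and the parameter names are all illustrative assumptions.

```python
from dataclasses import dataclass

# Hypothetical recorded demonstration segment: TCP positions and gripper state
# per sample (the real system records richer data during kinesthetic teaching).
@dataclass
class Segment:
    positions: list       # [(x, y, z), ...] recorded TCP positions
    gripper_closed: list  # gripper state per sample (True = closed)

def classify_segment(seg: Segment) -> str:
    """Toy stand-in for the data-driven skill classifier: a gripper that
    closes during the segment suggests a 'pick', one that opens suggests
    a 'place'; otherwise the segment is a free-space 'move'."""
    if not seg.gripper_closed[0] and seg.gripper_closed[-1]:
        return "pick"
    if seg.gripper_closed[0] and not seg.gripper_closed[-1]:
        return "place"
    return "move"

def parameterize(seg: Segment, skill: str) -> dict:
    """Fill the recognized skill's parameters from the recorded data,
    e.g. the goal pose is taken from the segment's final TCP position."""
    return {"skill": skill,
            "start": seg.positions[0],
            "goal": seg.positions[-1]}

demo = Segment(positions=[(0.1, 0.0, 0.3), (0.2, 0.1, 0.1)],
               gripper_closed=[False, True])
skill = classify_segment(demo)
print(parameterize(demo, skill))
```

In the paper's system, such automatically recognized and parameterized skills are stored alongside manually added OSP skills in the ontological knowledge base, so both programming modes produce the same kind of task building block.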