    From human-human collaboration to human-robot collaboration: automated generation of assembly task knowledge model

    Task knowledge is essential for robots to proactively perform collaborative assembly tasks with a human partner. Representations of task knowledge, such as task graphs and robot skill libraries, are usually defined manually by human experts. In this paper, rather than learning from demonstrations of a single agent, we propose a system that automatically constructs task knowledge models from dual-human demonstrations in a real environment. First, we track and segment video demonstrations into sequences of action primitives. Second, a graph-based algorithm extracts the structural information of a task from these action sequences, producing task graphs as output. Finally, the action primitives, together with interactive information between agents and temporal constraints, are modelled into a structured semantic model. The proposed system is validated in an IKEA table assembly experiment.
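    The graph-extraction step described above can be illustrated with a minimal sketch: given several demonstration sequences of action primitives, infer a precedence edge A → B whenever A occurs before B in every demonstration containing both. The action names and the specific inference rule are hypothetical illustrations, not the paper's actual algorithm.

    ```python
    from itertools import combinations

    def build_task_graph(demonstrations):
        """Infer a set of precedence edges (a, b) meaning 'a must precede b'.

        An edge a -> b is added only when a appears before b in every
        demonstration that contains both actions, and the reverse order
        is never observed. Actions with no consistent order stay unordered,
        which is how a task graph captures interchangeable assembly steps.
        """
        actions = {a for demo in demonstrations for a in demo}
        edges = set()
        for a, b in combinations(sorted(actions), 2):
            shared = [d for d in demonstrations if a in d and b in d]
            a_first = all(d.index(a) < d.index(b) for d in shared)
            b_first = all(d.index(b) < d.index(a) for d in shared)
            if shared and a_first and not b_first:
                edges.add((a, b))
            elif shared and b_first and not a_first:
                edges.add((b, a))
        return edges

    # Two hypothetical dual-human demonstrations of the same task.
    demos = [
        ["attach_leg", "flip_table", "insert_screw"],
        ["insert_screw", "attach_leg", "flip_table"],
    ]
    # attach_leg precedes flip_table in both demos; insert_screw's
    # position varies, so it remains unordered relative to the others.
    print(build_task_graph(demos))
    ```

    Ordering constraints that hold across all demonstrations become edges of the task graph, while steps demonstrated in different orders are left unconstrained, letting a robot choose either order at execution time.
    
    
    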