
    Sharing emotions and space - empathy as a basis for cooperative spatial interaction

    Boukricha H, Nguyen N, Wachsmuth I. Sharing emotions and space - empathy as a basis for cooperative spatial interaction. In: Kopp S, Marsella S, Thorisson K, Vilhjalmsson HH, eds. Proceedings of the 11th International Conference on Intelligent Virtual Agents (IVA 2011). LNAI. Vol 6895. Berlin, Heidelberg: Springer; 2011: 350-362.

    Empathy is believed to play a major role as a basis for humans' cooperative behavior. Recent research shows that humans empathize with each other to different degrees depending on several modulation factors including, among others, their social relationships, their mood, and the situational context. In human spatial interaction, partners share and sustain a space that is equally and exclusively reachable to them, the so-called interaction space. In a cooperative interaction scenario of relocating objects in interaction space, we introduce an approach for triggering and modulating a virtual human's cooperative spatial behavior by its degree of empathy with its interaction partner. That is, spatial distances, such as object distances as well as the distances of arm and body movements while relocating objects in interaction space, are modulated by the virtual human's degree of empathy. In this scenario, the virtual human's empathic emotion is generated as a hypothesis about the partner's emotional state as related to the physical effort needed to perform a goal-directed spatial behavior.
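    The modulation the abstract describes can be pictured as scaling placement and movement distances by a degree of empathy. The sketch below is a minimal illustration under that reading; the function name, the linear blend, and the [0, 1] empathy range are assumptions for illustration, not the authors' model.

        # Hypothetical sketch: blending object placement between a
        # self-serving and a partner-serving position as the degree of
        # empathy grows (the linear blend is an assumption, not the paper's model).
        def modulated_distance(own_optimal, partner_optimal, empathy_degree):
            """empathy_degree in [0, 1]: 0 keeps effort low for the virtual
            human, 1 keeps effort low for the partner."""
            assert 0.0 <= empathy_degree <= 1.0
            return (1.0 - empathy_degree) * own_optimal + empathy_degree * partner_optimal

        # Example: with empathy 0.75, an object ends up closer to the
        # partner-optimal position (0.2 m) than to the self-optimal one (0.8 m):
        # modulated_distance(0.8, 0.2, 0.75) -> 0.35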

    Adaptive, fast walking in a biped robot under neuronal control and learning

    Human walking is a dynamic, partly self-stabilizing process relying on the interaction of the biomechanical design with its neuronal control. The coordination of this process is a very difficult problem, and it has been suggested that it involves a hierarchy of levels, where the lower ones, e.g., interactions between muscles and the spinal cord, are largely autonomous, and where higher-level control (e.g., cortical) arises only pointwise, as needed. This requires an architecture of several nested sensori-motor loops, where the walking process provides feedback signals to the walker's sensory systems, which can be used to coordinate its movements. To complicate the situation, at a maximal walking speed of more than four leg lengths per second, the cycle period available to coordinate all these loops is rather short. In this study we present a planar biped robot which uses the design principle of nested loops to combine the self-stabilizing properties of its biomechanical design with several levels of neuronal control. Specifically, we show how to adapt control by including online learning mechanisms based on simulated synaptic plasticity. This robot can walk at high speed (> 3.0 leg lengths/s), self-adapting to minor disturbances and reacting in a robust way to abruptly induced gait changes. At the same time, it can learn to walk on different terrains, requiring only a few learning experiences. This study shows that the tight coupling of physical with neuronal control, guided by sensory feedback from the walking pattern itself and combined with synaptic learning, may be a way forward to better understanding and solving coordination problems in other complex motor tasks.
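    The online learning mentioned in the abstract is based on simulated synaptic plasticity; one common form of such a rule in neuronal walking controllers is correlation-based learning, where a synapse from an early, predictive sensory signal is strengthened whenever that signal correlates with a change in a later reflex (error) signal. The sketch below is a generic version of such a rule; the names, the gain, and the exact signals are assumptions, not the robot's published controller.

        # Hypothetical sketch of a correlation-based (Hebbian-like) synaptic
        # update for online gait adaptation; signals and gain are illustrative.
        def correlation_update(w, predictive, reflex, reflex_prev, mu=0.01):
            """Strengthen weight w when the predictive input co-occurs with a
            change in the reflex signal, so the controller learns to act
            before the reflex would be triggered."""
            d_reflex = reflex - reflex_prev          # temporal change of the reflex
            return w + mu * predictive * d_reflex    # correlation term

        # The motor command then mixes the learned predictive path with the
        # innate reflex, e.g. u = w * predictive + w_reflex * reflex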

    Analyzing Whole-Body Pose Transitions in Multi-Contact Motions

    When executing whole-body motions, humans are able to use a large variety of support poses which not only utilize the feet, but also the hands, knees, and elbows to enhance stability. While there are many works analyzing the transitions involved in walking, very few analyze human motion in which more complex supports occur. In this work, we analyze complex support pose transitions in human motion involving locomotion and manipulation tasks (loco-manipulation). We have applied a method for the detection of human support contacts from motion capture data to a large-scale dataset of loco-manipulation motions involving multi-contact supports, providing a semantic representation of them. Our results provide a statistical analysis of the support poses used, their transitions, and the time spent in each of them. In addition, our data partially validates the taxonomy of whole-body support poses presented in our previous work. We believe that this work extends our understanding of human motion for humanoids, with the long-term objective of developing methods for autonomous multi-contact motion planning.

    Comment: 8 pages, IEEE-RAS International Conference on Humanoid Robots (Humanoids) 201
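    The reported statistics (which support poses occur, how they transition, and how long each is held) can be pictured as a first-order tally over a per-frame sequence of pose labels. The sketch below is a generic illustration of that computation, not the authors' detection pipeline; the pose labels are hypothetical.

        # Hypothetical sketch: tallying support-pose transitions and dwell
        # times from a labeled motion-capture sequence (labels are illustrative).
        from collections import Counter, defaultdict

        def pose_statistics(pose_labels, dt):
            """pose_labels: per-frame support-pose labels, e.g.
            ['feet', 'feet', 'feet+hand', ...]; dt: frame period in seconds."""
            transitions = Counter()
            dwell = defaultdict(float)
            for prev, curr in zip(pose_labels, pose_labels[1:]):
                dwell[prev] += dt
                if curr != prev:
                    transitions[(prev, curr)] += 1
            if pose_labels:
                dwell[pose_labels[-1]] += dt
            return transitions, dwell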

    Adaptive shared control system


    Intelligent humanoids in manufacturing to address worker shortage and skill gaps: Case of Tesla Optimus

    Technological evolution in the field of robotics has produced major breakthroughs in recent years, fostered especially by revolutionary new software applications leading to humanoid robots. Humanoids are being envisioned for manufacturing applications to form human-robot teams, but their implications for manufacturing practices, especially industrial safety standards and lean manufacturing, have been minimally addressed. Humanoids will also be competing with conventional robotic arms, and effective methods to assess their return on investment are needed. To study the next generation of industrial automation, we used the case context of the Tesla humanoid robot. The company has recently unveiled its project on an intelligent humanoid robot, named Optimus, to achieve an increased level of manufacturing automation. This article proposes a framework for integrating humanoids in manufacturing automation and also presents the significance of safety standards for human-robot collaboration. A case of a lean assembly cell for the manufacturing of an open-source medical ventilator was used for human-humanoid automation. Simulation results indicate that humanoids can increase the level of manufacturing automation. Managerial and research implications are presented.

    Predicting and preventing mistakes in human-robot collaborative assembly

    Human-robot collaboration (HRC) in industrial assembly cells brings great benefits by combining the flexibility of the human worker with the accuracy and strength of the robot. On the other hand, collaborative work between such different operators can generate risks and faults unknown in current industrial processes, whether manual or automatic. To fully exploit the new collaborative paradigm, it is therefore essential to identify these risks before collaborative robots are introduced in industry and start working together with humans. In the present study the authors analyze a benchmark set of general assembly tasks performed by HRC in a laboratory environment. The analyses are carried out with an adapted Process Failure Mode and Effects Analysis (PFMEA) to identify potential mistakes that can be made by the human operator and the robot. The outcomes are employed to define appropriate mistake-proofing methods to be applied in the HRC assembly work cell.
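    A PFMEA of the kind applied here ranks each potential failure mode by a risk priority number, RPN = severity x occurrence x detection, each scored on a 1-10 scale, so that the highest-ranked modes receive mistake-proofing measures first. The sketch below applies that standard scoring; the listed HRC failure modes are hypothetical examples, not findings of the study.

        # Hypothetical sketch: ranking HRC assembly failure modes by the
        # standard PFMEA risk priority number (RPN = S * O * D, 1-10 scales).
        # The failure modes below are illustrative, not taken from the study.
        failure_modes = [
            # (description, severity, occurrence, detection)
            ("operator reaches into robot path during part handover", 8, 4, 3),
            ("robot grasps wrong component after human re-sorts the tray", 5, 6, 4),
            ("operator skips a torque check, over-trusting the robot", 7, 3, 7),
        ]

        for desc, s, o, d in sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True):
            print(f"RPN={s * o * d:3d}  {desc}")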