2,507 research outputs found

    A Human-in-the-Loop Cyber-Physical System for Collaborative Assembly in Smart Manufacturing

    Industry 4.0 rose with the introduction of cyber-physical systems (CPS) and the Internet of Things (IoT) into manufacturing systems. CPS are self-controlled physical processes with tight networking capabilities and efficient interfaces for human interaction. The interactive dimension of CPS reaches its maximum when realized as natural human-machine interfaces (NHMI), i.e., interfaces that reduce the technological barriers to interaction. This paper presents an NHMI that brings human decision-making capabilities inside the cybernetic control loop of a smart manufacturing assembly system. The interface allows the operator to control, coordinate, and cooperate with an industrial cobot during task execution.
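    The control/coordinate/cooperate loop described in this abstract can be pictured with a minimal sketch (Python). All names here (nhmi.poll, cobot.continue_task, and so on) are hypothetical stand-ins, not the paper's API; the point is only that the human's high-level decision enters the control loop at every cycle.

        class HumanInTheLoopController:
            """Merge NHMI commands into the control loop of a collaborative cell."""

            def __init__(self, nhmi, cobot):
                self.nhmi = nhmi    # natural human-machine interface (assumed API)
                self.cobot = cobot  # industrial cobot driver (assumed API)

            def step(self):
                # Poll the interface each control cycle; a human decision,
                # when present, overrides or re-targets autonomous execution.
                command = self.nhmi.poll()
                if command is None:
                    self.cobot.continue_task()              # control: autonomous step
                elif command.kind == "hold":
                    self.cobot.hold()                       # coordinate: pause for the human
                elif command.kind == "select_part":
                    self.cobot.set_target(command.payload)  # cooperate: human picks the part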

    PoinTap system: a human-robot interface to enable remotely controlled tasks

    In recent decades, industrial manipulators have been used to speed up production and to perform tasks that may put humans at risk. Typical interfaces for teleoperating a robot, however, are not intuitive: a hard-to-use interface takes longer to learn and to control properly, and it can increase the operator's stress and mental workload. This paper proposes a touchscreen interface for supervised assembly tasks, using an LCD screen and a hand-tracking sensor. The aim is to provide an intuitive remotely controlled system that enables flexible execution of assembly tasks: high-level decisions are entrusted to the human operator, while the robot executes pick-and-place operations. A demonstrative industrial case study showcases the system's potential: it was first tested in simulation and then experimentally validated on a real robot in a laboratory environment.
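    The division of labor this abstract describes (the human makes the high-level decisions, the robot executes pick-and-place) can be sketched as a tap handler. This is an illustrative sketch under assumed APIs: screen_to_workspace, detect_part_near, pick, and place are hypothetical names, not the PoinTap implementation.

        def on_tap(tap_xy, robot, calibration):
            """Turn a touchscreen tap into a supervised pick-and-place action."""
            goal = calibration.screen_to_workspace(tap_xy)  # pixel -> workspace pose (assumed)
            part = robot.detect_part_near(goal)             # assumed perception call
            if part is not None:
                robot.pick(part.pose)   # robot handles the low-level motion
                robot.place(goal)       # the operator only chose what goes where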

    Goal Set Inverse Optimal Control and Iterative Re-planning for Predicting Human Reaching Motions in Shared Workspaces

    To enable safe and efficient human-robot collaboration in shared workspaces it is important for the robot to predict how a human will move when performing a task. While predicting human motion for tasks not known a priori is very challenging, we argue that single-arm reaching motions for known tasks in collaborative settings (which are especially relevant for manufacturing) are indeed predictable. Two hypotheses underlie our approach for predicting such motions: first, that the trajectory the human performs is optimal with respect to an unknown cost function, and second, that human adaptation to their partner's motion can be captured well through iterative re-planning with the above cost function. The key to our approach is thus to learn a cost function which "explains" the motion of the human. To do this, we gather example trajectories from pairs of participants performing a collaborative assembly task using motion capture. We then use Inverse Optimal Control to learn a cost function from these trajectories. Finally, we predict reaching motions from the human's current configuration to a task-space goal region by iteratively re-planning a trajectory using the learned cost function. Our planning algorithm is based on the trajectory optimizer STOMP; it plans for a 23-DoF human kinematic model and accounts for the presence of a moving collaborator and obstacles in the environment. Our results suggest that in most cases our method outperforms baseline methods when predicting motions. We also show that our method outperforms baselines for predicting human motion when a human and a robot share the workspace.
    Comment: 12 pages, accepted for publication in IEEE Transactions on Robotics 201
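    The pipeline in this abstract (learn a cost function from demonstrations, then predict by iteratively re-planning) can be sketched in a few dozen lines of Python. The sketch below is a strong simplification under assumed inputs: the feature functions are illustrative, the weight update is a generic IOC-style update rather than the paper's exact method, the goal region is reduced to a single goal configuration, and the prediction loop keeps the best of a set of noisy rollouts, whereas real STOMP combines rollouts by exponentiated cost weights and here plans for a 23-DoF human model.

        import numpy as np

        def trajectory_cost(traj, w, features):
            """Cost = weighted sum of trajectory features (e.g. smoothness,
            distance to the collaborator); the feature set is illustrative."""
            return sum(wi * f(traj) for wi, f in zip(w, features))

        def learn_weights(demos, alternatives, features, lr=0.01, iters=200):
            """IOC-style update (simplified): push the demonstrated trajectory's
            cost below the average cost of sampled alternative trajectories."""
            w = np.ones(len(features))
            for _ in range(iters):
                for demo, alts in zip(demos, alternatives):
                    phi_demo = np.array([f(demo) for f in features])
                    phi_alts = np.mean([[f(a) for f in features] for a in alts], axis=0)
                    w -= lr * (phi_demo - phi_alts)  # lower demo cost relative to alternatives
                    w = np.clip(w, 0.0, None)        # keep feature weights non-negative
            return w

        def predict_reaching_motion(q_start, q_goal, w, features,
                                    n_iters=50, n_samples=20, noise=0.05):
            """Iterative re-planning (simplified STOMP flavor): perturb the
            current trajectory, keep the lowest-cost candidate, repeat."""
            traj = np.linspace(q_start, q_goal, num=30)  # straight-line initialization
            for _ in range(n_iters):
                candidates = [traj]
                for _ in range(n_samples):
                    c = traj + noise * np.random.randn(*traj.shape)
                    c[0], c[-1] = traj[0], traj[-1]  # keep start and goal fixed
                    candidates.append(c)
                traj = min(candidates, key=lambda t: trajectory_cost(t, w, features))
            return traj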