    Upgrading legacy equipment to Industry 4.0 through a cyber-physical interface

    With the recent developments in Industry 4.0 technologies, maintenance can be improved significantly by making it “smart”, proactive and even self-aware. This paper introduces a new interfacing technology that enables smart, active remote maintenance directly on the machine in real time, while allowing the integration of automated decision making and the Industrial Internet of Things to upgrade existing legacy equipment with the latest Industry 4.0 technology. This interfacing technology provides remote sensing and actuation access to legacy equipment for smart maintenance by entirely non-intrusive means, i.e. the original equipment does not have to be modified. The design was implemented in a real-world manufacturing environment.
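    The abstract gives no implementation details; purely as an illustration of the non-intrusive interfacing idea, the following sketch (hypothetical names and values, standard-library Python only) shows a retrofit sensing node that samples externally mounted sensors on a legacy machine and emits telemetry together with a toy maintenance decision, without modifying the machine itself.

```python
# Illustrative sketch only, not the authors' implementation.
# A retrofit "cyber-physical interface" node: it samples external,
# clamp-on sensors attached to a legacy machine (non-intrusive, the
# machine itself stays unmodified) and emits JSON telemetry that a
# remote smart-maintenance service could consume.

import json
import random
import time
from dataclasses import dataclass, asdict


@dataclass
class Telemetry:
    machine_id: str
    timestamp: float
    vibration_rms: float   # hypothetical clamp-on accelerometer reading
    spindle_temp_c: float  # hypothetical infrared temperature reading


def read_external_sensors() -> Telemetry:
    """Simulate reading retrofit sensors mounted on the legacy machine."""
    return Telemetry(
        machine_id="legacy-press-07",
        timestamp=time.time(),
        vibration_rms=random.uniform(0.1, 2.5),
        spindle_temp_c=random.uniform(40.0, 90.0),
    )


def maintenance_decision(t: Telemetry) -> str:
    """Toy threshold rule standing in for smart automated decision making."""
    if t.vibration_rms > 2.0 or t.spindle_temp_c > 85.0:
        return "schedule-maintenance"
    return "ok"


if __name__ == "__main__":
    for _ in range(3):
        sample = read_external_sensors()
        # In a real deployment this JSON would be published to an IIoT
        # broker; here it is simply printed.
        print(json.dumps({**asdict(sample), "action": maintenance_decision(sample)}))
        time.sleep(1)
```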

    Layers of shared and cooperative control, assistance and automation

    Over the last centuries we have experienced scientific, technological and societal progress that has enabled the creation of intelligent assisted and automated machines with increasing abilities, which require a conscious distribution of roles and control between humans and machines. Machines can be more than either fully automated or manually controlled: they can work together with the human at different levels of assistance and automation in a hopefully beneficial cooperation. One form of cooperation is that the automation and the human have shared control over a situation, e.g. a vehicle in an environment. The objective of this paper is to provide a common meta model of shared and cooperative assistance and automation. The meta models, based on insights from the H(orse)-metaphor (Flemisch et al., 2003; Goodrich et al., 2006) and Human-Machine Cooperation principles (Hoc and Lemoine, 1998; Pacaux-Lemoine and Debernard, 2002; Pacaux-Lemoine, 2014), are presented and combined in order to propose a framework and criteria for designing safe, efficient, ecological and attractive systems. Cooperation is presented from different points of view, such as levels of activity (operational, tactical and strategic) (Lemoine et al., 1996) and the type of function shared between human and machine (information gathering, information analysis, decision selection, action implementation) (Parasuraman et al., 2000). Examples are provided in the aviation domain (e.g. Goodrich et al., 2012) and in the automotive domain with the automation of driving (Hoeger et al., 2008; Flemisch et al., 2016; Tricot et al., 2004; Pacaux-Lemoine et al., 2004; Pacaux-Lemoine et al., 2015).
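    To make the framework's vocabulary concrete, here is a minimal, hypothetical sketch (not from the paper) that encodes the levels of activity and the Parasuraman-style function types, and records which agent (human, machine, or shared) holds each function at each level in an example allocation.

```python
# Minimal sketch of the vocabulary used in the meta model: levels of
# activity x types of function, with each cell allocated to the human,
# the machine, or both (shared). Illustrative assumption only.

from enum import Enum


class Level(Enum):
    OPERATIONAL = "operational"   # stabilisation / control
    TACTICAL = "tactical"         # manoeuvres
    STRATEGIC = "strategic"       # goals, route, mission


class Function(Enum):
    INFORMATION_GATHERING = "information gathering"
    INFORMATION_ANALYSIS = "information analysis"
    DECISION_SELECTION = "decision selection"
    ACTION_IMPLEMENTATION = "action implementation"


class Agent(Enum):
    HUMAN = "human"
    MACHINE = "machine"
    SHARED = "shared"


# Example allocation for a highly assisted driving scenario (hypothetical).
allocation = {
    (Level.OPERATIONAL, Function.INFORMATION_GATHERING): Agent.MACHINE,
    (Level.OPERATIONAL, Function.ACTION_IMPLEMENTATION): Agent.SHARED,
    (Level.TACTICAL, Function.DECISION_SELECTION): Agent.SHARED,
    (Level.STRATEGIC, Function.DECISION_SELECTION): Agent.HUMAN,
}

if __name__ == "__main__":
    for (level, function), agent in allocation.items():
        print(f"{level.value:12s} | {function.value:25s} -> {agent.value}")
```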

    Emulated haptic shared control for brain-computer interfaces improves human-robot cooperation

    Today, technology provides many ways for humans to exchange their points of view about pretty much everything. Visual, audio and tactile media are most commonly used by humans, and they support communication in such a natural way that we don’t even actively think about using them. But what about people who have lost motor or sensory capabilities, for whom it is difficult or impossible to control or perceive the output of such technologies? In this case, perhaps the only way to communicate might be to use brain signals directly. The goal of this study is therefore to provide people with tetraplegia, who may be confined to their room or bed, with a telepresence tool that facilitates the daily interactions so many of us take for granted. In our case, the telepresence tool is a remotely controlled robot. It can act as a medium for the user in everyday life, creating a virtual link with friends and relatives located in remote rooms or places, or with different environments to explore. The objective is therefore to design a Human-Machine System that enables the control of a robot using thoughts alone. The technological part is composed of a brain-computer interface and a visual interface that together implement an “emulated haptic shared control” of the robot. Shared motion control is implemented between the user and the robot, along with an adaptive function allocation to manage the difficulty of the situation. The control scheme that exploits this “emulated haptic feedback” has been designed and evaluated using a Human-Machine Cooperation framework, and the benefit of this type of interaction has been evaluated with five participants. Initial results indicate better control and cooperation with the “emulated haptic feedback” than without.
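    The study's implementation is not reproduced here; as a rough, hypothetical sketch of the shared-motion-control idea described above, the following snippet blends the user's BCI-decoded command with the robot's own command, with the robot's authority adapting to the difficulty of the situation (e.g. obstacle proximity). All names and numbers are illustrative assumptions, and the blended command would be shown back to the user through the visual interface rather than through true haptics.

```python
# Illustrative sketch of shared motion control with adaptive authority.
# Hypothetical function names and values; not the study's implementation.

def robot_authority(obstacle_distance_m: float) -> float:
    """Give the robot more authority as obstacles get closer (range 0..1)."""
    return max(0.0, min(1.0, 1.0 - obstacle_distance_m / 2.0))


def blend_command(user_turn: float, robot_turn: float, authority: float) -> float:
    """Linear shared control: authority = 0 -> user alone, 1 -> robot alone."""
    return authority * robot_turn + (1.0 - authority) * user_turn


if __name__ == "__main__":
    # The user (via the BCI) wants a slight right turn; the robot proposes a
    # stronger left turn to avoid an obstacle 0.5 m away.
    user_turn, robot_turn = 0.2, -0.6
    a = robot_authority(obstacle_distance_m=0.5)
    blended = blend_command(user_turn, robot_turn, a)
    print(f"authority={a:.2f}, blended turn command={blended:.2f}")
```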

    Joining the blunt and the pointy end of the spear: Towards a common framework of joint action, human–machine cooperation, cooperative guidance and control, shared, traded and supervisory control

    To introduce this special issue on shared and cooperative control, we will look into the history of tools in cooperation between humans and aim to unify the plethora of related concepts and definitions that have been proposed in recent years, such as shared control, human–machine cooperation and cooperative guidance and control. Concretely, we provide definitions to relate these concepts and sketch a unifying framework of shared and cooperative control that sees the different concepts as different perspectives or foci on a common design space of shared intentionality, control and cooperation between humans and machines. One working hypothesis which the article explores is that shared control can be understood as cooperation at the control layer, while human–machine cooperation can include shared control, but can also extend towards cooperation at higher layers, e.g. of guidance and navigation, of maneuvers and goals. The relationship between shared control and human–machine cooperation is compared to the relationship between the sharp, pointy tip and the (blunt) shaft of a spear. Shared control is where cooperation comes sharply into effect at the control layer, but to be truly effective it should be supported by cooperation on all layers beyond the operational layer, e.g. on the tactical and strategic layers. A fourth layer addresses the meta-communication about the cooperation and supports the other three layers in a transversal way.
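    As a minimal illustration (an assumption-laden sketch, not content from the article), the working hypothesis can be written down as a mapping from concepts to the layers they cover: shared control at the control/operational layer, and human–machine cooperation potentially spanning all layers, including the transversal meta-communication layer.

```python
# Minimal sketch of the layer structure described in the framework.
# Illustrative only; layer names paraphrase the abstract.

LAYERS = [
    "operational (control)",
    "tactical (maneuvers)",
    "strategic (goals, navigation)",
    "meta-communication (about the cooperation)",
]

CONCEPT_COVERAGE = {
    "shared control": {"operational (control)"},
    "human-machine cooperation": set(LAYERS),  # may include shared control and extend upward
}

if __name__ == "__main__":
    for concept, covered in CONCEPT_COVERAGE.items():
        ordered = [layer for layer in LAYERS if layer in covered]
        print(f"{concept}: {', '.join(ordered)}")
```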