
    Jak uspołecznić robota: Organizacja przestrzenna i multimodalne interakcje semiotyczne w laboratorium robotyki społecznej

    Researchers in social robotics design their robots to function as social agents in interaction with humans and with other robots. While we do not deny that a robot's physical features and its software matter for achieving this goal, we wish to draw attention to the importance of spatial organization and of the processes coordinating the robot's interactions with humans. We studied these interactions through observations in an "extended" social robotics laboratory. In this text we carry out a multimodal interaction analysis of two moments in the practice of social-robot designers. We describe the key role played by the roboticists themselves and by a group of small, not-yet-speaking children who were involved in the robot design process. We argue that the social character of the designed machine is substantially tied to the subtlety of human behavior in the laboratory. This human engagement in producing the robot's social agency is not a matter of individual will. Rather, the alignment of machines and humans is demanded by the situational dynamics in which the robot is embedded.

    Investigating the influence of situations and expectations on user behavior: empirical analyses in human-robot interaction

    Lohse M. Investigating the influence of situations and expectations on user behavior: empirical analyses in human-robot interaction. Bielefeld (Germany): Bielefeld University; 2010.
    Social sciences are becoming increasingly important for robotics research as work goes on to enable service robots to interact with inexperienced users. This endeavor can only be successful if the robots learn to interpret the users' behavior reliably and, in turn, provide feedback that enables the users to understand the robot. In order to achieve this goal, the thesis introduces an approach to describe the interaction situation as a dynamic construct with different levels of specificity. The situation concept is the starting point for a model which aims to explain the users' behavior. The second important component of the model is the users' expectations with respect to the robot. Both the situation and the expectations are shown to be the main determinants of the users' behaviors. With this theoretical background in mind, the thesis examines interactions from a home tour scenario in which a human teaches a robot about rooms and the objects within them. To analyze human expectations and behaviors in this situation, two novel methods have been developed. In particular, a quantitative method for the analysis of the users' behavior repertoires (speech, gesture, eye gaze, body orientation, etc.) is introduced. This approach focuses on the interaction level, which describes the interplay between the robot and the user. In the second novel method, the system level is also taken into account, which includes the robot components and their interplay. This method serves for a detailed task analysis and helps to identify problems that occur in the interaction. By applying these methods, the thesis contributes to the identification of underlying expectations that allow future user behavior to be predicted in particular situations.
Knowledge about the users' behavior repertoires serves as a cue for the robot about the state of the interaction and the task the users aim to accomplish. It therefore enables robot developers to adapt the components' interaction models to the situation, the users' actual expectations, and their behaviors. The work provides a deeper understanding of the role of expectations in human-robot interaction and contributes to the interaction and system design of interactive robots.

    Automatic extraction of constraints in manipulation tasks for autonomy and interaction

    Tasks routinely executed by humans involve sequences of actions performed with high dexterity and coordination. Fully specifying these actions such that a robot could replicate the task is often difficult. Furthermore, the uncertainties introduced by the use of different tools or changing configurations demand that the specification be generic while emphasizing the important task aspects, i.e., the constraints. The first challenge of this thesis is therefore inferring these constraints from repeated demonstrations. In addition, humans explaining a task to another person rely on that person's ability to apprehend missing or implicit information, so observations contain user-specific cues alongside knowledge of how to perform the task. Our second challenge is thus correlating the task constraints with the user's behavior to improve the robot's performance. We address these challenges using a Programming by Demonstration framework. In the first part of the thesis we describe an approach for decomposing demonstrations into actions and extracting task-space constraints as continuous features that apply throughout each action. The constraints consist of: (1) the reference frame for performing manipulation, (2) the variables of interest relative to this frame, allowing a decomposition into force and position control, and (3) a stiffness gain modulating the contributions of force and position. We then extend this approach to asymmetrical bimanual tasks by extracting features that enable arm coordination: the master-slave role that establishes precedence, and the motion-motion or force-motion coordination that facilitates physical interaction through an object. The set of constraints and the time-independent encoding of each action form a task prototype, used to execute the task.
In the second part of the thesis we focus on discovering additional features implicit in the demonstrations with respect to two aspects of the teaching interactions: (1) characterizing the user's performance and (2) improving the user's behavior. For the first goal we assess the skill of the user, and implicitly the quality of the demonstrations, by using objective task-specific metrics related directly to the constraints. We further analyze ways of making the user aware of the robot's state during teaching by providing task-related feedback. The feedback has a direct influence on both the teaching efficiency and the user's perception of the interaction. We evaluated our approaches in robotic experiments encompassing daily activities, using two 7-degree-of-freedom KUKA LWR robotic arms and a 53-degree-of-freedom iCub humanoid robot.
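The constraint triple described above (a reference frame, the variables of interest in that frame, and a stiffness gain modulating force versus position contributions) can be sketched as a simple blended control law. This is a minimal illustration, not the thesis's actual formulation: the function and parameter names are hypothetical, and the linear blend is a common simplification of hybrid force-position control.

```python
import numpy as np

def blended_command(x, x_des, f_des, k, Kp=None):
    """Blend position tracking and force application for one action.

    x, x_des : current and desired position, expressed in the action's
               reference frame (the first extracted constraint).
    f_des    : desired contact force along the variables of interest.
    k        : hypothetical stiffness gain in [0, 1]; k=1 yields pure
               position control, k=0 pure force control.
    """
    if Kp is None:
        Kp = np.diag([100.0, 100.0, 100.0])  # illustrative stiffness matrix
    pos_term = Kp @ (x_des - x)   # position-tracking contribution
    force_term = f_des            # desired-force contribution
    return k * pos_term + (1.0 - k) * force_term

# Example: near the target, with k low, the command is dominated by the
# desired contact force rather than the position error.
u = blended_command(np.zeros(3),
                    np.array([0.01, 0.0, 0.0]),
                    np.array([0.0, 0.0, 5.0]),
                    k=0.2)
```

In the thesis's setting the gain and reference frame would be learned from repeated demonstrations rather than fixed by hand; here they are placeholders to make the decomposition concrete.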