6 research outputs found

    Social cognition and robotics

    In our social world we continuously display nonverbal behavior during interaction. In particular, when meeting for the first time, we use these implicit signals to form judgments about each other, which is a cornerstone of cooperation and societal cohesion. The aim of the studies presented here was to examine which gaze patterns and other nonverbal signals, such as facial expressions, gestures and kinesics, are displayed during interaction, which signals are preferred, and which signals we base our social judgments on. Furthermore, it was investigated whether cultural context, German or Japanese, influences these interaction and decision-making patterns. One part of the dissertation concerned itself mainly with gaze behavior, as gaze is one of the most important tools humans use to function in the natural world: it allows monitoring the environment as well as signalling towards others. Thus, one goal of this dissertation was to measure whether attentional resources are captured, by examining potential gaze following in reaction to pointing gestures and gaze shifts of an interaction partner. In addition, intercultural differences in gaze reactions to direct gaze during various types of interaction were examined. For that purpose, a real-world dyadic interaction scenario was used in combination with a mobile eye tracker. The observed gaze patterns suggested that, independent of culture, interactants mostly ignore irrelevant directional cues and instead remain focused on the face of a conversation partner, at least while listening to that partner. The same pattern appeared when no directional signals were displayed. While speaking, on the other hand, interactants from Japan, in contrast to interactants from Germany, avert their gaze away from the face, which may be attributed to cultural norms.
As correctly assessing another person is a critical skill for humans, the second part of the dissertation investigated the basis on which humans make these social decisions. Specifically, nonverbal signals of trustworthiness and potential cooperativeness in Germany and in Japan were of interest. In one study, a mobile eye tracker was used to investigate intercultural differences in gaze patterns during the social judgment of a small number of sequentially presented potential cooperation partners. In another study, participants viewed video stimuli of faces, bodies, and faces + bodies of potential cooperation partners, to examine the basis of social decision making in more detail and to explore a wider variety of nonverbal behaviours in a more controlled manner. Results indicated that, while judging presenters' trustworthiness based on displayed nonverbal cues, German participants partly look away from the face and examine the body, in contrast to Japanese participants, who remain fixated mostly on the face. Furthermore, it was shown that body motion may be of particular importance for social judgment and that body motion of one's own culture seems to be preferred over that of a different culture. Lastly, nonverbal signals as a basis of decision making were explored in more detail by examining the behaviour of preferred interaction partners presented as video stimuli. In recent years, and presumably also in the future, the human social environment has been growing to include new types of interactants, such as robots. To ensure smooth interaction, robots therefore need to be adjusted to human social expectations, including their nonverbal behavior. That is one of the reasons why all results presented here were not only put in the context of human interaction and judgment, but also viewed in the context of human-robot interaction.

    Robots As Intentional Agents: Using Neuroscientific Methods to Make Robots Appear More Social

    Robots are increasingly envisaged as our future cohabitants. However, while considerable progress has been made in recent years in terms of their technological realization, the ability of robots to interact with humans in an intuitive and social way is still quite limited. An important challenge for social robotics is to determine how to design robots that can perceive the user’s needs, feelings, and intentions, and adapt to users over a broad range of cognitive abilities. It is conceivable that if robots were able to adequately demonstrate these skills, humans would eventually accept them as social companions. We argue that the best way to achieve this is using a systematic experimental approach based on behavioral and physiological neuroscience methods such as motion/eye-tracking, electroencephalography, or functional near-infrared spectroscopy embedded in interactive human–robot paradigms. This approach requires understanding how humans interact with each other, how they perform tasks together and how they develop feelings of social connection over time, and using these insights to formulate design principles that make social robots attuned to the workings of the human brain. In this review, we put forward the argument that the likelihood of artificial agents being perceived as social companions can be increased by designing them in a way that they are perceived as intentional agents that activate areas in the human brain involved in social-cognitive processing. We first review literature related to social-cognitive processes and mechanisms involved in human–human interactions, and highlight the importance of perceiving others as intentional agents to activate these social brain areas. We then discuss how attribution of intentionality can positively affect human–robot interaction by (a) fostering feelings of social connection, empathy and prosociality, and by (b) enhancing performance on joint human–robot tasks. 
Lastly, we describe circumstances under which attribution of intentionality to robot agents might be disadvantageous, and discuss challenges associated with designing social robots that are inspired by neuroscientific principles.

    Humans are Well Tuned to Detecting Agents Among Non-agents: Examining the Sensitivity of Human Perception to Behavioral Characteristics of Intentional Systems

    No full text
    For efficient social interactions, humans have developed means to predict and understand others' behavior, often with reference to intentions and desires. To infer others' intentions, however, one must assume that the other is an agent with a mind and mental states. In two experiments, this study examined whether the human perceptual system is sensitive to detecting human agents based on only subtle behavioral cues. Participants observed robots that performed pointing gestures interchangeably to the left or right with one of their two arms. Onset times of the pointing movements could be pre-programmed, human-controlled (Experiment 1), or modeled after human behavior (Experiment 2). The task was to determine whether the observed behavior was controlled by a human or by a computer program, without any information about which parameters of behavior this judgment should be based on. Results showed that participants were able to detect human behavior above chance in both experiments. Moreover, participants were asked to discriminate a letter (F/T) presented on the left or the right side of a screen. The letter could be either validly cued by the robot (the location the robot pointed to coincided with the location of the letter) or invalidly cued (the robot pointed to the location opposite to where the letter was presented). In this cueing task, target discrimination was better in the valid than in the invalid condition in Experiment 1, where a human face was presented centrally on the screen throughout the experiment. This effect was not significant in Experiment 2, where participants were exposed only to a robotic face. In sum, the present results show that the human perceptual system is sensitive to subtleties of human behavior.
Attending to where others attend, however, is modulated not only by adopting the Intentional Stance but also by the way participants interpret the observed stimuli.
Funding: German Research Foundation (Deutsche Forschungsgemeinschaft, DFG) grant awarded to AW (WY-122/1-1) and a grant within the LMU Excellent scheme awarded to AW.
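The spatial-cueing effect described in this abstract, better letter discrimination on validly than invalidly cued trials, is typically quantified as a validity effect: accuracy (or response time) on valid trials compared with invalid trials. A minimal sketch of that computation, using entirely hypothetical trial data rather than the study's actual results:

```python
# Hypothetical cueing-task trials: "valid" records whether the robot's
# pointing cue matched the letter's location; "correct" records whether
# the participant discriminated the letter (F/T) correctly.
trials = [
    {"valid": True,  "correct": True},
    {"valid": True,  "correct": True},
    {"valid": True,  "correct": False},
    {"valid": False, "correct": True},
    {"valid": False, "correct": False},
    {"valid": False, "correct": False},
]

def accuracy(trials, valid):
    """Proportion of correct responses among trials of one cue type."""
    subset = [t for t in trials if t["valid"] == valid]
    return sum(t["correct"] for t in subset) / len(subset)

# Validity effect: accuracy on validly cued minus invalidly cued trials.
# A positive value indicates that attention followed the robot's cue.
validity_effect = accuracy(trials, True) - accuracy(trials, False)
print(round(validity_effect, 2))  # → 0.33 for this toy data
```

In the study itself, this difference was tested for significance per experiment; the sketch only illustrates the direction of the comparison.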