    A First Step toward the Automatic Understanding of Social Touch for Naturalistic Human–Robot Interaction

    Social robots should be able to automatically understand and respond to human touch. The meaning of touch depends not only on the form of the touch but also on the context in which it takes place. To gain more insight into the factors that are relevant for interpreting the meaning of touch within a social context, we elicited touch behaviors by letting participants interact with a robot pet companion in different affective scenarios. In a contextualized lab setting, participants (n = 31) acted as if they were coming home in different emotional states (i.e., stressed, depressed, relaxed, and excited) without being given specific instructions on the kinds of behaviors they should display. Based on video footage of the interactions and on interviews, we explored the use of touch behaviors, the expressed social messages, and the expected robot pet responses. Results show that emotional state influenced the social messages that were communicated to the robot pet as well as the expected responses. Furthermore, participants used multimodal cues to communicate with the robot pet; that is, they often talked to it while touching it and making eye contact. Additionally, the findings indicate that categorizing touch behaviors into discrete touch gesture categories based on dictionary definitions is not a suitable approach for capturing the complex nature of touch behaviors in less controlled settings. These findings can inform the design of a behavioral model for robot pet companions; future directions for interpreting touch behaviors in less controlled settings are discussed.

    Automatic recognition of touch gestures in the corpus of social touch

    For an artifact such as a robot or a virtual agent to respond appropriately to human social touch behavior, it should be able to automatically detect and recognize touch. This paper describes the data collection of CoST: Corpus of Social Touch, a data set containing 7805 captures of 14 different social touch gestures. All touch gestures were performed in three variants (gentle, normal, and rough) on a pressure sensor grid wrapped around a mannequin arm. Recognition of these 14 gesture classes using various classifiers yielded accuracies of up to 60%; moreover, gentle gestures proved to be harder to classify than normal and rough gestures. We further investigated how different classifiers, interpersonal differences, gesture confusions, and gesture variants affected the recognition accuracy. Finally, we present directions for further research to ensure proper transfer of the touch modality from interpersonal interaction to areas such as human–robot interaction (HRI).
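
    As a rough illustration of the kind of recognition pipeline this abstract describes (not the authors' actual implementation), the sketch below assumes each CoST capture is a sequence of pressure frames from a small sensor grid and trains an off-the-shelf classifier on a handful of summary features; the data layout, feature choices, and classifier are illustrative assumptions.

        # Hypothetical sketch of a social-touch gesture classifier; data layout,
        # features, and classifier choice are assumptions, not the paper's pipeline.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        def capture_features(frames: np.ndarray) -> np.ndarray:
            """Summarize one capture (n_frames x rows x cols of pressure values)
            into a fixed-length feature vector."""
            per_frame_sum = frames.reshape(len(frames), -1).sum(axis=1)
            return np.array([
                frames.mean(),           # average pressure over the capture
                frames.max(),            # peak pressure
                frames.std(),            # pressure variability
                len(frames),             # gesture duration in frames
                per_frame_sum.argmax(),  # frame index of maximum contact
            ])

        def evaluate(captures, labels):
            """captures: list of (n_frames, rows, cols) arrays; labels: gesture names."""
            X = np.stack([capture_features(c) for c in captures])
            y = np.array(labels)
            clf = RandomForestClassifier(n_estimators=200, random_state=0)
            return cross_val_score(clf, X, y, cv=5).mean()

    Because the abstract reports interpersonal differences, a leave-one-subject-out split would likely give a fairer estimate than the random cross-validation shown here.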

    Automated and unobtrusive measurement of physical activity in an interactive playground

    Promoting physical activity is one of the main goals of interactive playgrounds. To validate whether this goal is met, we need to measure the amount of physical player activity. Traditional methods of measuring activity, such as observations or annotations of game sessions, require time and personnel. Others, such as heart rate monitors and accelerometers, need to be worn by the player. In this paper, we investigate whether physical activity can be measured unobtrusively by tracking players using depth cameras and applying computer vision algorithms. In a user study with 32 players, we measure the players’ speed while playing a game of tag and demonstrate that our measures correlate well with exertion measured using heart rate sensors. This makes the method an attractive alternative to both manual coding and the use of worn devices. We also compare our approach to other exertion measurement methods. Finally, we demonstrate and discuss its potential for automated, unobtrusive measurements and real-time game adaptation.
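
    As a hypothetical illustration of the measurement this abstract describes, the snippet below estimates a per-player speed from tracked floor positions and correlates it with a heart-rate-based exertion value; the sampling rate, data layout, and use of a Pearson correlation are assumptions rather than the paper's exact method.

        # Hypothetical sketch: derive activity from tracked positions and
        # correlate it with heart-rate-based exertion. Data layout is assumed.
        import numpy as np
        from scipy.stats import pearsonr

        def mean_speed(positions: np.ndarray, fps: float = 30.0) -> float:
            """positions: (n_samples, 2) array of x/y floor coordinates in metres."""
            steps = np.diff(positions, axis=0)         # displacement per frame
            distances = np.linalg.norm(steps, axis=1)  # metres moved per frame
            return float(distances.sum() * fps / len(positions))  # metres per second

        def correlate_with_exertion(tracks, mean_heart_rates):
            """tracks: list of per-player position arrays; mean_heart_rates: one value per player."""
            speeds = [mean_speed(p) for p in tracks]
            r, p_value = pearsonr(speeds, mean_heart_rates)
            return r, p_value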

    Play with Me! Gender-Typed Social Play Behavior Analysis in Interactive Tag Games

    Promoting social behavior is one of the key goals in interactive games. In this paper, we present an experimental study in the Interactive Tag Playground (ITP) to investigate whether social behaviors reported in the literature can also be observed through automated analysis. We do this by analyzing players’ positions and roles, which the ITP logs automatically. Specifically, we address the effect that gender and age have on the number of tags and the distance that players keep between them. Our findings largely replicate existing research, although not all hypothesized differences reached statistical significance. With this proof-of-concept study, we have paved the way for the automated analysis of play, which can aid in making interactive playgrounds more engaging.
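
    The kind of automated analysis mentioned here can be sketched as follows, assuming the ITP logs per-frame player positions and the id of the current tagger; the log format and helper names are hypothetical.

        # Hypothetical sketch of automated play analysis from ITP-style logs:
        # tags are counted as changes of the tagger role, and proximity is
        # approximated by mean pairwise distance. The log format is assumed.
        import numpy as np
        from itertools import combinations

        def count_tags(tagger_ids) -> int:
            """tagger_ids: per-frame id of the current tagger; a change marks a tag."""
            ids = np.asarray(tagger_ids)
            return int(np.count_nonzero(ids[1:] != ids[:-1]))

        def mean_pairwise_distance(positions) -> float:
            """positions: dict of player_id -> (n_frames, 2) x/y arrays of equal length."""
            dists = [np.linalg.norm(positions[a] - positions[b], axis=1).mean()
                     for a, b in combinations(positions, 2)]
            return float(np.mean(dists))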

    Detecting uncertainty in spoken dialogues: an explorative research to the automatic detection of a speakers' uncertainty by using prosodic markers

    This paper reports results on the automatic detection of speaker uncertainty in spoken dialogues using prosodic markers. For this purpose, a substantial part of the AMI corpus (a multimodal, multi-party meeting corpus) was selected and converted to a suitable format so that its data could be analyzed for selected prosodic features. In the absence of relevant stance annotations on (un)certainty, lexical markers (hedges) were used to label utterances as either certain or uncertain. Results show that prosodic features can indeed be used to detect speaker uncertainty in spoken dialogues. The classifiers distinguish uncertain from neutral utterances with an accuracy of 75%, which is 25% above the baseline.
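
    A minimal sketch of the approach described here, assuming utterances are labelled uncertain whenever their transcript contains a hedge and that prosodic features have already been extracted per utterance, might look as follows; the hedge list, feature set, and classifier are illustrative, not the study's configuration.

        # Hypothetical sketch: hedge-based labelling plus a prosodic classifier.
        import numpy as np
        from sklearn.svm import SVC
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline
        from sklearn.model_selection import cross_val_score

        HEDGES = {"maybe", "perhaps", "probably", "i think", "i guess", "sort of", "kind of"}

        def is_uncertain(transcript: str) -> int:
            """1 if the utterance contains a lexical hedge, 0 otherwise."""
            lower = transcript.lower()
            return int(any(h in lower for h in HEDGES))

        def evaluate(prosodic_features: np.ndarray, transcripts) -> float:
            """prosodic_features: (n_utterances, n_features), e.g. pitch mean/range,
            energy, and speaking rate; transcripts: list of utterance strings."""
            y = np.array([is_uncertain(t) for t in transcripts])
            clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
            return cross_val_score(clf, prosodic_features, y, cv=5).mean()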

    Augmenting playspaces to enhance the game experience : A tag game case study

    Introducing technology into games can improve players’ game experience. However, it can also reduce the amount of physical activity and social interaction. In this article, we discuss how we enhance the game of tag with technology such that the physical and social characteristics of the game are retained. We first present an analysis of the behavior of children playing traditional tag games. Based on these observations, we designed the Interactive Tag Playground (ITP), an interactive installation that uses tracking and floor projections to enhance the game of tag. We evaluate the ITP in one user study with adults and one with children, comparing players’ reported experiences when playing traditional and interactive tag. Players report significantly higher engagement and immersion when playing interactive tag. We also use automatically collected tracking data to quantitatively analyze player behavior in both tag games; players exhibit similar patterns of physical activity and interaction in both game types. We can therefore conclude that interactive technology can make traditional games more engaging without losing the social and physical character of the game.
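
    The quantitative comparison of player behavior in the two game types could, for example, be approximated as below, assuming per-player mean speeds have already been derived from the tracking data; the data layout and the use of Welch's t-test are assumptions.

        # Hypothetical sketch: compare physical activity between traditional and
        # interactive tag from tracking-derived per-player speeds.
        import numpy as np
        from scipy.stats import ttest_ind

        def compare_activity(speeds_traditional, speeds_interactive):
            """Each argument: array of mean speeds (m/s), one value per player."""
            t, p = ttest_ind(speeds_traditional, speeds_interactive, equal_var=False)
            return {"mean_traditional": float(np.mean(speeds_traditional)),
                    "mean_interactive": float(np.mean(speeds_interactive)),
                    "t": float(t), "p": float(p)}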

    Project SENSE – Multimodal Simulation with Full-Body Real-Time Verbal and Nonverbal Interactions

    This paper presents a multimodal simulation system, project-SENSE, that combines virtual reality and full-body motion capture technologies with real-time verbal and nonverbal communication. We introduce the technical setup and the hardware and software employed in a first prototype. We discuss the capabilities of the system for investigating cooperation paradoxes and the effects of direct nonverbal mimicry. We argue that this prototype lays the technological basis for further research into interpersonal and social skills, as well as into the social and emotional consequences of nonverbal mimicry in sustained interactions.
