16 research outputs found

    Social interaction in virtual environments: the relationship between mutual gaze, task performance and social presence

    Get PDF
    Everyday face-to-face social interaction is increasingly being supplemented by computer- and video-mediated communication. With mediation, however, comes the potential loss of important non-verbal cues. It is therefore important to attempt to maintain the quality of the mediated interaction, such that it retains as many of the aspects of a real-world interaction as possible. Social presence is a measure of how similar a mediated interaction is to face-to-face interaction, the most socially present situation, in terms of perceptions of and behaviour towards an interlocutor. Social presence is influenced by many factors, one of which is mutual gaze, and social perceptions of an interlocutor are also thought to be related to task performance. For a successful interaction, therefore, an optimum amount of mutual gaze for maximising social presence and task performance is desirable. This research aims to investigate the relationship between mutual gaze, task performance and social presence, in order to discover the ideal conditions under which a successful mediated interaction can occur. Previous gaze research paradigms have involved one conversational partner staring continuously at the other while the resulting mutual gaze is measured. It is hypothesised that this method may actually suppress mutual gaze, primarily for social reasons, and is therefore potentially not the optimum experimental design for mutual gaze research. The first study in this thesis used eye-tracking to explore this hypothesis and investigate the relationship between mutual gaze and task performance. A suitable paradigm was developed, based on that used in previous research into eye movements and non-verbal communication. Two participants – Instruction Giver (IG) and Instruction Follower (IF) – communicated via avatars in Second Life to solve simple arithmetic tasks. There were two between-participant looking conditions: staring (the IG's avatar stared continuously at the IF) and not-staring (the IG's avatar looked at both the IF and task-relevant objects). Constant staring did, indeed, show evidence of decreasing mutual gaze within the dyad. Mutual gaze was positively correlated with task performance scores, but only in the not-staring condition. When not engaged in mutual gaze, the IF looked more at task-related objects in the not-staring condition than in the staring condition; this suggests that social factors are likely to be driving the gaze aversion in the staring condition. Furthermore, staring offered no task-related benefits. The second study explored how much looking by one person at another maximises both mutual gaze and task performance within the dyad. It also investigated the relationship between mutual gaze, task performance and both manipulated and perceived social presence. Individual participants interacted with a virtual agent within the Second Life paradigm previously used in the human-human study. Participants were told they were interacting either with a computer (i.e. an agent) or with another human (an avatar). This provided the between-participants variable of manipulated social presence, or agency. The virtual agent was programmed to look at the participant during either 0%, 25%, 50% or 75% of the interaction, providing the within-participants variable of looking condition. The majority of effects were found in the 75% looking condition, including the highest mutual gaze uptake and the highest social presence ratings (measured via a questionnaire).
Although the questionnaire did not detect any differences in social presence between the agent and avatar conditions, participants were significantly faster to complete the tasks in the avatar condition than in the agent condition. This suggests that behavioural measures may be more effective at detecting differences in social presence than questionnaires alone. The results are discussed in relation to different theories of social interaction. Implications and limitations of the findings are considered, and suggestions for future work are made.
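The 0–75% looking manipulation amounts to scheduling what proportion of the interaction the agent spends looking at the participant. Below is a minimal sketch of one way such a scheduler could work, assuming a simple time-sliced loop; the class, method and target names are illustrative and are not the thesis's actual Second Life implementation.

```python
# Hypothetical sketch of a proportion-based gaze scheduler; not the thesis's
# actual Second Life implementation.
class GazeScheduler:
    """Decides, once per time slice, whether the agent looks at the
    participant or at a task-relevant object, so that the long-run
    proportion of participant-directed gaze approaches `target`."""

    def __init__(self, target: float):
        assert 0.0 <= target <= 1.0
        self.target = target       # e.g. 0.0, 0.25, 0.5 or 0.75
        self.slices = 0            # time slices elapsed so far
        self.at_participant = 0    # slices spent looking at the participant

    def next_gaze_target(self) -> str:
        # Look at the participant whenever the running proportion is below target.
        current = self.at_participant / self.slices if self.slices else 0.0
        self.slices += 1
        if current < self.target:
            self.at_participant += 1
            return "participant"
        return "task_object"

# Each looking condition corresponds to one scheduler setting.
for condition in (0.0, 0.25, 0.5, 0.75):
    scheduler = GazeScheduler(condition)
    targets = [scheduler.next_gaze_target() for _ in range(200)]
    print(condition, targets.count("participant") / len(targets))
```

Run over enough time slices, the printed proportions converge on each condition's target, which is all the manipulation requires: the agent's gaze alternates between the participant and task objects in the stated ratio.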

    Start Making Sense: Predicting confidence in virtual human interactions using biometric signals

    Get PDF
    This is volume 1 of the Measuring Behavior 2020-21 Conference. Volume 2 will follow when the conference takes place in October 2021. www.measuringbehavior.org

    Don't Look Now: The relationship between mutual gaze, task performance and staring in Second Life

    Full text link
    Mutual gaze is important to social interaction, and can also facilitate task performance. Previous work has assumed that staring at someone maximises mutual gaze. Eye-tracking is used to explore this claim, along with the relationship between mutual gaze and task performance. Two participants – Instruction Giver (IG) and Instruction Follower (IF) – communicated via avatars in Second Life to solve simple arithmetic tasks. There were two conditions: staring (the IG's avatar stared continuously at the IF) and not-staring (the IG's avatar looked at both the IF and task-relevant objects). Instead of maximising mutual gaze, constant staring actually showed evidence of decreasing eye contact within the dyad. Mutual gaze was positively correlated with task performance scores, but only in the not-staring condition. When not engaged in mutual gaze, the IF looked more at task-related objects in the not-staring condition than in the staring condition. Implications and possible future work on social interaction are discussed.

    Contested Staring: Issues and the use of mutual gaze as an on-line measure of social presence

    Full text link
    Although many current social presence measures rely heavily on subjective post-test questionnaires, some researchers have identified the value of using on-line, behavioural measures. Gaze, and specifically mutual gaze, is known to be related to social perceptions of an interlocutor, as well as facilitating task performance during an interaction [1, 2, 17]. Second Life allows for the investigation of task-based interaction in a highly controllable social environment, whilst simultaneously allowing measurement of eye movements (using a head-mounted eye-tracker). A paradigm for measuring the eye movements of a user during interaction with an avatar or agent is presented. The potential for using this paradigm to investigate the use of mutual gaze as an on-line measure of social presence is discussed.
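As an illustration of the kind of on-line measure discussed here, the sketch below computes a mutual gaze proportion from two synchronized gaze streams: the participant's fixation target (from the head-mounted eye-tracker) and the avatar's or agent's scripted gaze target. The function and label names are hypothetical; the paper does not give an implementation.

```python
# Hypothetical sketch: proportion of samples in which both parties look at each other.
def mutual_gaze_proportion(participant_targets, avatar_targets):
    """participant_targets: per-sample fixation labels from the eye-tracker.
    avatar_targets: per-sample gaze labels for the avatar/agent, same clock."""
    assert len(participant_targets) == len(avatar_targets)
    mutual = sum(
        1 for p, a in zip(participant_targets, avatar_targets)
        if p == "avatar_face" and a == "participant"
    )
    return mutual / len(participant_targets)

# Example: 6 synchronized samples, 2 of which are mutual gaze.
p = ["avatar_face", "task_object", "avatar_face", "avatar_face", "task_object", "avatar_face"]
a = ["participant", "participant", "task_object", "participant", "participant", "task_object"]
print(mutual_gaze_proportion(p, a))  # 0.333...
```

Because the measure is computed sample by sample, it can be updated during the interaction rather than only after it, which is what makes it usable as an on-line indicator alongside post-test questionnaires.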

    Don't look now: The relationship between mutual gaze, task performance and staring in Second Life

    Get PDF
    Mutual gaze is important to social interaction, and can also facilitate task performance. Previous work has assumed that staring at someone maximises mutual gaze. Eye-tracking is used to explore this claim, along with the relationship between mutual gaze and task performance. Two participants – Instruction Giver (IG) and Instruction Follower (IF) – communicated via avatars in Second Life to solve simple arithmetic tasks. There were two conditions: staring (the IG's avatar stared continuously at the IF) and not-staring (the IG's avatar looked at both the IF and task-relevant objects). Instead of maximising mutual gaze, constant staring actually showed evidence of decreasing eye contact within the dyad. Mutual gaze was positively correlated with task performance scores, but only in the not-staring condition. When not engaged in mutual gaze, the IF looked more at task-related objects in the not-staring condition than in the staring condition. Implications and possible future work on social interaction are discussed.

    Investigating Human Response, Behaviour, and Preference in Joint-Task Interaction

    No full text
    Human interaction relies on a wide range of signals, including non-verbal cues. In order to develop effective Explainable Planning (XAIP) agents, it is important that we understand the range and utility of these communication channels. Our starting point is existing results from joint task interaction and their study in cognitive science. Our intention is that these lessons can inform the design of interaction agents -- including those using planning techniques -- whose behaviour is conditioned on the user's response, including affective measures of the user (i.e., explicitly incorporating the user's affective state within the planning model). We have identified several concepts at the intersection of plan-based agent behaviour and joint task interaction and have used these to design two agents: one reactive and the other partially predictive. We have designed an experiment in order to examine human behaviour and response as people interact with these agents. In this paper we present the designed study and the key questions that are being investigated. We also present the results from an empirical analysis in which we examined the behaviour of the two agents for simulated users.
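To make the reactive/partially predictive distinction concrete, the sketch below contrasts an agent that responds only to the most recent user signal with one that also keeps a short history and intervenes a step early. The signal names and the prediction rule are hypothetical; the paper does not specify its agents at this level of detail.

```python
# Hypothetical contrast between a reactive and a partially predictive joint-task agent.
from collections import deque

class ReactiveAgent:
    """Acts only on the most recently observed user signal."""
    def act(self, user_signal: str) -> str:
        return "assist" if user_signal == "hesitating" else "wait"

class PartiallyPredictiveAgent:
    """Keeps a short history of signals and offers help one step early
    when the recent trend suggests the user is about to hesitate."""
    def __init__(self, window: int = 3):
        self.history = deque(maxlen=window)

    def act(self, user_signal: str) -> str:
        self.history.append(user_signal)
        if user_signal == "hesitating":
            return "assist"
        # Predictive step: intervene if most of the recent window was hesitation.
        if list(self.history).count("hesitating") >= 2:
            return "assist"
        return "wait"

# A made-up simulated-user trace, only to show where the two agents diverge.
trace = ["on_task", "hesitating", "hesitating", "on_task", "on_task"]
reactive, predictive = ReactiveAgent(), PartiallyPredictiveAgent()
for signal in trace:
    print(signal, reactive.act(signal), predictive.act(signal))
```

On this trace the two agents agree until the fourth signal, where the predictive agent still assists on the strength of the preceding hesitations while the reactive agent has already stopped, which is the kind of behavioural difference the simulated-user analysis is set up to expose.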