8 research outputs found

    Speaker clustering in multi‐party conversation

    Proceedings of the 3rd Nordic Symposium on Multimodal Communication. Editors: Patrizia Paggio, Elisabeth Ahlsén, Jens Allwood, Kristiina Jokinen, Costanza Navarretta. NEALT Proceedings Series, Vol. 15 (2011), 56–61. © 2011 The editors and contributors. Published by Northern European Association for Language Technology (NEALT), http://omilia.uio.no/nealt. Electronically published at Tartu University Library (Estonia), http://hdl.handle.net/10062/22532

    Physician gaze shifts in patient-physician interactions: functions, accounts and responses

    Objectives: Physician gaze towards patients is fundamental to medical consultations. Physicians’ use of Electronic Health Records (EHR) affects their gaze towards patients and may negatively influence the interaction. We aimed to study conversation patterns during physicians’ gaze shifts from the patient towards the EHR. Methods: Outpatient consultations (N=8) were eye-tracked. Interactions around physician gaze shifts towards the computer were transcribed. Results: We found that physician gaze shifts serve different interactional functions, e.g., introducing a topic switch or entering data into the EHR. Furthermore, physicians differ in how they account for their gaze shifts, i.e., implicitly or explicitly. Third, patients vary in whether they treat a gaze shift as an indication to continue their turn or not. Conclusions: Our results suggest that physician gaze shifts vary in function, in how physicians account for them, and in how they influence the conversation. Future research should take these distinctions into account when relating gaze to patient outcomes. Practice implications: Physicians should be aware of the interactional context of their gaze behaviour. Patients respond differently to various types of gaze shifts, so how physicians handle them can have different consequences for the interaction.

    Nasal oxytocin administration does not influence eye gaze or perceived relationship of male volunteers with physicians in a simulated online consultation: a randomized, placebo-controlled trial

    The patient–physician relationship is a critical determinant of patient health outcomes. Verbal and non-verbal communication, such as eye gaze, are vital aspects of this bond. Neurobiological studies indicate that oxytocin may serve as a link between increased eye gaze and social bonding. Therefore, oxytocin signaling could serve as a key factor influencing eye gaze as well as the patient–physician relationship. We aimed to test the effects of oxytocin on gaze to the eyes of the physician and on the patient–physician relationship by conducting a randomized placebo-controlled crossover trial in healthy volunteers with intranasally administered oxytocin (with a previously effective single dose of 24 IU, EudraCT number 2018-004081-34). The eye gaze of 68 male volunteers was studied using eye tracking during a simulated video call consultation with a physician, who provided information about vaccination against the human papillomavirus. Relationship outcomes, including trust, satisfaction, and perceived physician communication style, were measured using questionnaires and corrected for possible confounds (social anxiety and attachment orientation). Additional secondary outcome measures for the effect of oxytocin were recall of information and pupil diameter, and exploratory outcomes included mood and anxiety measures. Oxytocin did not affect the eye-tracking parameters of volunteers’ gaze toward the eyes of the physician. Moreover, oxytocin did not affect the parameters of bonding between volunteers and the physician, nor other secondary and exploratory outcomes in this setting. Bayesian hypothesis testing provided evidence for the absence of effects. These results contradict the notion that oxytocin affects eye gaze patterns or bonding.
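
    For context, "Bayesian hypothesis testing provided evidence for the absence of effects" refers to Bayes factors, which, unlike p-values, can quantify support for the null hypothesis. Below is a minimal sketch of how such an analysis might look for a paired crossover design; the simulated placebo/oxytocin scores and the use of the pingouin library are illustrative assumptions, not the trial's actual analysis pipeline.

```python
# Hedged sketch: quantifying evidence for a null effect with a Bayes factor.
# The data are simulated; the trial's real outcomes are not reproduced here.
import numpy as np
from scipy import stats
import pingouin as pg

rng = np.random.default_rng(0)
n = 68                                          # volunteers in the crossover trial
placebo = rng.normal(0.50, 0.10, n)             # e.g. fraction of gaze time on eyes
oxytocin = placebo + rng.normal(0.0, 0.05, n)   # no true drug effect simulated

t, p = stats.ttest_rel(oxytocin, placebo)       # paired t-test across conditions
bf10 = pg.bayesfactor_ttest(t, nx=n, paired=True)  # JZS Bayes factor for H1
print(f"t = {t:.2f}, p = {p:.3f}, BF01 = {1 / float(bf10):.2f}")
# BF01 (= 1/BF10) > 3 is conventionally read as moderate evidence for the null.
```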

    An acoustic-phonetic study of retraction of /s/ in Moroccan Dutch and endogenous Dutch

    In Moroccan Dutch, /s/ has been claimed to be pronounced as retracted [s] (towards /ʃ/) in certain consonant clusters. Recently, retracted s-pronunciation has also been attested in endogenous Dutch. We tested empirically whether Moroccan Dutch [s] is indeed more retracted than endogenous Dutch [s] in the relevant clusters. Additionally, we tested whether the inter-speaker variation of /s/ is smaller among Moroccan Dutch speakers than among endogenous Dutch speakers, as would be expected if retraction of /s/ were used as an identity marker in in-group conversations in Moroccan Dutch. The [s] realizations of 21 young, male Moroccan Dutch and 21 endogenous Dutch speakers were analyzed. Analyses of the spectral centre of gravity (CoG) show that both groups of speakers had more retracted pronunciations of [s] in typically retracting contexts than in typically non-retracting contexts. However, Moroccan Dutch speakers had a higher CoG in both contexts than endogenous Dutch speakers, refuting the stronger retraction expected for Moroccan Dutch speakers. The inter-speaker variation was larger among Moroccan Dutch speakers than among endogenous Dutch speakers, refuting the expected use of /s/ retraction as a group identity marker. Funding: NWO, Theoretical and Experimental Linguistics.
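
    For readers unfamiliar with the measure: the spectral centre of gravity is the power-weighted mean frequency of the spectrum, and a lower CoG indicates a more retracted, more [ʃ]-like fricative. A minimal sketch of how CoG might be computed for one extracted fricative interval follows; the file name and the power-2 weighting (Praat's default) are assumptions, not the paper's exact procedure.

```python
# Hedged sketch: spectral centre of gravity (CoG) of a fricative segment.
# "s_token.wav" is a hypothetical mono recording of one extracted /s/ interval.
import numpy as np
from scipy.io import wavfile

rate, samples = wavfile.read("s_token.wav")
samples = samples.astype(float)
samples *= np.hanning(len(samples))          # taper segment edges before the FFT

spectrum = np.fft.rfft(samples)
power = np.abs(spectrum) ** 2                # power weighting (Praat default: power = 2)
freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)

cog = np.sum(freqs * power) / np.sum(power)  # power-weighted mean frequency
print(f"Centre of gravity: {cog:.0f} Hz")
```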

    Proceedings

    Proceedings of the 3rd Nordic Symposium on Multimodal Communication. Editors: Patrizia Paggio, Elisabeth Ahlsén, Jens Allwood, Kristiina Jokinen, Costanza Navarretta. NEALT Proceedings Series, Vol. 15 (2011), vi+87 pp. © 2011 The editors and contributors. Published by Northern European Association for Language Technology (NEALT), http://omilia.uio.no/nealt. Electronically published at Tartu University Library (Estonia), http://hdl.handle.net/10062/22532

    Sensing, interpreting, and anticipating human social behaviour in the real world

    Low-level nonverbal social signals like glances, utterances, facial expressions, and body language are central to human communicative situations and have been shown to be connected to important high-level constructs such as emotions, turn-taking, rapport, or leadership. A prerequisite for the creation of social machines that are able to support humans in, e.g., education, psychotherapy, or human resources is the ability to automatically sense, interpret, and anticipate human nonverbal behaviour. While promising results have been shown in controlled settings, automatically analysing unconstrained situations, e.g. in daily-life settings, remains challenging. Furthermore, the anticipation of nonverbal behaviour in social situations is still largely unexplored. The goal of this thesis is to move closer to the vision of social machines in the real world. It makes fundamental contributions along the three dimensions of sensing, interpreting, and anticipating nonverbal behaviour in social interactions.

    First, robust recognition of low-level nonverbal behaviour lays the groundwork for all further analysis steps. Advancing human visual behaviour sensing is especially relevant, as the current state of the art is still not satisfactory in many daily-life situations. While many social interactions take place in groups, current methods for unsupervised eye contact detection can only handle dyadic interactions. We propose a novel unsupervised method for multi-person eye contact detection that exploits the connection between gaze and speaking turns. Furthermore, we make use of mobile device engagement to address the problem of calibration drift that occurs in daily-life usage of mobile eye trackers.

    Second, we improve the interpretation of social signals in terms of higher-level social behaviours. In particular, we propose the first dataset and method for emotion recognition from the bodily expressions of freely moving, unaugmented dyads. Furthermore, we are the first to study low rapport detection in group interactions, and we investigate a cross-dataset evaluation setting for the emergent leadership detection task.

    Third, human visual behaviour is special because it functions as a social signal and also determines what a person is seeing at a given moment in time. Being able to anticipate human gaze opens up the possibility for machines to share attention with humans more seamlessly, or to intervene in a timely manner if humans are about to overlook important aspects of the environment. We are the first to propose methods for the anticipation of eye contact in dyadic conversations, as well as in the context of mobile device interactions during daily life, thereby paving the way for interfaces that are able to proactively intervene and support interacting humans.

    Gaze, facial expressions, body language, and prosody are nonverbal signals that play a central role in human communication. Numerous studies have linked them to important concepts such as emotions, turn-taking, leadership, or the quality of the relationship between two people. For machines to support humans effectively in their daily social lives, automatic methods for sensing, interpreting, and anticipating nonverbal behaviour are necessary. Although previous research has reached encouraging results in controlled studies, the automatic analysis of nonverbal behaviour in less controlled situations remains a challenge. Moreover, the anticipation of nonverbal behaviour in social situations has hardly been investigated. The goal of this thesis is to bring the vision of automatically understanding social situations a step closer to reality. This work makes important contributions to the automatic recognition of human gaze behaviour in everyday situations. Although many social interactions take place in groups, unsupervised methods for eye contact detection have so far existed only for dyadic interactions. We present a new approach to eye contact detection in groups that works without manual annotation by exploiting the statistical relationship between gaze and speaking behaviour. Everyday activities are a challenge for mobile eye trackers, since shifts of these devices can degrade their calibration. In this work, we use user engagement with mobile devices to correct the effect of such shifts. Beyond sensing, this work also improves the interpretation of social signals. We release the first dataset and the first method for emotion recognition in dyadic interactions without specialised equipment. We also present the first study on the automatic detection of low rapport in group interactions, and we conduct the first cross-dataset evaluation for the detection of emergent leadership. The thesis concludes with the first approaches to anticipating gaze behaviour in social interactions. Gaze has the special property of serving both as a social signal and to direct visual perception. The ability to anticipate gaze therefore opens up the possibility for machines to integrate more seamlessly into social interactions, and also to warn people when they are about to overlook important aspects of their environment. We present methods for anticipating gaze behaviour in the context of mobile device interaction during everyday activities, as well as during dyadic interactions via video calls.
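
    To make the unsupervised eye contact idea concrete: because listeners tend to look at the current speaker, clusters of an observer's gaze directions can be matched to interlocutors via their speaking activity, without manual labels. The sketch below illustrates this principle only; the array layouts for `gaze` and `speaking`, the use of k-means, and the synthetic demo are assumptions for illustration, not the thesis's exact pipeline.

```python
# Hedged sketch: match gaze clusters to people via speaking turns (no labels).
import numpy as np
from sklearn.cluster import KMeans

def label_gaze_clusters(gaze, speaking, n_clusters):
    """gaze: (n_frames, d) gaze-direction features of one observer.
    speaking: (n_frames, n_people) binary speaking activity.
    Returns, per frame, the person the observer is estimated to look at."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0)
    cluster_id = km.fit_predict(gaze)
    member = np.eye(n_clusters)[cluster_id]        # one-hot cluster membership
    # Correlate each cluster's occupancy with each person's speaking activity.
    corr = np.zeros((n_clusters, speaking.shape[1]))
    for c in range(n_clusters):
        for p in range(speaking.shape[1]):
            corr[c, p] = np.corrcoef(member[:, c], speaking[:, p])[0, 1]
    cluster_to_person = corr.argmax(axis=1)        # best-aligned person per cluster
    return cluster_to_person[cluster_id]

# Tiny synthetic demo: 3 people take fixed turns; the observer looks at the speaker.
rng = np.random.default_rng(1)
turn = np.array([0, 1, 2, 0, 1, 2]).repeat(100)    # speaker index per frame
n_frames, n_people = len(turn), 3
speaking = np.zeros((n_frames, n_people))
speaking[np.arange(n_frames), turn] = 1
centres = np.array([[-1.0, 0.0], [0.0, 1.0], [1.0, 0.0]])  # gaze target per person
gaze = centres[turn] + rng.normal(0, 0.1, (n_frames, 2))
looked_at = label_gaze_clusters(gaze, speaking, n_clusters=3)
print("agreement with true target:", (looked_at == turn).mean())
```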