Empathic agents to reduce user frustration: The effects of varying agent characteristics
There is now growing interest in the development of computer systems which respond to users' emotion and affect. We report three small-scale studies (with a total of 42 participants) which investigate the extent to which affective agents, using strategies derived from human-human interaction, can reduce user frustration within human-computer interaction. The results confirm the previous findings of Klein et al. (2002) that such interventions can be effective. We also obtained results suggesting that embodied agents can be more effective at reducing frustration than non-embodied agents, and that female embodied agents may be more effective than male embodied agents. These results are discussed in light of the existing research literature.
Academic motherhood and fieldwork: Juggling time, emotions and competing demands
The idea and practice of going 'into the field' to conduct research and gather data is a deeply rooted aspect of Geography as a discipline. For global North Development Geographers, amongst others, this usually entails travelling to, and spending periods of time in, often far-flung parts of the global South. Forging a successful academic career as a Development Geographer in the UK is therefore to some extent predicated on mobility. This paper critically engages with the gendered aspects of this expected mobility, focusing on the challenges and time constraints that are apparent when conducting overseas fieldwork as a mother, unaccompanied by her children. The paper emphasises the emotion work entailed in balancing the competing demands of overseas fieldwork and mothering, and begins to think through the implications of these challenges in terms of the types of knowledge we produce, as well as in relation to gender equality within the academy.
Dissociation and interpersonal autonomic physiology in psychotherapy research: an integrative view encompassing psychodynamic and neuroscience theoretical frameworks
Interpersonal autonomic physiology is an interdisciplinary research field assessing the relational interdependence of two (or more) interacting individuals at both the behavioural and psychophysiological levels. Despite its fairly long tradition, only eight studies since 1955 have focused on the interaction of psychotherapy dyads, and none of them has focused on the shared processual level, assessing dynamic phenomena such as dissociation. We longitudinally observed two brief psychodynamic psychotherapies, entirely audio- and video-recorded (16 sessions, weekly frequency, 45 min.). Autonomic nervous system measures were continuously collected during each session. Personality, empathy, dissociative features and clinical progress measures were collected before and after therapy, and after each clinical session. Two independent judges, trained psychotherapists, coded the interactions' micro-processes. Time-series-based analyses were performed to assess interpersonal synchronization and de-synchronization in the patient's and therapist's physiological activity. Psychophysiological synchrony revealed a clear association with empathic attunement, while desynchronization phases (30-150 sec. in length) showed a linkage with dissociative processes, usually associated with the patient's narrative of core relational trauma. Our findings are discussed from the perspective of the psychodynamic models of Stern ("present moment"), Sander, Beebe and Lachmann (dyadic system model of interaction), and Lanius (trauma model), and the neuroscientific frameworks proposed by Thayer (neurovisceral integration model) and Porges (polyvagal theory). The collected data allow us to attempt an integration of these theoretical approaches in the light of Complex Dynamic Systems. The rich theoretical work and the encouraging clinical results might represent a new and fascinating frontier of research in psychotherapy.
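The time-series synchronization analysis this abstract describes can be sketched with a sliding-window Pearson correlation between the two physiological signals. This is a minimal illustration, not the authors' actual pipeline: the window and step sizes, the simulated signals, and the interpretation thresholds are all placeholder assumptions.

```python
import numpy as np

def windowed_correlation(patient, therapist, window=30, step=5):
    """Pearson correlation in sliding windows over two equal-length series.

    High positive values suggest synchrony; near-zero or negative windows
    are candidate desynchronization phases (window/step are assumptions).
    """
    patient = np.asarray(patient, dtype=float)
    therapist = np.asarray(therapist, dtype=float)
    corrs = []
    for start in range(0, len(patient) - window + 1, step):
        p = patient[start:start + window]
        t = therapist[start:start + window]
        # Guard against zero variance within a window
        if p.std() == 0 or t.std() == 0:
            corrs.append(0.0)
        else:
            corrs.append(float(np.corrcoef(p, t)[0, 1]))
    return np.array(corrs)

# Toy demonstration: two signals sharing a common component should show
# consistently positive windowed correlations.
rng = np.random.default_rng(0)
shared = rng.standard_normal(300)
patient = shared + 0.3 * rng.standard_normal(300)
therapist = shared + 0.3 * rng.standard_normal(300)
corrs = windowed_correlation(patient, therapist)
```

In practice one would apply this to the continuously recorded autonomic measures and inspect runs of low or negative correlation against the coded session micro-processes.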
I hear you eat and speak: automatic recognition of eating condition and food type, use-cases, and impact on ASR performance
We propose a new recognition task in the area of computational paralinguistics: automatic recognition of eating conditions in speech, i.e., whether people are eating while speaking, and what they are eating. To this end, we introduce the audio-visual iHEARu-EAT database featuring 1.6k utterances of 30 subjects (mean age: 26.1 years, standard deviation: 2.66 years, gender balanced, German speakers), six types of food (Apple, Nectarine, Banana, Haribo Smurfs, Biscuit, and Crisps), and read as well as spontaneous speech, which is made publicly available for research purposes. We start by demonstrating that for automatic speech recognition (ASR), it pays off to know whether speakers are eating or not. We also propose automatic classification both by brute-forcing of low-level acoustic features and by higher-level features related to intelligibility, obtained from an automatic speech recogniser. Prediction of the eating condition was performed with a Support Vector Machine (SVM) classifier employed in a leave-one-speaker-out evaluation framework. Results show that the binary prediction of eating condition (i.e., eating or not eating) can be easily solved independently of the speaking condition; the obtained average recalls are all above 90%. Low-level acoustic features provide the best performance on spontaneous speech, reaching up to 62.3% average recall for multi-way classification of the eating condition, i.e., discriminating the six types of food as well as not eating. The early fusion of features related to intelligibility with the brute-forced acoustic feature set improves the performance on read speech, reaching a 66.4% average recall for the multi-way classification task. Analysing features and classifier errors leads to a suitable ordinal scale for eating conditions, on which automatic regression can be performed with up to a 56.2% coefficient of determination.
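The leave-one-speaker-out protocol and the (unweighted) average recall metric named in this abstract can be illustrated with a small, library-free sketch. The data layout and helper names below are assumptions for illustration, not the paper's code; any classifier (such as the paper's SVM) can be plugged in via the `train_and_predict` callback.

```python
from collections import defaultdict

def leave_one_speaker_out(samples, train_and_predict):
    """samples: list of (speaker_id, features, label) tuples.

    For each speaker, train on all other speakers and predict the held-out
    speaker's samples. Returns a list of (predicted, true) label pairs.
    """
    speakers = sorted({s for s, _, _ in samples})
    pairs = []
    for held_out in speakers:
        train = [(f, y) for s, f, y in samples if s != held_out]
        test = [(f, y) for s, f, y in samples if s == held_out]
        preds = train_and_predict(train, [f for f, _ in test])
        pairs.extend(zip(preds, [y for _, y in test]))
    return pairs

def unweighted_average_recall(pred_true_pairs):
    """Mean of per-class recalls, so rare classes count as much as common ones."""
    per_class = defaultdict(lambda: [0, 0])  # label -> [correct, total]
    for pred, true in pred_true_pairs:
        per_class[true][1] += 1
        if pred == true:
            per_class[true][0] += 1
    recalls = [correct / total for correct, total in per_class.values()]
    return sum(recalls) / len(recalls)

# Toy check with a degenerate classifier that always predicts "eating":
# it scores 100% recall on "eating" and 0% on "none", so UAR is 0.5.
samples = [
    ("spk1", 0, "eating"), ("spk1", 1, "none"),
    ("spk2", 2, "eating"), ("spk2", 3, "none"),
    ("spk3", 4, "eating"), ("spk3", 5, "none"),
]
always_eating = lambda train, feats: ["eating"] * len(feats)
pairs = leave_one_speaker_out(samples, always_eating)
uar = unweighted_average_recall(pairs)
```

Grouping the folds by speaker rather than by utterance is what prevents the classifier from exploiting speaker identity, which is the point of this evaluation framework.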