20 research outputs found

    Towards a Characterisation of Emotional Intent During Scripted Scenes Using In-ear Movement Sensors

    Theatre provides a unique environment in which to obtain detailed data on social interactions in a controlled and repeatable manner. This work introduces a method for capturing and characterising the underlying emotional intent of performers in a scripted scene using in-ear accelerometers. Each scene is acted with different underlying emotional intentions using the theatrical technique of Actioning. The goal of the work is to uncover characteristics in the joint movement patterns that reveal information on the positive or negative valence of these intentions. Preliminary findings over 3x12 (Covid-19 restricted) non-actor trials suggest people are more energetic and more in-sync when using positive versus negative intentions.
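The "in-sync" finding above implies some pairwise synchrony measure over the performers' movement signals. A minimal sketch of one such measure: windowed Pearson correlation between the two performers' frame-to-frame movement energy. This is an illustrative metric only; the paper's actual analysis may differ.

```python
import numpy as np

def movement_synchrony(acc_a, acc_b, win=100):
    """Windowed Pearson correlation between two performers' head
    movement energy, averaged over windows, as a simple synchrony
    score in [-1, 1]. Inputs: (n_samples, 3) accelerometer arrays."""
    def energy(acc):
        # Frame-to-frame movement magnitude, ignoring gravity offset.
        acc = np.asarray(acc, dtype=float)
        return np.linalg.norm(np.diff(acc, axis=0), axis=1)
    ea, eb = energy(acc_a), energy(acc_b)
    n = min(len(ea), len(eb))
    scores = []
    for start in range(0, n - win + 1, win):
        a, b = ea[start:start + win], eb[start:start + win]
        if a.std() > 0 and b.std() > 0:
            scores.append(np.corrcoef(a, b)[0, 1])
    return float(np.mean(scores)) if scores else 0.0
```

Identical movement yields a score near 1; independent movement averages toward 0.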

    Towards Respiration Rate Monitoring Using an In-Ear Headphone Inertial Measurement Unit

    State-of-the-art respiration tracking devices require specialized equipment, making them impractical for everyday at-home respiration sensing. In this paper, we present the first system for sensing respiratory rates using in-ear headphone inertial measurement units (IMU). The approach is based on technology already available in commodity devices: the eSense headphones. Our processing pipeline combines several existing approaches to clean noisy data and calculate respiratory rates on 20-second windows. In a study with twelve participants, we compare accelerometer- and gyroscope-based sensing and employ pressure-based measurement with nasal cannulas as ground truth. Our results indicate a mean absolute error of 2.62 CPM (accelerometer) and 2.55 CPM (gyroscope). This overall accuracy is comparable to previous approaches using accelerometer-based sensing, but we observe a higher relative error for the gyroscope. In contrast to related work using other sensor positions, we cannot report significant differences between the two modalities or the three postures standing, sitting, and lying on the back (supine). However, in general, performance varies drastically between participants.
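A common way to turn a 20-second IMU window into a rate in cycles per minute (CPM) is to band-pass filter the signal to the breathing band and take the dominant spectral peak. The sketch below illustrates that generic approach, assuming a single accelerometer axis and a 50 Hz sampling rate; it is not the paper's exact pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def respiratory_rate_cpm(accel_axis, fs=50.0, low=0.1, high=0.7):
    """Estimate respiratory rate (cycles per minute) from one window
    of accelerometer samples via the dominant frequency in the
    typical breathing band (roughly 6-42 CPM)."""
    # Band-pass filter to isolate breathing-range motion.
    b, a = butter(2, [low, high], btype="band", fs=fs)
    filtered = filtfilt(b, a, np.asarray(accel_axis, float)
                        - np.mean(accel_axis))
    # Dominant spectral peak within the band -> breaths per second.
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fs)
    band = (freqs >= low) & (freqs <= high)
    f_peak = freqs[band][np.argmax(spectrum[band])]
    return f_peak * 60.0  # convert Hz to cycles per minute
```

For example, a clean 0.25 Hz chest-motion component recovers as roughly 15 CPM.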

    Design Space and Usability of Earable Prototyping

    Earable computing is gaining attention in research and becoming ubiquitous in society. However, there is an emerging need for prototyping devices as critical drivers of innovation. In our work, we reviewed the features of existing earable platforms. Based on 24 publications, we characterized the design space of earable prototyping. We used the open eSense platform (6-axis IMU, auditory I/O) to evaluate its usability for non-experts in a problem-based learning setting. We collected data from 79 undergraduate students who developed 39 projects. Our questionnaire-based results suggest that the platform creates interest in the subject matter and supports self-directed learning. The projects align with the research space, indicating ease of use, but lack contributions for more challenging topics. Additionally, many projects included games, which are not present in current research. The average SUS score of the platform was 67.0. The majority of problems were technical issues (e.g., connecting, playing music).

    Motion Coupling of Earable Devices in Camera View

    Earables, earphones augmented with inertial sensors and real-time data accessibility, provide the opportunity for private audio channels in public settings. One of the main challenges of achieving this goal is to correctly associate which device belongs to which user without prior information. In this paper, we explore how motion of an earable, as measured by the on-board accelerometer, can be correlated against detected faces from a webcam to accurately match which user is wearing the device. We conduct a data collection study and explore which types of user movement can be accurately detected using this approach, and investigate how varying the speed of the movement affects detection rates. Our results show that the approach achieves higher detection rates for faster movements, and that it can differentiate the same movement across different participants with a detection rate of 86%, increasing to 92% when differentiating a movement against others.
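The device-to-face association described above can be sketched as a correlation problem: z-score the earable's motion signal and each tracked face's motion signal, then pick the face whose signal correlates best. The function and input names below are illustrative assumptions, not the paper's API; it presumes both signals are already resampled to the camera frame rate.

```python
import numpy as np

def match_device_to_face(accel_mag, face_tracks):
    """Pick which tracked face best matches the earable's motion by
    normalized correlation of frame-to-frame movement signals.
    accel_mag: per-frame accelerometer magnitude;
    face_tracks: dict of face_id -> per-frame vertical position."""
    def zscore(x):
        x = np.asarray(x, dtype=float)
        return (x - x.mean()) / (x.std() + 1e-9)
    # Compare motion energy: absolute frame-to-frame change.
    dev = zscore(np.abs(np.diff(accel_mag)))
    scores = {}
    for face_id, ys in face_tracks.items():
        face = zscore(np.abs(np.diff(ys)))
        scores[face_id] = float(np.dot(dev, face) / len(dev))
    return max(scores, key=scores.get)  # best-correlated face
```

With one face moving in step with the device and another moving independently, the in-step face wins.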

    OpenEarable: Open Hardware Earable Sensing Platform

    Earables are ear-worn devices that offer functionalities beyond basic audio input and output. In this paper, we present the ongoing development of a new, open-source, Arduino-based earable platform called OpenEarable. It is based on standard components, is easy to manufacture, and costs roughly $40 per device at batch size ten. We present the first version of the device, which is equipped with a series of sensors and actuators: a 3-axis accelerometer and gyroscope, an ear canal pressure and temperature sensor, an inward-facing ultrasonic microphone as well as a speaker, a push button, and a controllable LED. We demonstrate the versatility of the prototyping platform through three different example application scenarios. In sum, OpenEarable offers a general-purpose, open sensing platform for earable research and development.

    As You Are, So Shall You Move Your Head: A System-Level Analysis between Head Movements and Corresponding Traits and Emotions

    Identifying physical traits and emotions based on system-sensed physical activities is a challenging problem in the realm of human-computer interaction. Our work contributes in this context by investigating an underlying connection between head movements and corresponding traits and emotions. To do so, we utilize a head movement measuring device called eSense, which measures head acceleration and rotation. Here, first, we conduct a thorough study of head movement data collected from 46 participants using eSense while inducing five different emotional states in isolation. Our analysis reveals several new head-movement-based findings, which, in turn, lead us to a novel unified solution for identifying different human traits and emotions by exploiting machine learning techniques over head movement data. Our analysis confirms that the proposed solution can achieve high accuracy over the collected data. Accordingly, we develop an integrated unified solution for real-time emotion and trait identification using head movement data, leveraging the outcomes of our analysis.
    Comment: 9 pages, 7 figures, NSysS 201
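A machine-learning pipeline over head movement data of the kind described above typically extracts per-window statistics from the accelerometer and gyroscope streams and feeds them to a standard classifier. The feature set and model below are illustrative assumptions (the paper does not specify them here), shown with scikit-learn.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(acc, gyro):
    """Simple per-window statistics over 3-axis accelerometer and
    gyroscope streams, each shaped (n_samples, 3)."""
    feats = []
    for sig in (acc, gyro):
        sig = np.asarray(sig, dtype=float)
        feats += [sig.mean(axis=0),                        # mean per axis
                  sig.std(axis=0),                         # variability
                  np.abs(np.diff(sig, axis=0)).mean(axis=0)]  # jerkiness
    return np.concatenate(feats)  # 18-dimensional feature vector

def train_emotion_classifier(windows, labels):
    """Fit a classifier on labelled (acc, gyro) windows.
    windows: list of (acc, gyro) pairs; labels: emotion names."""
    X = np.stack([window_features(a, g) for a, g in windows])
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    return clf.fit(X, labels)
```

Prediction on a new window then reduces to `clf.predict([window_features(acc, gyro)])`.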