364 research outputs found

    Calm Displays and Their Applications: Making Emissive Displays Mimic Reflective Surfaces Using Visual Psychophysics, Light Sensing and Colour Science

    Ph.D. Thesis. Our environment is increasingly full of obtrusive display panels, which become illuminating surfaces when on, and void black rectangles when off. Some researchers argue that emissive displays are incompatible with Weiser and Seely Brown's vision of "calm technology", due to their inability to seamlessly blend into the background. Indeed, Mankoff has shown that for any ambient technology, the ability to move into the periphery is the most relevant factor in its usability. In this thesis, a background mode for displays is proposed based on the idea that displays can look like an ordinary piece of reflective paper showing the same content. The thesis consists of three main parts. In the first part (Chapter 4), human colour matching performance between an emissive display and reflective paper under chromatic lighting conditions is measured in a psychophysical experiment. We find that threshold discrimination ellipses vary with condition (16.0×6.0 ΔEab on average), with lower sensitivity to chroma than hue changes. Match distributions are bimodal for some conditions. In the second part (Chapter 5), an algorithm enabling emissive displays to look like reflective paper is described and evaluated, giving an average error of ΔEab = 10.2 between display and paper. A field study showed that paper-like displays are more acceptable in bedrooms and that people are more likely to keep them always on than normal displays. Finally, the third part (Chapter 6) concerns the development and four-week trial of a paper-like display application. Using the autobiographical design method, a system for sharing bedtime with a remote partner was developed. We see that once unobtrusive, display systems are desired for use even in spaces like bedrooms. Paper-like displays enable both emerging and existing devices to move into the periphery and become “invisible”, and therefore provide a new building block of calm technology that is not achievable using simple emissive displays.
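The colour errors above are reported in ΔEab units. As a point of reference, here is a minimal sketch of the CIE76 colour difference, which is simply the Euclidean distance between two CIELAB coordinates; the Lab values in the example are hypothetical and not taken from the thesis:

```python
import math

def delta_e_ab(lab1, lab2):
    """CIE76 colour difference: Euclidean distance in CIELAB space."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical display patch vs. paper patch (L*, a*, b*)
display = (52.0, 8.0, -3.0)
paper = (50.0, 12.0, -1.0)
print(round(delta_e_ab(display, paper), 2))  # prints 4.9
```

For scale: a ΔEab around 1 is near the just-noticeable threshold for side-by-side patches, so the thesis's average match error of 10.2 is clearly visible in direct comparison, though the reported discrimination ellipses suggest tolerance varies strongly with viewing condition.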

    ViBreathe: Heart Rate Variability Enhanced Respiration Training for Workaday Stress Management via an Eyes-free Tangible Interface

    Slow-breathing guidance applications are increasingly emerging and show promise for helping knowledge workers better cope with workaday stress. However, standard breathing guidance is non-interactive, with rigid paces. Despite their proven effects, such systems can cause respiratory fatigue or a lack of training motivation, especially for novice users. To explore new design possibilities, we investigate using heart rate variability (HRV) data to mediate breathing guidance, resulting in two HRV-enhanced guidance modes: (i) responsive breathing guidance and (ii) adaptive breathing guidance. These guidance modes are implemented on a soft haptic interface named “ViBreathe”. We conducted a user test (N = 24) and a one-week field deployment (N = 4) with knowledge workers to understand the user experience of our design. The HRV-enhanced modes were generally experienced as less tiresome and as improving engagement and comfort, and ViBreathe showed great potential for seamlessly weaving slow-breathing practice into work routines. We thereby summarize related design insights and opportunities.
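To make the idea of HRV-mediated guidance concrete, here is a minimal sketch of how a common HRV measure (RMSSD over recent RR intervals) might modulate a guided breathing pace. The thresholds, mapping, and function names are illustrative assumptions, not the paper's actual adaptation algorithm:

```python
import math

def rmssd(rr_ms):
    """Root mean square of successive differences of RR intervals (in ms),
    a standard short-term HRV measure."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def adaptive_breaths_per_min(rr_ms, base=6.0, lo=20.0, hi=60.0):
    """Hypothetical mapping from HRV to a guided pace: low HRV nudges the
    guidance slower, high HRV lets it relax back toward the baseline."""
    h = rmssd(rr_ms)
    if h < lo:   # low HRV: slow the pace slightly to deepen relaxation
        return base - 1.0
    if h > hi:   # high HRV: user appears well entrained; ease toward base
        return base + 0.5
    return base
```

Any real implementation would smooth the RR series, handle ectopic beats, and rate-limit pace changes so the haptic guidance does not jitter.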

    Challenges and Opportunities for the Design of Smart Speakers

    Advances in voice technology and voice user interfaces (VUIs) -- such as Alexa, Siri, and Google Home -- have opened up the potential for many new types of interaction. However, despite the potential of these devices reflected by the growing market and body of VUI research, there is a lingering sense that the technology is still underused. In this paper, we conducted a systematic literature review of 35 papers to identify and synthesize 127 VUI design guidelines into five themes. Additionally, we conducted semi-structured interviews with 15 smart speaker users to understand their use and non-use of the technology. From the interviews, we distill four design challenges that contribute the most to non-use. Based on their (non-)use, we identify four opportunity spaces for designers to explore, such as focusing on information support while multitasking (cooking, driving, childcare, etc.), incorporating users' mental models for smart speakers, and integrating calm design principles.

    Shared User Interfaces of Physiological Data: Systematic Review of Social Biofeedback Systems and Contexts in HCI

    Get PDF
    As an emerging interaction paradigm, physiological computing is increasingly being used to both measure and feed back information about our internal psychophysiological states. While most applications of physiological computing are designed for individual use, recent research has explored how biofeedback can be socially shared between multiple users to augment human-human communication. Reflecting on the empirical progress in this area of study, this paper presents a systematic review of 64 studies to characterize the interaction contexts and effects of social biofeedback systems. Our findings highlight the importance of physio-temporal and social contextual factors surrounding physiological data sharing, as well as how it can promote social-emotional competences on three different levels: intrapersonal, interpersonal, and task-focused. We also present the Social Biofeedback Interactions framework to articulate the current physiological-social interaction space. We use this to frame our discussion of the implications and ethical considerations for future research and design of social biofeedback interfaces.

    Towards Affective Chronometry: Exploring Smart Materials and Actuators for Real-time Representations of Changes in Arousal

    Increasing HCI work on affective interfaces has aimed to capture and communicate users’ emotions in order to support self-understanding. While most such interfaces employ traditional screen-based displays, more novel approaches have started to investigate prototypes based on smart materials and actuators. In this paper, we describe our exploration of smart materials and actuators, leveraging their temporal qualities as well as common metaphors, for real-time representation of changes in arousal through visual and haptic modalities. This exploration provided the rationale for the design and implementation of six novel wrist-worn prototypes, evaluated with 12 users who wore them over 2 days. Our findings describe how people use them in daily life, and how their material-driven qualities such as responsiveness, duration, rhythm, inertia, aliveness, and range shape people’s emotion identification, attribution, and regulation. Our findings led to four design implications, including support for affective chronometry across both the rise and decay time of an emotional response, design for slowness, and design for expressiveness.

    Towards a Better Understanding of Emotion Communication in Music: An Interactive Production Approach.

    It has been well established that composers and performers are able to encode certain emotional expressions in music, which in turn are decoded by listeners and, in general, successfully recognised. There is still much to discover, however, as to how musical cues combine to shape different emotions in the music, since previous literature has tended to focus on a limited number of cues and emotional expressions. The work in this thesis aims to investigate how combinations of tempo, articulation, pitch, dynamics, brightness, mode, and later, instrumentation, are used to shape sadness, joy, calmness, anger, fear, power, and surprise in Western tonal music. In addition, new tools for music and emotion research are presented with the aim of providing an efficient production approach to explore a large cue-emotion space in a relatively short time. To this end, a new interactive interface called EmoteControl was created which allows users to alter musical pieces in real-time through the available cues. Moreover, musical pieces were specifically composed to be used as stimuli. Empirical experiments were then carried out with the interface to determine how participants shaped different emotions in the pieces using the available cues. Specific cue combinations for the different emotions were produced. Findings revealed that overall, mode and tempo were the strongest contributors to the conveyed emotion, whilst brightness was the least effective cue. However, the importance of the cues varied depending on the intended emotion. Finally, a comparative evaluation of production and traditional approaches was carried out, which showed that similar results may be obtained with both. However, the production approach allowed for a larger cue-emotion space to be navigated in a shorter time. In sum, the production approach allowed participants to directly show us how they think emotional expressions should sound, and how they are shaped in music.