Augmented Exercise Biking with Virtual Environments for Elderly Users: Considerations on the Use of Auditory Feedback
Virtual reality (VR) has been shown to function well as an assistive technology for physical therapy with elderly users. Elderly users, and more specifically retirement home residents, form a unique user group in this field due to their characteristics and demands. In a case study, retirement home residents used an audio-visual virtual environment (VE) augmentation for an exercise bike. Besides a visual display, a soundscape was played to the subjects through headphones. The soundscape was not noticed, and the headphones were found to be obtrusive. In this paper, we consider and discuss possible alternative auditory and haptic delivery methods for future studies. These nonvisual displays need to fit the requirements and limitations of the retirement home subjects who are to exercise using the VE-based augmentation from the case study.
Brotate and Tribike: Designing Smartphone Control for Cycling
As more people commute by bicycle, more cyclists use their smartphones
while cycling, compromising traffic safety. We have
designed, implemented and evaluated two prototypes for smartphone control
devices that do not require the cyclists to remove their hands from the
handlebars - the three-button device Tribike and the rotation-controlled
Brotate. The devices were the result of a user-centred design process where we
identified the key features needed for an on-bike smartphone control device. We
evaluated the devices in a biking exercise with 19 participants, where users
completed a series of common smartphone tasks. The study showed that Brotate
allowed for significantly more lateral control of the bicycle and both devices
reduced the cognitive load required to use the smartphone. Our work contributes
insights into designing interfaces for cycling.
Comment: 22nd International Conference on Human-Computer Interaction with
Mobile Devices and Services (MobileHCI '20), October 5--8, 2020, Oldenburg,
Germany.
Supporting Eyes-Free Human–Computer Interaction with Vibrotactile Haptification
The sense of touch is crucial when we use our hands in complex tasks. Some tasks we learn to do even without sight, relying only on the sense of touch in our fingers and hands. Modern touchscreen devices, however, have lost some of that tactile feeling by removing physical controls from the interaction. Touch is also a sense that is underutilized in interactions with technology and could provide new ways of supporting users. In certain situations, users of information technology cannot fully focus on the interaction, either visually or mentally.
Humans can utilize their sense of touch more comprehensively in interactions and learn to understand tactile information while interacting with information technology. This thesis introduces a set of experiments that evaluate human capabilities to notice and understand tactile information provided by current actuator technology, and further introduces examples of haptic user interfaces (HUIs) for eyes-free use scenarios. These experiments evaluate the benefits of such interfaces for users, and the thesis concludes with guidelines and methods for creating this kind of user interface.
The experiments in this thesis can be divided into three groups. In the first group, with the first two experiments, the detection of vibrotactile stimuli and interpretation of the abstract meaning of vibrotactile feedback was evaluated. Experiments in the second group evaluated how to design rhythmic vibrotactile tactons to be basic vibrotactile primitives for HUIs. The last group of two experiments evaluated how these HUIs benefit the users in the distracted and eyes-free interaction scenarios.
The primary aim of this series of experiments was to evaluate whether the current level of actuation technology could be used more comprehensively than in present-day solutions, which are limited to simple haptic alerts and notifications. In other words, the aim was to find out whether comprehensive use of vibrotactile feedback in interactions would provide additional benefits to users compared with current haptic and nonhaptic interaction methods.
The main finding of this research is that when more comprehensive HUIs are used in eyes-free, distracted-use scenarios, such as driving a car, the user's main task of driving is performed better. Furthermore, users liked the comprehensively haptified user interfaces.
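The rhythmic vibrotactile tactons mentioned above can be thought of as short on/off pulse trains that serve as reusable vibrotactile primitives. The following is a minimal sketch of that idea; the specific rhythms, durations, and names here are illustrative assumptions, not the parameters studied in the thesis.

```python
# Sketch: rhythmic vibrotactile tactons modeled as on/off pulse trains.
# Each tacton is a list of (vibrate_ms, pause_ms) pairs; the concrete
# patterns below are illustrative assumptions, not the thesis's designs.

TACTONS = {
    "short":  [(100, 100)],                 # single brief pulse
    "double": [(100, 100), (100, 100)],     # two brief pulses
    "long":   [(400, 100)],                 # one sustained pulse
}

def total_duration(tacton):
    """Total playback time of a tacton in milliseconds."""
    return sum(on + off for on, off in tacton)

def render(tacton, step_ms=100):
    """Render a tacton as a text timeline ('#' = vibration on, '.' = off)."""
    out = []
    for on, off in tacton:
        out.append("#" * (on // step_ms) + "." * (off // step_ms))
    return "".join(out)

for name, tacton in TACTONS.items():
    print(f"{name:6s} {render(tacton)}  ({total_duration(tacton)} ms)")
```

On real hardware, such a pattern would drive an actuator on/off with the given timings; the text rendering merely makes the rhythm visible, which is how distinct tactons can encode distinct abstract meanings.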
Electrotactile Communication via Matrix Electrode Placed on the Torso Using Fast Calibration, and Static vs. Dynamic Encoding
Electrotactile stimulation is a technology that reproducibly elicits tactile sensations and can be used as an alternative channel for communicating information to the user. The presented work is part of an effort to develop this technology into an unobtrusive communication tool for first responders. In this study, the aim was to compare the success rate (SR) of discriminating stimulation at six spatial locations (static encoding) against recognizing six spatio-temporal patterns in which pads are activated sequentially in a predetermined order (dynamic encoding). Additionally, a procedure for fast amplitude calibration, comprising a semi-automated initialization and an optional manual adjustment, was employed and evaluated. Twenty subjects, including twelve first responders, participated in the study. The electrode, comprising a 3 × 2 matrix of pads, was placed on the lateral torso. The results showed that high SRs could be achieved for both types of message encoding after a short learning phase; however, the dynamic approach led to a statistically significant improvement in message recognition (SR of 93.3%) compared to static stimulation (SR of 83.3%). The proposed calibration procedure was also effective, since in 83.8% of the cases the subjects did not need to adjust the stimulation amplitude manually.
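The contrast between the two encodings can be sketched in a few lines: static encoding maps each of the six messages to a single pad location, while dynamic encoding maps each message to a sequential activation pattern over several pads. The pad indexing, sequence construction, and sequence length below are illustrative assumptions, not the stimulation parameters used in the study.

```python
# Sketch: static vs. dynamic message encoding on a 3x2 electrode matrix.
# Pad ordering, sequence construction, and sequence length are
# illustrative assumptions, not the study's actual patterns.

PADS = [(r, c) for r in range(3) for c in range(2)]  # six pad positions

def static_message(msg_id):
    """Static encoding: each of the six messages activates one pad."""
    return [PADS[msg_id]]  # a single-location stimulus

def dynamic_message(msg_id, length=3):
    """Dynamic encoding: each message is a predetermined sequential
    activation of several pads (a spatio-temporal pattern)."""
    # Rotating the pad order by msg_id gives each message a distinct
    # sequence while reusing the same six physical pads.
    order = PADS[msg_id:] + PADS[:msg_id]
    return order[:length]

for m in range(6):
    print(m, "static:", static_message(m), "dynamic:", dynamic_message(m))
```

The point of the sketch is that dynamic encoding spreads each message over time and space, which matches the study's finding that sequential patterns were easier to recognize than single-location stimuli.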