
    Using degraded music quality to encourage a health-improving walking pace: BeatClearWalker

    Meeting the target of 8000 steps/day, as recommended by many national governments and health authorities, can provide considerable physical and mental health benefits and is seen as a key target for reducing obesity levels and improving public health. However, to optimize the health benefits, walking should be performed at a "moderate" intensity. While there are numerous mobile fitness applications that monitor distance walked, none directly support walking at this cadence, nor has there been any research into live feedback for walking cadence. We present a smartphone fitness application that helps users learn how to walk at a moderate cadence and maintain it. We apply real-time audio effects that degrade the quality of the music being played whenever the target walking cadence is not being reached. This makes for an immersive and intuitive application that can easily be integrated into everyday life, as it allows users to walk while listening to their own music and encourages eyes-free interaction. In this paper, we introduce our approach and design, an initial lab evaluation and a controlled outdoor study. Results show that music degradation decreases the number of below-cadence steps, and that users felt they worked harder with our player and would use it while exercise walking.
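
    The abstract does not describe the audio processing itself; the sketch below only illustrates the feedback loop it implies, assuming step timestamps from the phone's pedometer, a target of roughly 100 steps/min for "moderate" intensity, and a simple low-pass filter as the degradation effect (all assumptions, not details from the paper).

```python
import numpy as np

TARGET_CADENCE = 100.0   # steps/min often cited for "moderate" intensity (assumed, not from the paper)

def cadence_from_steps(step_times, window_s=5.0):
    """Estimate current cadence (steps/min) from recent step timestamps (seconds)."""
    if not step_times:
        return 0.0
    now = step_times[-1]
    recent = [t for t in step_times if now - t <= window_s]
    span = recent[-1] - recent[0] if len(recent) >= 2 else 0.0
    if span <= 0.0:
        return 0.0
    return 60.0 * (len(recent) - 1) / span

def degrade(samples, amount, state=0.0):
    """One-pole low-pass filter over a block of audio samples; larger `amount` muffles more."""
    alpha = 1.0 - 0.95 * min(1.0, max(0.0, amount))   # 1.0 = untouched, 0.05 = heavily muffled
    out = np.empty_like(samples, dtype=float)
    for i, x in enumerate(samples):
        state += alpha * (x - state)
        out[i] = state
    return out, state

def process_block(samples, step_times, state=0.0):
    """Muffle the music in proportion to how far the walker falls below the target cadence."""
    cadence = cadence_from_steps(step_times)
    shortfall = max(0.0, (TARGET_CADENCE - cadence) / TARGET_CADENCE)
    return degrade(samples, shortfall, state)
```

    The point the abstract stresses is that the music itself is the feedback channel, so the walker never needs to look at the screen; the specific filter and target value above are placeholders for whatever effects and cadence threshold the authors actually used.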

    Mobile advertising effectiveness versus PC and TV using consumer neuroscience

    This doctoral thesis, entitled Mobile Advertising Effectiveness versus PC and TV, Using Consumer Neuroscience, analyzes the evolution of mobile advertising and its current situation, and examines how effective mobile advertising is when compared against advertising on other digital devices, such as PC and TV. The last few years have been characterized by an increase in the time that consumers spend on their mobile phones and, as a result, by an increase in spending on digital mobile advertising. Brands are already demanding models that measure digital advertising effectiveness, and consumer neuroscience technology may help not only to measure it but also to understand its impact on consumers. In this context, this research proposes recommendations for advertisers that may be considering using consumer neuroscience technology to measure mobile advertising effectiveness, as well as recommendations on how to design mobile ads that increase advertising effectiveness.

    ViBreathe: Heart Rate Variability Enhanced Respiration Training for Workaday Stress Management via an Eyes-free Tangible Interface

    Slow-breathing guidance applications are increasingly emerging, showing promise for helping knowledge workers cope better with workaday stress. However, standard breathing guidance is non-interactive and uses rigid paces: although its effects have been proven, it can cause respiratory fatigue or a lack of training motivation, especially for novice users. To explore new design possibilities, we investigate using heart rate variability (HRV) data to mediate breathing guidance, resulting in two HRV-enhanced guidance modes: (i) responsive breathing guidance and (ii) adaptive breathing guidance. These guidance modes are implemented on a soft haptic interface named "ViBreathe". We conducted a user test (N = 24) and a one-week field deployment (N = 4) with knowledge workers to understand the user experience of our design. The HRV-enhanced modes were generally experienced as less tiring and as improving engagement and comfort, and ViBreathe showed great potential for seamlessly weaving slow-breathing practice into work routines. We thereby summarize related design insights and opportunities.
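
    The abstract does not specify how the HRV data mediates the guidance; the sketch below is only one plausible reading of the "adaptive" mode, assuming RMSSD over a sliding window of RR intervals as the HRV measure and a guided pace that slows while HRV keeps rising (an assumed rule, not the authors' implementation).

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences, a common short-term HRV measure."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    if not diffs:
        return 0.0
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def adapt_guidance_pace(current_bpm, rr_window_ms, prev_rmssd,
                        min_bpm=4.0, max_bpm=8.0, step=0.25):
    """Assumed adaptive rule: slow the guided breathing pace while the user's HRV
    keeps rising, and ease back toward a faster pace when it drops, instead of
    forcing a single rigid pace. The returned pace would drive the haptic
    interface's inflate/deflate rhythm."""
    hrv = rmssd(rr_window_ms)
    if hrv >= prev_rmssd:
        current_bpm = max(min_bpm, current_bpm - step)   # user is coping well: guide slower breathing
    else:
        current_bpm = min(max_bpm, current_bpm + step)   # back off slightly
    return current_bpm, hrv
```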

    Innovating Control and Emotional Expressive Modalities of User Interfaces for People with Locked-in Syndrome

    Patients with locked-in syndrome (LIS) have lost the ability to control any body part besides their eyes. Current solutions mainly use eye-tracking cameras to track patients' gaze as system input. However, even though interface design greatly impacts user experience, only a few guidelines have been proposed so far to ensure an easy, quick, fluid and non-tiresome computer system for these patients. At the same time, the emergence of dedicated computer software has greatly increased patients' capabilities, but there is still a great need for improvement, as existing systems present low usability and limited capabilities. Most interfaces designed for LIS patients aim at providing internet browsing or communication abilities. State-of-the-art augmentative and alternative communication systems mainly focus on sentence communication without considering the need for emotional expression, which is inextricable from human communication. This thesis aims at exploring new system control and expressive modalities for people with LIS.
    Firstly, existing gaze-based web-browsing interfaces were investigated. Page analysis and high mental workload appeared as recurring issues with common systems. To address this, a novel user interface was designed and evaluated against a commercial system. The results suggested that it is easier to learn and to use, quicker, more satisfying, less frustrating, less tiring and less prone to error; mental workload was greatly diminished with this system. Other types of system control for LIS patients were then investigated. It was found that galvanic skin response may be used as system input and that stress-related biofeedback helped lower mental workload during stressful tasks.
    Improving communication, and in particular emotional communication, was one of the main goals of this research. A system including gaze-controlled emotional voice synthesis and a personal emotional avatar was developed for this purpose. Assessment of the proposed system highlighted the enhanced capability to hold dialogues closer to normal ones and to express and identify emotions. Enabling emotion communication in parallel with sentences was found to help the conversation. Automatic emotion detection seemed to be the next step toward improving emotional communication. Several studies have established that physiological signals relate to emotions, and the fact that physiological sensors are non-invasive and usable with LIS patients made them ideal candidates for this work. One of the main difficulties of emotion detection is the collection of high-intensity affect-related data: studies in this field are currently mostly limited to laboratory investigations, using laboratory-induced emotions, and are rarely adapted to real-life applications. A virtual-reality emotion elicitation technique based on appraisal theories was proposed in order to study physiological signals of high-intensity emotions in a real-life-like environment. While this solution successfully elicited positive and negative emotions, it did not elicit the desired emotions for all subjects and was therefore not appropriate for the goals of this research. Collecting emotions in the wild appeared as the best methodology toward emotion detection for real-life applications. The state of the art in the field was therefore reviewed and assessed using a specifically designed method for evaluating datasets collected for emotion recognition in real-life applications; the proposed evaluation method provides guidelines for future researchers in the field. Based on these findings, a mobile application was developed for physiological and emotional data collection in the wild. Grounded in appraisal theory, the application guides users to provide valuable emotion labels and helps them differentiate moods from emotions. A sample dataset collected with this application was compared to one collected in a paper-based preliminary study; the former was found to be more valuable, with data consistent with the literature. The application was then used to create an open-source database of affect-related physiological signals. While the path toward emotion detection usable in real-life applications is still long, we hope that the tools provided to the research community represent a step toward achieving this goal.
    Automatic emotion detection could be used not only to help LIS patients communicate but also to support total-LIS patients, who have lost the ability to move their eyes: giving family and caregivers the ability to visualize, and therefore understand, the patient's emotional state could greatly improve their quality of life. This research provided LIS patients and the scientific community with tools to improve augmentative and alternative communication technologies: better interfaces, emotion expression capabilities and real-life emotion detection. Emotion recognition methods for real-life applications could enhance not only health care but also robotics, domotics and many other fields. A complete, fully gaze-controlled system bundling all the solutions developed for LIS patients was made available as open source. This is expected to enhance their daily lives by improving their communication and by facilitating the development of novel assistive system capabilities.
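
    The abstract describes gaze as the main input channel but not the selection mechanism itself; the sketch below shows dwell-time selection, a common technique in gaze-controlled interfaces, purely as an illustration (the DwellSelector class, its API and the 1-second threshold are hypothetical, not taken from the thesis).

```python
import time

class DwellSelector:
    """Illustrative dwell-time selection: trigger a target once gaze rests on it
    long enough. All names and the default threshold are assumptions."""
    def __init__(self, dwell_time_s=1.0):
        self.dwell_time_s = dwell_time_s
        self.current_target = None
        self.enter_time = None

    def update(self, target_under_gaze, now=None):
        """Feed the UI element currently under the user's gaze (or None).
        Returns that element once the dwell threshold is reached, else None."""
        now = time.monotonic() if now is None else now
        if target_under_gaze != self.current_target:
            self.current_target = target_under_gaze   # gaze moved: restart the dwell timer
            self.enter_time = now
            return None
        if target_under_gaze is not None and now - self.enter_time >= self.dwell_time_s:
            self.enter_time = now                     # reset to avoid repeated triggering
            return target_under_gaze
        return None
```

    The dwell threshold embodies the usability trade-off the thesis evaluates: a longer dwell reduces accidental activations but makes interaction slower and more tiring.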

    Design and Effect of Continuous Wearable Tactile Displays

    Our sense of touch is one of our core senses and, while not as information-rich as sight and hearing, it tethers us to reality. Our skin is the largest sensory organ in our body, and we rely on it so much that we don't think about it most of the time. Tactile displays - with the exception of actuators for notifications on smartphones and smartwatches - are currently understudied and underused: tactile cues today mostly notify the user of an incoming call or text message. Continuous displays in particular - displays that do not just deliver a single notification but stay active for an extended period of time and continuously communicate information - are rarely studied. This thesis explores how our vibration perception can be used to create continuous tactile displays. Transmitting a continuous stream of tactile information to a user in a wearable format can help elevate tactile displays from being mostly used for notifications to becoming more like additional senses, enabling us to perceive our environment in new ways. This work provides a substantial step forward in the design, effect and use of continuous tactile displays in human-computer interaction. The main contributions include:
    Exploration of Continuous Wearable Tactile Interfaces: This thesis explores continuous tactile displays in different contexts and with different types of tactile information systems. Use cases were explored in several domains for tactile displays: sports, gaming and business applications. The displays feature one- or multidimensional tactile patterns, temporal patterns and discrete tactile patterns.
    Automatic Generation of Personalized Vibration Patterns: This thesis describes a novel approach to designing vibrotactile patterns without expert knowledge by leveraging evolutionary algorithms to create personalized vibration patterns, and presents a human-centered evolutionary algorithm that generates abstract vibration patterns. The algorithm was tested in a user study, which offered evidence that interactive generation of abstract vibration patterns is possible and yields diverse sets of patterns that can be recognized with high accuracy (a sketch of such a loop follows this abstract).
    Passive Haptic Learning for Vibration Patterns: Previous studies in passive haptic learning have shown surprisingly strong results for learning Morse code. If these findings could be confirmed and generalized, learning a new tactile alphabet could be made easier and could happen in passing. This claim was therefore investigated in this thesis and needed to be corrected and contextualized. A user study examined the effects of interaction design and distraction tasks on the ability to learn stimulus-stimulus associations through passive haptic learning. The evidence presented shows that passive haptic learning of vibration patterns induces only a marginal learning effect and is not a feasible or efficient way to learn vibration patterns that include more than two vibrations.
    Influence of Reference Frames for Spatial Tactile Stimuli: Designing wearable tactile stimuli that contain spatial information can be challenging due to the natural body movement of the wearer, so an important consideration is which reference frame to use for spatial cues. This thesis investigated allocentric versus egocentric reference frames on the wrist and compared them for induced cognitive load, reaction time and accuracy in a user study. The evidence shows that an allocentric reference frame drastically lowers cognitive load and slightly lowers reaction time while keeping the same accuracy as an egocentric reference frame, making a strong case for the use of allocentric reference frames in tactile bracelets with several actuators.
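
    The abstract does not give the evolutionary algorithm's details; the following is only a minimal sketch of an interactive evolutionary loop under assumed choices: a pattern encoded as a list of (amplitude, duration) pulses, and the user's rating of each pattern played on the actuator used directly as the fitness function.

```python
import random

# Assumed encoding: a vibration pattern is a list of (amplitude in 0..1, duration in ms) pulses.
def random_pattern(n_pulses=4):
    return [(random.random(), random.randint(50, 400)) for _ in range(n_pulses)]

def mutate(pattern, rate=0.3):
    """Randomly perturb some pulses' amplitude and duration."""
    return [(min(1.0, max(0.0, a + random.uniform(-0.2, 0.2))),
             max(50, d + random.randint(-50, 50)))
            if random.random() < rate else (a, d)
            for a, d in pattern]

def crossover(p1, p2):
    """Single-point crossover between two parent patterns."""
    cut = random.randint(1, min(len(p1), len(p2)) - 1)
    return p1[:cut] + p2[cut:]

def evolve(rate_fn, pop_size=8, generations=10):
    """Interactive loop: rate_fn(pattern) plays the pattern to the user and returns
    their rating, which serves as the fitness function (hypothetical interface)."""
    population = [random_pattern() for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=rate_fn, reverse=True)
        parents = ranked[: pop_size // 2]                     # keep the best-rated half
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=rate_fn)
```

    Keeping the population and generation counts small matters in this interactive setting because every fitness evaluation requires the user to feel and rate a pattern; the encoding and operators above are assumptions, not the thesis's actual design.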