9 research outputs found

    Multimodal feedback for mid-air gestures when driving

    Mid-air gestures in cars are being used by an increasing number of drivers on the road. Usability concerns mean good feedback is important, but a balance needs to be found between supporting interaction and reducing distraction in an already demanding environment. Visual feedback is most commonly used, but takes visual attention away from driving. This thesis investigates novel non-visual alternatives to support the driver during mid-air gesture interaction: Cutaneous Push, Peripheral Lights, and Ultrasound feedback. These modalities lack the expressive capabilities of high-resolution screens, but are intended to allow drivers to focus on the driving task. A new form of haptic feedback, Cutaneous Push, was defined. Six solenoids were embedded along the rim of the steering wheel, creating three bumps under each palm. Studies 1, 2, and 3 investigated the efficacy of novel static and dynamic Cutaneous Push patterns, and their impact on driving performance, in simulated driving studies. The results showed pattern identification rates of up to 81.3% for static patterns and 73.5% for dynamic patterns, and 100% recognition of directional cues. Cutaneous Push notifications did not impact driving behaviour or workload and showed very high user acceptance. Cutaneous Push patterns have the potential to make driving safer by providing non-visual and instantaneous messages, for example to indicate an approaching cyclist or obstacle. Studies 4 & 5 looked at novel uni- and bimodal feedback combinations of Visual, Auditory, Cutaneous Push, and Peripheral Lights for mid-air gestures, and found that non-visual feedback modalities, especially when combined bimodally, offered just as much support for interaction without negatively affecting driving performance, visual attention, or cognitive demand. These results provide compelling support for using non-visual feedback from in-car systems, supporting input whilst letting drivers focus on driving. Studies 6 & 7 investigated the above bimodal combinations as well as uni- and bimodal Ultrasound feedback during the Lane Change Task, to assess the impact of gesturing and feedback modality on car control during more challenging driving. The results of Study 7 suggest that Visual and Ultrasound feedback are not appropriate for in-car use unless combined multimodally; if Ultrasound is used unimodally, it is more useful in a binary scenario. Findings from Studies 5, 6, and 7 suggest that multimodal feedback significantly reduces eyes-off-the-road time compared to Visual feedback without compromising driving performance or perceived user workload, and thus can potentially reduce crash risk. Novel design recommendations for providing feedback during mid-air gesture interaction in cars are presented, informed by the experimental findings.
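
    To make the distinction between static and dynamic Cutaneous Push patterns concrete, here is a minimal sketch that models a static pattern as a set of simultaneously actuated solenoids and a dynamic pattern as a timed sequence of such sets; the data types, timings, and example patterns are illustrative assumptions, not taken from the thesis.

        from dataclasses import dataclass
        from typing import FrozenSet, List, Tuple

        # Six solenoids along the steering wheel rim, three under each palm.
        # The indexing (0-2 left hand, 3-5 right hand) is assumed for illustration.
        StaticPattern = FrozenSet[int]          # solenoids actuated at the same time

        @dataclass
        class DynamicPattern:
            """A timed sequence of static patterns, played back in order."""
            steps: List[Tuple[StaticPattern, float]]   # (solenoids, duration in seconds)

        # Hypothetical examples, not taken from the thesis:
        symmetric_bump = frozenset({0, 3})             # one bump under each palm
        left_to_right_sweep = DynamicPattern(steps=[
            (frozenset({0}), 0.2),
            (frozenset({1}), 0.2),
            (frozenset({2}), 0.2),
        ])                                             # e.g. a directional cue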

    Scale-Score: Investigation of a Meta yet Multi-level Label to Support Nutritious and Sustainable Food Choices When Online Grocery Shopping

    Food consumption is one of the biggest contributors to climate change. However, online grocery shoppers often lack the time, motivation, or knowledge to contemplate a food's environmental impact. At the same time, they are concerned with their own well-being. To empower grocery shoppers in making nutritionally and environmentally informed decisions, we investigate the efficacy of the Scale-Score, a label combining nutritional and environmental information to highlight a product's benefit to both the consumer's and the planet's health, without obscuring either kind of information. We conducted an online survey to understand user needs and requirements regarding a joint food label, developed an open-source mock online grocery environment, and assessed label efficacy. We find that the Scale-Score supports nutritious purchases, yet needs improvement regarding sustainability support. Our research provides first insights into design considerations and performance of a combined yet disjoint food label, potentially altering the label design space.
    Comment: Work in progress. arXiv admin note: text overlap with arXiv:2309.0323

    Bimodal Feedback for In-car Mid-air Gesture Interaction

    This demonstration showcases novel multimodal feedback designs for in-car mid-air gesture interaction. It explores the potential of multimodal feedback types for mid-air gestures in cars and how these can reduce eyes-off-the-road time, thus making driving safer. We will show four different bimodal feedback combinations to provide effective information about interaction with systems in a car. These feedback techniques are visual-auditory, auditory-ambient (peripheral vision), ambient-tactile, and tactile-auditory. Users can interact with the system after a short introduction, creating an exciting opportunity to deploy these displays in cars in the future.

    Evaluation of Haptic Patterns on a Steering Wheel

    Infotainment systems can increase mental workload and divert visual attention away from the road ahead. When these systems give information to the driver, providing it through the tactile channel on the steering wheel might improve driving behaviour and safety. This paper describes an investigation into the perceivability of haptic feedback patterns using an actuated surface on a steering wheel. Six solenoids were embedded along the rim of the steering wheel, creating three bumps under each palm. At most four of the six solenoids were actuated simultaneously, resulting in 56 patterns to test. Participants were asked to keep to the middle of the road in the driving simulator as well as possible. Overall recognition accuracy of the haptic patterns was 81.3%, with identification rate increasing as the number of active solenoids decreased (up to 92.2% for a single solenoid). There was no significant increase in lane deviation or steering angle during haptic pattern presentation. These results suggest that drivers can reliably distinguish between cutaneous patterns presented on the steering wheel. Our findings can assist in delivering non-critical messages to the driver (e.g. driving performance, incoming text messages, etc.) without decreasing driving performance or increasing perceived mental workload.
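
    As a rough illustration of where the 56 patterns come from, the following Python sketch simply enumerates every combination of one to four of the six solenoids; the solenoid indexing is assumed for illustration and is not taken from the paper.

        from itertools import combinations

        # Six solenoids along the steering wheel rim; indices are illustrative only
        # (e.g. 0-2 under the left palm, 3-5 under the right palm).
        solenoids = range(6)

        # A pattern actuates between one and four solenoids at the same time.
        patterns = [combo for k in range(1, 5) for combo in combinations(solenoids, k)]

        # C(6,1) + C(6,2) + C(6,3) + C(6,4) = 6 + 15 + 20 + 15 = 56
        print(len(patterns))  # prints 56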

    Envirofy your Shop: Development of a Real-time Tool to Support Eco-friendly Food Purchases Online

    A third of global greenhouse gas (GHG) emissions are attributable to the food sector; however, dietary change could reduce this by 49%. Many people intend to make eco-friendly food choices, but fail to do so at the point of purchase. Educating consumers on the environmental impact of their choices during their shop may be a powerful approach to tackling climate change. This paper presents the theory- and evidence-based development of Envirofy: the first eco-friendly e-commerce grocery tool for real shoppers. We share how we used the Behaviour Change Wheel (BCW) and multidisciplinary evidence to maximise the likely effectiveness of Envirofy. We conclude with a discussion of how the HCI community can help to develop and evaluate real-time tools to close intention-behaviour gaps and ultimately reduce GHG emissions.

    Evaluating Haptic Feedback on a Steering Wheel in a Simulated Driving Scenario

    This paper investigates how perceivable haptic feedback patterns are when presented via an actuated surface on a steering wheel. Six solenoids were embedded along the surface of the wheel, creating three bumps under each palm. The solenoids can be used to create a range of different tactile patterns. Following the design recommendation by Gallace et al. [Gallace2006a], at most four of the six solenoids were actuated simultaneously, resulting in 57 patterns to test. A simulated driving study was conducted to investigate (1) the optimal number of actuated solenoids and (2) the most perceivable haptic patterns. A relationship between the number of actuated solenoids and pattern identification rate was established: perception accuracy drops above three active solenoids. Haptic patterns mirrored symmetrically on both hands were perceived more accurately. Practical applications for displaying tactile messages on the steering wheel include, e.g., blind spots, upcoming road conditions, and navigation information (i.e. conveying information discreetly to the driver).

    Eco-Joy: Imagining Sustainable and Joyful Food Eco-label Futures

    A third of global greenhouse gas (GHG) emissions are attributable to the food sector; however, dietary change could reduce this by half. Educating consumers on the environmental impact of their choices through eco-labels, as a form of sustainability signalling, may be a powerful approach to tackling climate change if it can bring about a large-scale transition in households and supply chains. When designing interactive systems and applications that integrate eco-labels, however, we need to take into account the different barriers that exist (e.g. lack of knowledge, climate anxiety, complexity of food systems, affordability). The aim of this workshop is to imagine the future of eco-labels and sustainability signalling in ways that are both effective and joyful. The workshop invites researchers and practitioners to discuss current eco-labels and to creatively design the future of sustainability signalling with emerging technology.