
    Novel Multimodal Feedback Techniques for In-Car Mid-Air Gesture Interaction

    This paper presents an investigation into the effects of different feedback modalities on mid-air gesture interaction for in-car infotainment systems. Car crashes and near-crash events are most commonly caused by driver distraction. Mid-air interaction can reduce driver distraction by lowering the visual demand of infotainment systems. Despite a range of available modalities, feedback in mid-air gesture systems is generally provided through visual displays. We conducted a simulated driving study to investigate how different types of multimodal feedback can support in-air gestures. The effects of different feedback modalities on eye gaze behaviour and on the driving and gesturing tasks are considered. We found that feedback modality influenced gesturing behaviour; moreover, drivers corrected falsely executed gestures more often in non-visual conditions. Our findings show that non-visual feedback can reduce visual distraction significantly.

    The Role and Potentials of Field User Interaction Data in the Automotive UX Development Lifecycle: An Industry Perspective

    We are interested in the role of field user interaction data in the development of in-vehicle infotainment systems (IVIS), the potential practitioners see in analyzing this data, the concerns they share, and how this compares to companies with digital products. We conducted interviews with 14 UX professionals, 8 from automotive and 6 from digital companies, and analyzed the results by emergent thematic coding. Our key findings indicate that implicit feedback through field user interaction data is currently not evident in the automotive UX development process. Most decisions regarding the design of IVIS are made based on the personal preferences and intuitions of stakeholders. However, the interviewees also indicated that user interaction data has the potential to lower the influence of guesswork and assumptions in the UX design process and can help to make the UX development lifecycle more evidence-based and user-centered.

    Accessible Automated Automotive Workshop Series (A3WS): International Perspective on Inclusive External Human-Machine Interfaces

    The fact that automated vehicles will be part of road traffic raises the question of how human road users, such as bicyclists or pedestrians, can safely interact with them. Research has proposed external human-machine interfaces (eHMIs) for automated vehicles as a potential solution. Concept prototypes and evaluations so far have mainly focused on young, healthy adults and people without disabilities, such as visual impairments. For a “one-for-all” holistic, inclusive solution, however, further target groups such as children, seniors, or people with (other) special needs will have to be considered. In this workshop, we bring together researchers, experts, and practitioners working on eHMIs to broaden our perspective on inclusiveness. We aim to identify aspects of inclusive eHMI design that can be universal as well as tailored to any culture, and will focus on discussing methods, tools, and scenarios for inclusive communication.