12 research outputs found

    User experience comparison among touchless, haptic and voice Head-Up Displays interfaces in automobiles

    This paper evaluates driving experiences when using a Head-Up Display (HUD) system with different interaction methods. HUDs on the automotive market do not follow standard interaction methods (Betancur et al. in Int J Interact Des Manuf 12(1):199–214, [3]), so it is difficult to identify which of these methods are the most appropriate, in terms of usability, for implementation in HUDs. The paper focuses on comparing the mental workload a driver could experience while interacting with a specific HUD visual interface via touchless gestures, haptic, or voice methods. Test subjects (n = 15) concurrently performed a driving activity and a set of in-vehicle tasks using these interaction methods through a HUD. The experiments show that the haptic method was better accepted than the touchless gestures and voice methods. Nevertheless, the touchless gestures method was explored under some specific usability configurations to demonstrate that it was not significantly different from the haptic method. This study was financially supported by the Pontificia Universidad Javeriana (Bogota, Colombia), project “Análisis del comportamiento del conductor cuando se utilizan sistemas automotrices Head-Up Display” (Analysis of driver behavior when using automotive Head-Up Display systems), ID 07228.

    Tactile and Crossmodal Change Blindness and its Implications for Display Design.

    Data overload, especially in the visual channel, and associated breakdowns in monitoring already represent a major challenge in data-rich environments. One promising means of overcoming data overload is the introduction of multimodal displays, i.e., displays that distribute information across various sensory channels (including vision, audition, and touch). This approach has been shown to be effective in offloading the overburdened visual channel and thus reducing data overload. However, the effectiveness of these displays may be compromised if their design does not take into consideration the limitations of human perception and cognition. One important question is the extent to which the tactile modality is susceptible to change blindness. Change blindness refers to the failure to detect even large and expected changes when these changes coincide with a “transient” stimulus. To date, the phenomenon has been studied primarily in vision, but there is limited empirical evidence that the tactile modality may also be subject to change blindness. If confirmed, this raises concerns about the robustness of multimodal displays and their use. A series of research activities described in this dissertation sought to answer the following questions: (1) to what extent, and under what circumstances, is the sense of touch susceptible to change blindness, (2) does change blindness occur crossmodally between vision and touch, and (3) how effective are three different display types at overcoming these phenomena. The effects of transient type, transient duration, and task demands were also investigated in the context of Unmanned Aerial Vehicle (UAV) control, the selected domain of application. The findings confirmed the occurrence of intramodal tactile change blindness, but not crossmodal change blindness. Subsequently, three countermeasures to intramodal tactile change blindness were developed and evaluated. The design of these countermeasures focused on supporting four of the five steps required for change detection and was found to significantly improve performance compared to when no countermeasure was in place. Overall, this research adds to the knowledge base in multimodal and redundant information processing and can inform the design of multimodal displays not only for UAV control, but also for other complex, data-rich domains.
    PhD dissertation, Industrial & Operations Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/108870/1/salu_1.pd

    Advances in Human-Robot Interaction

    Rapid advances in the field of robotics have made it possible to use robots not just in industrial automation but also in entertainment, rehabilitation, and home service. Since robots are likely to affect many aspects of human existence, fundamental questions of human-robot interaction must be formulated and, if at all possible, resolved. Some of these questions are addressed in this collection of papers by leading HRI researchers.

    Proceedings of the Human Factors and Ergonomics Society Europe Chapter 2022 Annual Conference
