6 research outputs found

    Editorial: Presence and beyond: Evaluating user experience in AR/MR/VR

    The call for this Research Topic was intentionally broad: we sought papers that identify or propose constructs that can be used to describe AR/MR/VR experiences, papers that evaluate the utility of those constructs, and papers that discuss measures relating to user experience in AR/MR/VR, including, but not limited to, presence. In the end, we were very happy to publish fifteen articles addressing many of these questions, though notably not all of them. In the remainder of this editorial, we briefly introduce each of the fifteen articles, loosely grouping them into three relevant categories. We then discuss each of the three categories in turn and close with a call to action for our AR/MR/VR research community to engage more actively with human-computer interaction (HCI) and user experience (UX) researchers.

    Isolating the Effect of Off-Road Glance Duration on Driving Performance: An Exemplar Study Comparing HDD and HUD in Different Driving Scenarios

    Objective: We controlled participants’ glance behavior while they used head-down displays (HDDs) and head-up displays (HUDs) to isolate driving behavioral changes due to display type across different driving environments. Background: Recently, HUD technology has been incorporated into vehicles, allowing drivers, in theory, to gather display information without moving their eyes away from the road. Previous studies comparing the impact of HUDs with that of traditional displays on human performance show differences in both drivers’ visual attention and driving performance. Yet no studies have isolated glance behavior from driving behavior, which limits our ability to understand the cause of these differences and their resulting impact on display design. Method: We developed a novel method to control visual attention in a driving simulator. Twenty experienced drivers sustained visual attention to in-vehicle HDDs and HUDs while driving in both a simple environment (a straight, empty roadway) and a more realistic environment that included traffic and turns. Results: In the realistic environment, but not the simpler one, we found evidence of differing driving behaviors between display conditions, even though participants’ glance behavior was similar. Conclusion: Thus, the assumption that visual attention can be evaluated in the same way for different types of vehicle displays may be inaccurate. Differences between driving environments call into question the validity of testing HUDs in simplistic driving environments. Application: As we move toward the integration of HUD user interfaces into vehicles, it is important that we develop new, sensitive assessment methods to ensure HUD interfaces are indeed safe for driving.

    A Perceptual Color-Matching Method for Examining Color Blending in Augmented Reality Head-Up Display Graphics

    Augmented reality (AR) offers new ways to visualize information on the go. As noted in related work, AR graphics presented via optical see-through AR displays are particularly prone to color blending, whereby intended graphic colors may be perceptually altered by real-world backgrounds, ultimately degrading usability. This work adds to this body of knowledge by presenting a methodology for assessing AR interface color robustness, measured quantitatively via shifts in the CIE color space and assessed qualitatively in terms of users’ perceived color names. We conducted a human factors study in which twelve participants examined eight AR colors atop three real-world backgrounds as viewed through an in-vehicle AR head-up display (HUD), a type of optical see-through display used to project driving-related information atop the forward-looking road scene. Participants completed visual search tasks, matched the perceived AR HUD color against the WCS color palette, and verbally named the perceived color. We present analysis suggesting that blue, green, and yellow AR colors are relatively robust, while red and brown are not, and discuss the impact of chromaticity shift and dispersion on outdoor AR interface design. While this work presents a case study in transportation, the methodology is applicable to a wide range of AR displays in many application domains and settings.

    The effects of augmented reality head-up displays on drivers' eye scan patterns, performance, and perceptions

    This paper reports on an experiment comparing head-up display (HUD) and head-down display (HDD) use while driving in a simulator to explore differences in glance patterns, driving performance, and user preferences. Sixteen participants completed both structured (text) and semi-structured (grid) visual search tasks on each display while following a lead vehicle in a motorway (highway) environment. Participants experienced three levels of complexity (low, medium, high) for each visual search task, with five repetitions of each level. Results suggest that the grid task was not sensitive enough to the varying visual demands, while the text task showed significant differences between displays in user preference, perceived workload, and distraction. As complexity increased, HUD use during the text task corresponded with faster performance compared to the HDD, indicating potential benefits of HUDs in the driving context. Furthermore, HUD use was associated with longer sustained glances (at the respective display) compared to the HDD, with no differences in driving performance observed. This finding suggests that AR HUDs afford longer glances without negatively affecting the longitudinal and lateral control of the vehicle, a result that has implications for how future researchers should evaluate the visual demands of AR HUDs.

    Augmented Mirrors: Depth Judgments When Augmenting Video Displays to Replace Automotive Mirrors

    This study investigates the effects of augmented reality (AR) graphics on drivers’ distance estimation and depth perception when using a video-based, AR-enhanced driver’s side mirror. Sixteen participants took part in the study, eight in a driving simulator and eight outside in a stationary vehicle. Participants experienced three different AR display image conditions, three different glance patterns, three different target vehicle speeds, and two own-vehicle image conditions. Distance data and confidence data were collected for each participant and analyzed for correlations between the conditions and performance. The results suggest that the various AR images affected depth judgments and confidence levels. In addition, the vehicle speed and glance pattern of the videos also had significant effects.

    Determining the impact of augmented reality graphic spatial location and motion on driver behaviors

    While researchers have explored the benefits of adding augmented reality graphics to vehicle displays, the impact of graphic characteristics has not been well researched. In this paper, we consider the impact of augmented reality graphic spatial location and motion, as well as turn direction, traffic presence, and gender, on participants' driving and glance behavior and preferences. Twenty-two participants navigated through a simulated environment while using four different graphics. We employed a novel glance allocation analysis to differentiate, with more granularity, the information likely gathered with each glance. Fixed graphics generally resulted in less visual attention and more time scanning for hazards than animated graphics. Finally, participants preferred the screen-fixed graphic over all world-relative graphics, suggesting that spatial integration of graphics into the world may not always be necessary in visually complex urban environments like those considered in this study.