2 research outputs found
The Effect of Luminance on Depth Perception in Augmented Reality Guided Laparoscopic Surgery
Depth perception is a major issue in surgical augmented reality (AR), yet limited research has been conducted in this area. This study establishes a relationship between luminance and depth perception, which can be used to improve visualisation design for AR overlays in laparoscopic surgery, providing surgeons with a more accurate perception of the anatomy intraoperatively. Two experiments were conducted to determine this relationship: first, an online study with 59 participants from the general public, and second, an in-person study with 10 surgeons. We developed two open-source software tools utilising SciKit-Surgery libraries to
enable these studies and any future research. Our findings demonstrate that the higher the relative luminance, the closer a structure is perceived to be to the operating camera. Furthermore, the higher the luminance contrast between two structures, the greater the perceived depth distance between them. The quantitative results from both experiments are in agreement, indicating that online recruitment of
the general public can be helpful in similar studies. The surgeons in the in-person study observed that the light source used in laparoscopic surgery also plays a role in depth perception, since its varying position and brightness could affect the perception of the overlaid AR. We found that luminance directly correlates with depth perception for both
surgeons and the general public, regardless of other depth cues. Future research may focus on
comparing different colours used in surgical AR and using a mock operating room (OR) with
varying light sources and positions.
The Effects of Object Shape, Fidelity, Color, and Luminance on Depth Perception in Handheld Mobile Augmented Reality
Depth perception of objects can greatly affect a user's experience of an
augmented reality (AR) application. Many AR applications require depth matching
of real and virtual objects and have the possibility to be influenced by depth
cues. Color and luminance are depth cues that have been traditionally studied
in two-dimensional (2D) objects. However, there is little research
investigating how the properties of three-dimensional (3D) virtual objects
interact with color and luminance to affect depth perception, despite the
substantial use of 3D objects in visual applications. In this paper, we present
the results of a paired comparison experiment that investigates the effects of
object shape, fidelity, color, and luminance on depth perception of 3D objects
in handheld mobile AR. The results of our study indicate that bright colors are
perceived as nearer than dark colors for a high-fidelity, simple 3D object,
regardless of hue. Additionally, bright red is perceived as nearer than any
other color. These effects were not observed for a low-fidelity version of the
simple object or for a more complex 3D object. High-fidelity objects had more
perceptual differences than low-fidelity objects, indicating that fidelity
interacts with color and luminance to affect depth perception. These findings
reveal how the properties of 3D models influence the effects of color and
luminance on depth perception in handheld mobile AR and can help developers
select colors for their applications.
Comment: 9 pages, in proceedings of IEEE International Symposium on Mixed and Augmented Reality (ISMAR) 202