15 research outputs found

    Integrated Multi-view 3D Image Capture and Motion Parallax 3D Display System

    We propose an integrated 3D image capture and display system using a transversely moving camera, a regular 2D display screen, and user tracking, which facilitates multi-view capture of a real scene or object and displays the captured perspective views in 3D. The motion parallax 3D technique is used to capture the depth information of the object and display the corresponding views to the user via head tracking. The system is composed of two parts: the first consists of a horizontally moving camera interfaced with a customized camera control and capture application; the second consists of a regular LCD screen combined with a web camera and a user tracking application. The multi-view 3D images captured through the imaging setup are relayed to the display, and the corresponding view is dynamically shown on the screen based on the user's viewing angle with respect to the screen. The developed prototype captures 60 views with a step size of 1 cm and more than 40° of field-of-view overlap. The display relays the 60 views over a viewing-angle coverage of ±35°, with an angular difference of 1.2° between adjacent views.
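
    The view-selection step described above is simple enough to sketch: with 60 views covering ±35°, the tracked head angle maps linearly onto a view index. A minimal illustration follows; the function and constant names are ours, not from the paper.

```python
# Hedged sketch of head-tracked view selection for a motion-parallax display,
# using the parameters reported in the abstract (60 views, ±35° coverage).

N_VIEWS = 60
COVERAGE_DEG = 35.0   # display covers viewing angles of ±35°

def view_index(head_angle_deg: float) -> int:
    """Select the captured view closest to the viewer's angle.

    Angles outside ±35° are clamped to the edge views.
    """
    a = max(-COVERAGE_DEG, min(COVERAGE_DEG, head_angle_deg))
    # Normalize [-35, +35] -> [0, 1], then scale to view indices 0..59.
    t = (a + COVERAGE_DEG) / (2 * COVERAGE_DEG)
    return min(N_VIEWS - 1, int(t * N_VIEWS))

# Angular step between adjacent views: 70° / 60 ≈ 1.17°, consistent with
# the ~1.2° quoted in the abstract.
step_deg = 2 * COVERAGE_DEG / N_VIEWS
```

    Clamping at the edges means a viewer moving past ±35° simply keeps seeing the outermost captured view rather than wrapping around.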

    Design and Development of a Multi-Sided Tabletop Augmented Reality 3D Display Coupled with Remote 3D Imaging Module

    This paper proposes a tabletop augmented reality (AR) 3D display paired with a remote 3D image capture setup that provides three-dimensional AR visualization of remote objects or persons in real time. The front-side view is presented in stereo-3D format, while the left-side and right-side views are visualized in 2D format. Transparent glass surfaces are used to demonstrate volumetric 3D augmentation of the captured object. The developed AR display prototype mainly consists of four 40 × 30 cm² LCD panels, 54% partially reflective glass, an in-house developed housing assembly, and a processing unit. The capture setup consists of four 720p cameras to capture the front-side stereo view and both the left- and right-side views. Real-time remote operation is demonstrated by connecting the display and imaging units through the Internet. Various system characteristics, such as viewing-angle range, stereo crosstalk, polarization preservation, frame rate, and the amount of light reflected and transmitted through the partially reflective glass, were examined. The demonstrated system provided 35% optical transparency and less than 4% stereo crosstalk within a viewing angle of ±20°. An average frame rate of 7.5 frames per second was achieved at a per-view resolution of 240 × 240 pixels.
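
    The reported figures allow a quick sanity check of the light budget through the partially reflective glass. The sketch below is a simplified model: the lumped loss term is our assumption, not a figure from the paper.

```python
# Simplified light-budget model for a 54% partially reflective combiner.
# R_glass and T_measured are taken from the abstract; 'extra_loss' lumps
# absorption, scattering, and other system losses together (an assumption
# of this sketch, not a reported measurement).

R_glass = 0.54                     # glass reflectivity (display light toward viewer)
T_measured = 0.35                  # measured see-through optical transparency

T_ideal = 1.0 - R_glass            # upper bound on transmission for lossless glass
extra_loss = T_ideal - T_measured  # gap attributable to absorption/scattering etc.

print(f"ideal transmission {T_ideal:.0%}, measured {T_measured:.0%}, "
      f"unaccounted ~{extra_loss:.0%}")
```

    In other words, a lossless 54% reflective glass could transmit at most 46% of the scene light; the measured 35% transparency implies roughly 11% of incident light is lost elsewhere in this simple model.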

    Large-scale Huygens metasurfaces for holographic 3D near-eye displays

    Novel display technologies aim to provide users with increasingly immersive experiences. In this regard, it is a long-sought goal to generate three-dimensional (3D) scenes with high resolution and continuous depth that can be overlaid with the real world. Current attempts, however, fail to provide either truly 3D information or a large viewing area and angle, strongly limiting user immersion. Here, we report a proof-of-concept solution to this problem and realize a compact holographic 3D near-eye display with a large exit pupil of 10 mm × 8.66 mm. The 3D image is generated from a highly transparent Huygens metasurface hologram with a large (>10^8) pixel count and subwavelength pixels, fabricated via deep-ultraviolet immersion photolithography on 300 mm glass wafers. We experimentally demonstrate high-quality virtual 3D scenes with ~50k active data points and continuous depth ranging from 0.5 m to 2 m, overlaid with the real world and easily viewed by the naked eye. To do so, we introduce a new design method for holographic near-eye displays that inherently provides both parallax and accommodation cues, fundamentally solving the vergence-accommodation conflict present in current commercial 3D displays. (Comment: 21 pages, 9 figures)
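
    As a rough illustration of how a hologram encodes points at different depths, the textbook point-cloud (spherical-wave superposition) computation can be sketched in a few lines. This is a generic model, not the paper's metasurface design method; the wavelength, pixel pitch, grid size, and point positions below are all illustrative assumptions.

```python
# Generic point-cloud hologram sketch: sum spherical waves from 3D points
# on a sampled hologram plane and keep the wrapped phase. The real device
# has >1e8 subwavelength pixels; this toy grid is 256x256.
import numpy as np

wavelength = 532e-9            # assumed green wavelength, metres
k = 2 * np.pi / wavelength     # wavenumber

pitch = 1e-6                   # assumed pixel pitch, metres
n = 256
x = (np.arange(n) - n / 2) * pitch
X, Y = np.meshgrid(x, x)

def point_hologram_phase(points):
    """Superpose spherical waves from (x, y, z) points; return phase in [0, 2π]."""
    field = np.zeros((n, n), dtype=complex)
    for px, py, pz in points:
        r = np.sqrt((X - px) ** 2 + (Y - py) ** 2 + pz ** 2)
        field += np.exp(1j * k * r) / r   # spherical wave, 1/r amplitude falloff
    return np.angle(field) % (2 * np.pi)

# Two points at the depth extremes reported in the abstract (0.5 m and 2 m).
phase = point_hologram_phase([(0.0, 0.0, 0.5), (50e-6, 0.0, 2.0)])
```

    Because each point contributes a full spherical wavefront, the reconstructed image naturally carries both parallax and accommodation cues, which is the property the paper's design method exploits at much larger scale.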

    A micromirror array with annular partitioning for high-speed random-access axial focusing

    Dynamic axial focusing functionality has recently seen widespread incorporation in microscopy, augmented/virtual reality (AR/VR), adaptive optics, and material processing. However, the limitations of existing varifocal tools continue to constrain the performance and operating overhead of the optical systems that employ such functionality. The varifocal tools that are the least burdensome to drive (e.g., liquid crystal, elastomeric, or optofluidic lenses) suffer from low (~100 Hz) refresh rates. Conversely, the fastest devices sacrifice either critical capabilities such as their dwelling capacity (e.g., acoustic gradient lenses or monolithic micromechanical mirrors) or low operating overhead (e.g., deformable mirrors). Here, we present a general-purpose random-access axial focusing device that bridges these previously conflicting features of high speed, dwelling capacity, and lightweight drive by employing low-rigidity micromirrors that exploit the robustness of defocusing phase profiles. Geometrically, the device consists of an 8.2 mm diameter array of piston-motion, 48 µm-pitch micromirror pixels that provide 2π phase shifting for wavelengths shorter than 1100 nm, with 10–90% settling in 64.8 µs (i.e., a 15.44 kHz refresh rate). The pixels are electrically partitioned into 32 rings for a driving scheme that enables phase-wrapped operation with circular symmetry and requires less than 30 V per channel. Optical experiments demonstrated the array's wide focusing range, with a measured ability to target 29 distinct, resolvable depth planes. Overall, the features of the proposed array offer the potential for compact, straightforward methods of tackling bottlenecked applications, including high-throughput single-cell targeting in neurobiology and the delivery of dense 3D visual information in AR/VR. (Comment: 38 pages, 8 figures)
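
    The phase-wrapped, ring-partitioned defocus scheme described above can be sketched with the standard thin-lens quadratic phase profile, sampled once per ring and wrapped to 2π. The equal-area ring layout, wavelength, and focal length below are illustrative assumptions, not the device's actual calibration.

```python
# Hedged sketch: per-ring phase targets for a ring-partitioned defocusing
# mirror with phase wrapping, using the aperture (8.2 mm) and ring count (32)
# reported in the abstract. Wavelength and focal length are assumptions.
import numpy as np

wavelength = 1.0e-6        # metres (device supports wavelengths < 1100 nm)
n_rings = 32
radius = 8.2e-3 / 2        # 8.2 mm diameter aperture

def wrapped_defocus_phases(focal_length_m):
    """Quadratic defocus phase sampled at each ring's mean radius, mod 2π."""
    # Equal-area rings: edges spaced uniformly in r², matching the quadratic
    # phase profile (an assumed layout, not necessarily the device's).
    edges = radius * np.sqrt(np.arange(n_rings + 1) / n_rings)
    r_mid = 0.5 * (edges[:-1] + edges[1:])
    phi = -np.pi * r_mid ** 2 / (wavelength * focal_length_m)  # thin-lens defocus
    return phi % (2 * np.pi)                                   # phase wrapping

phases = wrapped_defocus_phases(0.5)   # e.g., target a focal plane at 0.5 m
```

    Because each ring only ever needs a phase in [0, 2π), the required piston stroke stays small regardless of focal power, which is what keeps the drive voltage per channel low.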

    Optical Gaze Tracking with Spatially-Sparse Single-Pixel Detectors

    Gaze tracking is an essential component of next-generation displays for virtual reality and augmented reality applications. Traditional camera-based gaze trackers used in next-generation displays are known to fall short in one or more of the following metrics: power consumption, cost, computational complexity, estimation accuracy, latency, and form factor. We propose the use of discrete photodiodes and light-emitting diodes (LEDs) as an alternative to traditional camera-based gaze tracking approaches, taking all of these metrics into consideration. We begin by developing a rendering-based simulation framework for understanding the relationship between light sources and a virtual model eyeball. Findings from this framework inform the placement of LEDs and photodiodes. Our first prototype uses a neural network to obtain an average error rate of 2.67° at 400 Hz while demanding only 16 mW. By simplifying the implementation to use only LEDs, duplexed as light transceivers, and a lighter-weight machine-learning model, namely a supervised Gaussian process regression algorithm, we show that our second prototype achieves an average error rate of 1.57° at 250 Hz using 800 mW. (Comment: 10 pages, 8 figures, published in IEEE International Symposium on Mixed and Augmented Reality (ISMAR) 202)
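
    The second prototype's regression step, mapping sensor readings to gaze angles, can be sketched with a minimal Gaussian process regression in plain NumPy. The channel count, kernel choice, and synthetic calibration data below are placeholders, not the paper's actual setup.

```python
# Minimal GP regression sketch: map light-sensor readings to 2D gaze angles.
# All data here are synthetic stand-ins for a real calibration session.
import numpy as np

def rbf_kernel(A, B, length_scale=1.0):
    """Squared-exponential kernel between the row vectors of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale ** 2)

def gp_predict(X_train, y_train, X_test, noise=1e-4):
    """Posterior mean of a zero-mean GP with RBF kernel at X_test."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    return rbf_kernel(X_test, X_train) @ np.linalg.solve(K, y_train)

rng = np.random.default_rng(0)
# Synthetic calibration: 8 assumed sensor channels -> (yaw, pitch) in degrees.
X_cal = rng.uniform(0.0, 1.0, size=(100, 8))
y_cal = (X_cal @ rng.normal(size=(8, 2))) * 10.0

gaze = gp_predict(X_cal, y_cal, X_cal[:5])   # predictions at 5 known points
```

    A model of this size is cheap enough to evaluate at hundreds of hertz on embedded hardware, which is the motivation for preferring it over a neural network in the low-power prototype.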

    Optimization of Computer generated holography rendering and optical design for a compact and large eyebox Augmented Reality glass

    Thesis (Master of Science in Informatics)--University of Tsukuba, no. 41288, 2019.3.2

    Light-field head-mounted displays reduce the visual effort: A user study

    Head-mounted displays (HMDs) allow the visualization of virtual content and the change of view perspective in virtual reality (VR). Besides entertainment purposes, such displays also find application in augmented reality, VR training, and tele-robotic systems. The quality of visual feedback plays a key role in the interaction performance of such setups. In recent years, high-end computers and displays have reduced simulator sickness with respect to nausea symptoms, while new visualization technologies are required to further reduce oculomotor and disorientation symptoms. The so-called vergence-accommodation conflict (VAC) in standard stereoscopic displays has so far prevented intensive use of 3D displays. The VAC describes the visual mismatch between the projected stereoscopic 3D image and the optical distance to the HMD screen. This conflict can be resolved by using displays with correct focal distance. The light-field HMD of this study provides close-to-continuous depth and high image resolution, enabling highly natural visualization. This paper presents the first user study on the visual comfort of light-field displays with a close-to-market HMD, based on complex interaction tasks. The results provide first evidence that light-field technology brings clear benefits to the user in terms of physical comfort, workload, and depth-matching performance.