Integrated optical phased arrays for three-dimensional display applications

Abstract

This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections. Thesis: Ph.D., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2020. Cataloged from the student-submitted PDF of the thesis. Includes bibliographical references (pages 154-164).

The compatibility of silicon photonic platforms with complementary metal-oxide-semiconductor (CMOS) fabrication processes has facilitated a surge in the development of silicon-based integrated optical phased arrays (OPAs) for light detection and ranging (LiDAR) and free-space communications. However, silicon is limited to operating at infrared wavelengths because its bandgap prevents the transmission of visible light. Developing integrated OPAs for arbitrary complex wavefront synthesis in the visible spectrum would expand this technology into a multitude of new application spaces, such as optical trapping, imaging through scattering media, underwater LiDAR, optogenetic stimulation, and three-dimensional (3D) displays. Silicon nitride, a CMOS-compatible material that is transparent in the visible spectrum, may be used as the waveguiding material in phased array systems designed for these applications.

In this work, we develop large-scale visible-light integrated OPA systems, fabricated in a silicon-nitride-based platform, for 3D display applications. We begin by presenting the first demonstrations of visible-light integrated OPAs. Building on this, we demonstrate a chip-scale architecture for autostereoscopic image projection that uses a system of multiple integrated OPAs to reconstruct virtual light fields. Specifically, we generate a static virtual 3D image with horizontal parallax and a viewing angle of 5°. Next, we present an architecture for realizing a transparent, near-eye, direct-view augmented/mixed reality (AR/MR) display that uses a system of integrated OPAs to project holographic images directly onto the user's retina. This display architecture was developed to address the deficiencies of current AR/MR headsets with respect to brightness, field of view (FOV), and the vergence-accommodation conflict, which causes eye fatigue. Here, we present a passive demonstration of the display as well as a number of key photonic components required to realize a system for 3D video.

by Manan Raval. Ph.D., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science.
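The display architectures summarized above rest on a common principle: an integrated OPA shapes a free-space wavefront by setting the optical phase emitted at each antenna element. As a rough, self-contained illustration of that principle (a sketch under assumed parameters, not code or values from the thesis), the Python snippet below computes the far-field array factor of a one-dimensional phased array and shows how a linear phase ramp across the emitters steers the main lobe; the wavelength, emitter pitch, and element count are hypothetical.

import numpy as np

# Illustrative, assumed parameters (not taken from the thesis).
wavelength = 635e-9   # red visible light [m]
pitch = 1.5e-6        # emitter spacing [m]
n_emitters = 64       # number of array elements

def array_factor(theta, phase_ramp):
    # Coherent sum of the emitter fields at observation angle theta [rad].
    # phase_ramp is the per-element phase increment applied by the phase
    # shifters; a linear ramp steers the main lobe.
    n = np.arange(n_emitters)
    k = 2 * np.pi / wavelength
    field = np.exp(1j * (k * pitch * np.sin(theta) * n + phase_ramp * n))
    return np.abs(field.sum())**2 / n_emitters**2

# Steer the main lobe to 2 degrees: sin(theta0) = -phase_ramp * wavelength / (2*pi*pitch).
theta0 = np.deg2rad(2.0)
ramp = -2 * np.pi * pitch * np.sin(theta0) / wavelength

angles = np.deg2rad(np.linspace(-10, 10, 2001))
pattern = [array_factor(t, ramp) for t in angles]
print(f"main lobe at {np.rad2deg(angles[int(np.argmax(pattern))]):.2f} degrees")  # ~2.00

Replacing the single linear ramp with an arbitrary per-element phase profile, and extending the sum to a two-dimensional aperture, is what allows the systems described in the abstract to synthesize holographic wavefronts rather than a single steered beam.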
