23 research outputs found

    Out-of-plane focusing grating couplers for silicon photonics integration with optical MRAM technology

We present the design methodology and experimental characterization of compact out-of-plane focusing grating couplers (FGCs) for integration with magnetoresistive random access memory technology. Focusing grating couplers have recently attracted attention as layer couplers for photonic-electronic integration. The components we demonstrate are designed for a wavelength of 1550 nm, fabricated in a standard 220 nm SOI photonic platform, and optimized within the fabrication constraints of standard 193 nm UV lithography. For the first time, we extend the phase-matching-based design to a two-dimensional (2-D) grating with two optical input ports. We further present the experimental characterization of the focusing behaviour by spatially probing the emitted beam with a tapered-and-lensed fiber and demonstrate the polarization-controlling capabilities of the 2-D FGCs.
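As background for the phase-matching-based design mentioned above, the standard first-order grating-coupler relation links the grating period to the emission angle. The snippet below is a generic textbook form in our own notation, not a result quoted from the paper:

    % First-order grating phase-matching condition (textbook form; our own notation).
    %   n_eff  : effective index of the guided mode
    %   n_c    : refractive index of the cladding into which the beam is emitted
    %   \theta : out-of-plane emission angle measured from the surface normal
    %   \Lambda: local grating period, \lambda: free-space wavelength
    \[
      n_c \sin\theta = n_{\mathrm{eff}} - \frac{\lambda}{\Lambda}
    \]
    % Fixing theta at lambda = 1550 nm sets Lambda; in a focusing design the grating
    % lines are curved so that this condition holds locally about a common focal point.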

    Optical studies of photonic crystals and high index-contrast microphotonic circuits

Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Physics, 2006. Includes bibliographical references (p. 137-143).
Both high index-contrast (HIC) photonic crystals and HIC microphotonic circuits are presented in this thesis. Studies of macro-scale 2D photonic crystal meta-materials are first described. Through comparison of experimental and theoretical beam evolution about the super-collimation frequencies, the effects of disorder on beam evolution are pinpointed. Despite the effects of disorder, super-collimation is found to be robust, producing stationary beam-widths over 600 isotropic diffraction-lengths. In addition, nano-scale photonic crystal defect modes are studied over large optical bandwidths through newly developed supercontinuum-based techniques. Novel all-fiber supercontinuum sources facilitate the generation of unpolarized supercontinuum light over 1.2-2.0 micron wavelengths. Broadband experimental methods make possible the application of these sources to the study of 1D and 3D photonic crystals with defect states. Studies of both static and dynamic microring-resonator-based HIC filters are described. Numerous microring-based studies are reported which lead to frequency-compensated multi-ring filters, permitting the first high-fidelity microring filters in HIC microphotonics. Though telecom-grade performance was achieved via frequency compensation, the aforementioned filters exhibit severe polarization sensitivities, making them incompatible with real-world applications. Through integration of identical sets of these filters in a generalized polarization diversity scheme, polarization insensitive HIC filters are demonstrated for the first time, yielding a maximum polarization dependent loss of 2.2 dB over broad bandwidths. Finally, evanescent field-perturbation is investigated as a means of tuning microcavities over ultrawide wavelength ranges. Through nano-metric control of a silica perturbing body in the near-field of a microring waveguide, a 27 nm (or 1.7%) reversible tuning of its cavity mode is achieved.
by Peter Thomas Rakich. Ph.D.
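As a quick consistency check of the tuning figures quoted above (our own arithmetic, not a statement from the record), a 27 nm shift corresponding to 1.7% of the cavity-mode wavelength places the resonance near the telecom band:

    \[
      \lambda_0 \approx \frac{\Delta\lambda}{0.017} = \frac{27\ \text{nm}}{0.017} \approx 1.6\ \mu\text{m}
    \]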

    Interleavers

The chapter describes principles, analysis, design, properties, and implementations of optical frequency (or wavelength) interleavers. The emphasis is on finite impulse response (FIR) devices based on cascaded Mach-Zehnder-type filter elements with carefully designed coupling ratios, the so-called resonant couplers. Another important class that is discussed is the infinite impulse response (IIR) type, based on, e.g., Fabry-Perot, Gires-Tournois, or ring resonators.
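To make the cascaded Mach-Zehnder structure concrete, here is a minimal transfer-matrix sketch in Python. It is an illustrative model only: the coupling ratios and the single differential delay below are arbitrary example values, not designs taken from the chapter.

    import numpy as np

    # Minimal transfer-matrix model of a Mach-Zehnder-type lattice interleaver stage.

    def coupler(kappa):
        """2x2 directional coupler; power coupling ratio is sin^2(kappa)."""
        return np.array([[np.cos(kappa), -1j * np.sin(kappa)],
                         [-1j * np.sin(kappa), np.cos(kappa)]])

    def delay(phi):
        """Differential phase phi = 2*pi*f*tau accumulated in the longer arm."""
        return np.array([[np.exp(-1j * phi), 0.0],
                         [0.0, 1.0]])

    def lattice(phi, kappas):
        """Cascade: coupler, delay, coupler, delay, ... ending with a coupler."""
        m = coupler(kappas[0])
        for k in kappas[1:]:
            m = coupler(k) @ delay(phi) @ m
        return m

    # Sweep the differential phase over two free spectral ranges.
    phis = np.linspace(0.0, 4.0 * np.pi, 1001)
    kappas = [np.pi / 4, np.pi / 4]          # simplest case: one 50/50 MZI stage
    bar = np.array([abs(lattice(p, kappas)[0, 0]) ** 2 for p in phis])
    cross = 1.0 - bar                        # lossless two-port, power conserved

    # bar and cross are complementary periodic passbands: the even/odd channel
    # separation an interleaver performs.
    print(bar.max(), cross.max())

Longer lattices with carefully chosen coupling angles (the resonant-coupler idea) flatten and sharpen these passbands, while the IIR variants replace the feed-forward delays with resonant elements such as rings or Gires-Tournois etalons.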

    Novel structures and applications of leaky thin-ridge silicon waveguides

The ability to utilize signals at optical frequencies, as opposed to, say, microwave frequencies, provides much more bandwidth and signal transmission speed to meet the increasing telecommunication demands of today's world. The ability to integrate optical circuits in the same manner as electronic integrated circuits means that optical devices can be miniaturized and can even complement today's complex electronic circuits and devices. Silicon nanophotonics is a highly attractive platform for emerging integrated optical solutions in areas including optical signal transmission, signal processing, optical sensing and optical computing, primarily because the silicon platform is compatible with CMOS fabrication processes, which through significant investment have developed and matured over many years to serve the electronics industry. Transitioning to an optical platform that can exploit this vast electronics manufacturing industry is particularly viable for enabling low-cost mass manufacturing of integrated photonic circuits. High refractive index contrast silicon waveguide platforms such as silicon-on-insulator (SOI) enable strong confinement of light in sub-micron waveguides as well as sharp waveguide bends with minimal loss. The SOI platform has therefore attracted research interest in the development of compact integrated silicon photonic circuits.
Thin-ridge SOI waveguides are particularly promising because they minimize signal transmission loss by significantly reducing the waveguide etch depth and therefore reducing scattering losses due to sidewall roughness. However, a consequence of the reduced etch depth is the possibility for TM guided modes to couple to highly coherent TE radiation in the adjacent slab. This TM-TE coupling phenomenon, named lateral leakage radiation, is the subject of this thesis. The main aim of this thesis is to investigate the possible exploitation of this inherent TM-TE coupling relationship. The novel structures presented herein could have potential applications including optical biosensing, polarization rotation and resonant optical filtering.
The main contributions of this research work include, first and foremost, the discovery of a resonant TM-TE coupling effect in thin-ridge waveguides. This resonance has a canonical Lorentzian response, and its quality factor can be controlled by adjusting the waveguide dimensions. It is also shown that several such resonator waveguides can be cascaded in a coupled-resonator topology to realize higher-order Chebyshev filter responses. Another contribution of this thesis is that a holography-based grating structure exploiting the TM-TE coupling in thin-ridge waveguides can be used to efficiently convert a Gaussian TE slab beam into a collimated TM slab beam; an apodized grating is shown to be the most suitable design for achieving this goal. Lastly, it is also shown through simulation that the lateral leakage effect can be utilized as a biosensor to measure refractive index changes at the surface of a thin-ridge waveguide caused by the deposition of biomolecules. A tapered thin-ridge waveguide in tandem with a planar lens structure is proposed as a potential sensor topology for evanescent-field biosensing. In summary, it has been shown that lateral leakage in thin-ridge waveguides can be enhanced using unique waveguide structures and exploited for integrated optical applications.
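For reference, the canonical Lorentzian line shape mentioned above has the standard form (generic textbook expression in our own notation, not the thesis's specific derivation):

    \[
      T(\omega) = \frac{(\Gamma/2)^2}{(\omega-\omega_0)^2 + (\Gamma/2)^2},
      \qquad Q = \frac{\omega_0}{\Gamma}
    \]

where \omega_0 is the resonance frequency and \Gamma the full width at half maximum; adjusting the waveguide dimensions changes the leakage rate and hence \Gamma and Q.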

    2023 Astrophotonics Roadmap: pathways to realizing multi-functional integrated astrophotonic instruments

This is the final version, available on open access from IOP Publishing via the DOI in this record. Data availability statement: the data that support the findings of this study are available upon reasonable request from the authors.
Photonic technologies offer numerous functionalities that can be used to realize astrophotonic instruments. The most spectacular example to date is the ESO Gravity instrument at the Very Large Telescope in Chile, which combines the light-gathering power of four 8 m telescopes through a complex photonic interferometer. Fully integrated astrophotonic devices stand to offer critical advantages for instrument development, including extreme miniaturization when operating at the diffraction limit, as well as integration, superior thermal and mechanical stabilization owing to the small footprint, and high replicability offering significant cost savings. Numerous astrophotonic technologies have been developed to address shortcomings of conventional instruments, including, for example, photonic lanterns to convert from multimode inputs to single-mode outputs, complex aperiodic fiber Bragg gratings to filter OH emission from the atmosphere, complex beam combiners to enable long-baseline interferometry (e.g. ESO Gravity), and laser frequency combs for high-precision spectral calibration of spectrometers. Despite these successes, the facility implementation of photonic solutions in astronomical instrumentation is currently limited because of (1) low throughputs from coupling to fibers, coupling fibers to chips, propagation and bend losses, device losses, etc., (2) difficulties with scaling to the large channel-count devices needed for large bandwidths and high resolutions, and (3) efficient integration of photonics with detectors, to name a few. In this roadmap, we identify 24 key areas that need further development. We outline the challenges and advances needed across those areas, covering design tools, simulation capabilities, fabrication processes, the need for entirely new components, integration and hybridization, and the characterization of devices. To realize these advances the astrophotonics community will have to work cooperatively with industrial partners who have more advanced manufacturing capabilities. With the advances described herein, multi-functional integrated instruments will be realized, leading to novel observing capabilities for both ground- and space-based platforms, enabling new scientific studies and discoveries.
National Science Foundation (NSF); NAS

    Devices for satellite-assisted quantum networks

Quantum networks, quantum nodes interconnected by quantum channels, offer powerful means of secure communication and quantum computation. They are crucial elements in a broad area of quantum technologies including quantum simulation and metrology. In particular, quantum links with satellites take the network to a global or greater scale, extending the capability of transmitting information, and also provide experimental platforms for testing quantum physics in a relativistic regime. The realization of satellite-assisted quantum networks requires devices that are interfaced with quantum optical channels to satellites. This thesis discusses the development of four essential devices, three of which are in line with Canada's Quantum Encryption and Science Satellite (QEYSSat) mission.
First, polarization-entangled photon sources are developed to transmit one of the paired photons over ground-based fiber-optic networks and the other over ground-to-satellite free-space links. A practical and versatile interferometric scheme is designed and demonstrated, which allows constructing highly non-degenerate sources with only conventional polarization optics. A method of directly producing entangled photon pairs from optical fibers without interferometers is studied with thorough numerical analysis to show the feasibility of an experimental demonstration. An entangled photon source for the QEYSSat mission is conceptually designed, and several key parameters needed to fulfill a set of performance requirements are theoretically studied and experimentally verified.
Second, this thesis presents two characterization platforms for optical components that are designed and implemented for the QEYSSat mission. One precisely measures transmitted wavefronts of large optics, including telescopes. A proof-of-principle experiment is conducted with accurate modelling of the measurement apparatus via three-dimensional raytracing, and quantitative agreement between the experiment and simulations validates our methodology. The other provides polarization characterization for a variety of optical components, including lenses, mirrors, and telescopes, with consistent precision. A detailed description of the subsystems, including calibrations and test procedures, is provided, and polarization-test results of several components for QEYSSat are discussed.
Third, quantum frequency transducers are developed for single-photon quantum key distribution over QEYSSat links. The devices are designed to translate the wavelength of single photons emitted from quantum dot single-photon sources to the QEYSSat channel wavelength via a four-wave mixing Bragg-scattering process. Two optical media are considered: a silicon nitride ring resonator and a photonic crystal fiber. Thorough numerical simulations are performed to estimate the device performance for both cases, and a proof-of-principle demonstration of the frequency translation is conducted with a commercial photonic crystal fiber.
Finally, a quantum simulator, serving as a quantum node in satellite-assisted quantum networks, is designed in a silicon nitride nanophotonic platform with cesium atoms. The designed photonic structure tailors the electromagnetic vacuum such that photon-mediated forces between atoms cause collective motions mediating site-selective SU(2) spin-spin interactions. The coherent spin-exchange rate between atoms and the collective dissipation rate are precisely estimated via finite-element time-domain simulations. Furthermore, two schemes for trapping atoms in the vicinity of the designed structure are studied with calculations of potential energies and phonon tunneling rates, and experimental progress toward realization of the proposed system is summarized.
The presented research activities of designing, analyzing, and implementing devices demonstrate the readiness of satellite-assisted quantum networks. This work contributes to creating entanglement-based quantum channels that interface various quantum systems, in line with the broader goal of establishing a global quantum internet and enabling quantum space exploration.
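As background for the frequency-translation scheme described above, four-wave mixing Bragg scattering obeys a standard energy-conservation relation; the expression below is the generic textbook form in our own notation, not a design equation taken from the thesis:

    \[
      \omega_{\mathrm{out}} = \omega_{\mathrm{in}} + \omega_{p1} - \omega_{p2}
    \]

The single photon at \omega_in is thus shifted by the detuning between the two pumps to the target (QEYSSat channel) frequency, ideally preserving its quantum state and adding no noise; the silicon nitride ring or photonic crystal fiber provides the nonlinearity and phase matching.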

    Polarization independent microphotonic circuits

Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2005. Includes bibliographical references (p. 165-170).
Microphotonic circuits have been proposed for applications ranging from optical switching and routing to optical logic circuits. However, many applications require microphotonic circuits to be polarization independent, a requirement that is difficult to achieve with the high index contrast waveguides needed to form microphotonic devices. Chief among these microphotonic circuits is the optical add/drop multiplexer, which requires polarization independence to mate to the standard single-mode fiber forming today's optical networks. Herein, we present the results of an effort to circumvent the polarization dependence of a microphotonic add/drop multiplexer with an integrated polarization diversity scheme. Rather than attempt to overcome the polarization dependence of the microphotonic devices in the circuit directly, the arbitrary polarization emanating from the fiber is split into orthogonal components, one of which is rotated to enable a single on-chip polarization. The outputs are passed through identical sets of devices and recombined at the output through the reverse process. While at the time of this publication the full polarization diversity scheme has yet to be implemented, the sub-components have demonstrated best-in-class performance, leaving integration as the remaining task. We present the results of a significant effort to design the integrated polarization rotators, splitters, and splitter-rotators needed to implement the integrated polarization diversity scheme. Rigorous electromagnetic simulations were used to design these devices along with the microring-resonator-based filters used to form the optical add/drop multiplexer microphotonic circuit. These device designs were passed on to fabrication, and the fabricated devices were characterized and the results compared to theoretical predictions. The integrated polarization rotators and splitters demonstrated broadband, low-loss, and low-crosstalk performance, while the integrated polarization splitter-rotators demonstrated equally impressive performance and represent the first demonstrations of a device of this kind. Similarly impressive performance was exhibited by the microring-resonator filters, which achieved the deepest through-port extinction and largest free spectral range of a functioning high-order microring-resonator filter.
by Michael Robert Watts. Ph.D.
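To illustrate the polarization diversity principle described above, here is a minimal numerical sketch; the filter amplitude H_TE is an arbitrary example value and the bookkeeping is our own simplification (lossless, perfectly matched splitter and rotator), not a model of the thesis's devices.

    import numpy as np

    # Minimal sketch of a polarization diversity scheme.
    H_TE = 0.7 * np.exp(1j * 0.3)       # example response of the TE-only device at one wavelength

    def diversity_transmission(jones_in):
        """Split into orthogonal components, rotate one to TE, pass both through
        identical TE-only devices, rotate back, and recombine (all assumed lossless)."""
        e1, e2 = jones_in               # components after the polarization splitter
        out1 = H_TE * e1                # path 1: already TE
        out2 = H_TE * e2                # path 2: rotated to TE, filtered, rotated back
        return abs(out1) ** 2 + abs(out2) ** 2

    # Any input state of polarization yields the same total transmission |H_TE|^2:
    for theta in np.linspace(0.0, np.pi / 2, 5):
        state = np.array([np.cos(theta), np.sin(theta) * np.exp(1j * 0.8)])
        print(f"angle {theta:.2f} rad -> transmission {diversity_transmission(state):.4f}")

Because both paths see the same response, the recombined output is independent of the input polarization; in practice the residual polarization dependence is set by how well the splitter, rotator, and the two copies of the device are matched.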

    Optical Delay Interferometers and their Application for Self-coherent Detection

Self-coherent receivers are promising candidates for the reception of 100 Gbit/s data rates in optical networks. They consist of multiple optical delay interferometers (DIs) with high-speed photodiodes attached to the outputs. Digital signal processing (DSP) of the photocurrents then makes it possible to receive coherently modulated optical signals. Especially promising for 100 Gbit/s networks is the PolMUX-DQPSK format, the self-coherent reception of which is described in detail.
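As an illustration of the delay-interferometer principle behind self-coherent DQPSK reception, here is a minimal baseband sketch under idealized assumptions (noiseless channel, one sample per symbol, ideal one-symbol-delay DIs with balanced detection); the parameter names and the specific encoding convention are our own, not taken from the chapter.

    import numpy as np

    rng = np.random.default_rng(0)

    # Transmitter: DQPSK puts 2 bits per symbol in the phase *difference*
    # between consecutive symbols (0, pi/2, pi, 3pi/2).
    num_symbols = 1000
    data = rng.integers(0, 4, num_symbols)
    field = np.exp(1j * np.cumsum(data * np.pi / 2))   # unit-amplitude field, 1 sample/symbol

    # Receiver: two delay interferometers with a one-symbol delay and +pi/4 / -pi/4
    # phase offsets, each followed by ideal balanced photodetection:
    #   i(t) ~ Re{ E(t) E*(t - T) exp(-j * phi_DI) }
    delayed = np.roll(field, 1)
    prod = field[1:] * np.conj(delayed[1:])            # first sample has no valid predecessor
    i_branch = np.real(prod * np.exp(-1j * np.pi / 4))
    q_branch = np.real(prod * np.exp(+1j * np.pi / 4))

    # Decision: the signs of the two balanced photocurrents identify the quadrant
    # of the differential phase and hence the transmitted symbol.
    msb = (i_branch < 0).astype(int)
    lsb = (q_branch < 0).astype(int) ^ msb
    est = 2 * msb + lsb

    print("symbol errors:", np.count_nonzero(est != data[1:]), "of", num_symbols - 1)

A PolMUX-DQPSK receiver would duplicate this pair of DIs for each polarization tributary after polarization demultiplexing.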

    Certification of many-body bosonic interference in 3D photonic chips

Quantum information and quantum optics have reached several milestones during the last two decades. Starting from the 1980s, when Feynman and others laid the foundations of quantum computation and information, recent years have seen significant progress in both theoretical and experimental aspects. A series of quantum algorithms has been proposed that promise computational speed-ups with respect to their classical counterparts. If fully exploited, quantum computers are expected to markedly outperform classical ones in several specific tasks. More generally, quantum computers would change the paradigm of what we currently consider efficiently computable, being based on a completely different way of encoding and processing data that relies on uniquely quantum-mechanical properties such as linear superposition and entanglement. The building block of quantum computation is the qubit, which incorporates in its definition the revolutionary aspects that would enable overcoming classical computation in terms of efficiency and security. However, recent technological developments have claimed the realization of devices with hundreds of controllable qubits, provoking an important debate about what exactly constitutes a quantum computing process and how to unambiguously recognize the presence of a quantum speed-up. Nevertheless, the question of what exactly makes a quantum computer faster than a classical one currently has no clear answer. Its applications could range from cryptography, with a significant enhancement in terms of security, to communication and the simulation of quantum systems. In particular, in the latter case it was shown by Feynman that some problems in quantum mechanics are intractable by means of only classical approaches, due to the exponential increase in the dimension of the Hilbert space. Clearly, the question of where quantum capabilities in computation are significant is still open, and the difficulty of answering it has led the scientific community to focus its efforts on developing these kinds of systems. As a consequence, significant progress has been made in trapped ions, superconducting circuits, neutral atoms and linear optics, permitting the first implementations of such devices.
Among all the schemes introduced, the linear-optics approach uses photons to encode information and is believed to be promising for most tasks. For instance, photons are important for quantum communication and cryptography protocols because of their natural tendency to behave as "flying" qubits. Moreover, with identical properties (energy, polarization, spatial and temporal profiles), indistinguishable photons can interfere with each other owing to their bosonic nature. These features have a direct application in performing quantum protocols; indeed, they are suitable for several recent schemes such as graph- and cluster-state photonic quantum computation. In particular, it has been proved that universal quantum computation is possible using only simple optical elements, single-photon sources, number-resolving photodetectors and adaptive measurements, thus confirming the pivotal importance of these particles. Although the importance of linear optics has been confirmed in the last decades, its potential was already anticipated years before, when (1) Burnham et al. discovered spontaneous parametric down-conversion (SPDC), (2) Hong, Ou and Mandel discovered the namesake (HOM) effect and (3) Reck et al. showed how a particular combination of simple optical elements can reproduce any unitary transformation. (1) SPDC consists of the generation of entangled photon pairs in a nonlinear crystal pumped with a strong laser; despite recent advances in other approaches, it has been the keystone of single-photon generation for several years, owing to the possibility of creating entangled photon pairs with high spectral correlation. (2) The HOM effect demonstrated the tendency of indistinguishable photon pairs to "bunch" in the same output port of a balanced beam splitter, de facto showing a signature of quantum interference. Finally, (3) the capability to realize any unitary operation in the space of the occupation modes led to the identification of interferometers as pivotal objects for quantum information protocols with linear optics.
Having recognized the importance of all these ingredients, linear optics aimed at large implementations in order to perform protocols with a concrete quantum advantage. Unfortunately, the methods used in bulk optics suffer from strong mechanical instabilities, which prevent a transition to large-size experiments. The need for both stability and scalability has led to the miniaturization of such bulk optical devices. Several techniques have been employed to reach this goal, such as lithographic processes and implementations on silica materials. All these approaches are significant in terms of stability and ease of manipulation, but they are still expensive in terms of cost and fabrication time and, moreover, they do not allow the third dimension to be exploited for more complex platforms. A powerful approach for transferring linear optical elements onto an integrated photonic platform able to overcome these limitations is femtosecond laser micromachining (FLM). FLM, developed over the last two decades, exploits nonlinear absorption of focused femtosecond pulses in a medium to write arbitrary 3D structures inside an optical substrate. Miniaturized beam splitters and phase shifters are realized by inducing a localized change in the refractive index of the medium. This technique allows complex 3D circuits to be written by moving the sample along the desired path at constant velocity, perpendicularly to the laser beam. 3D structures can also be made either polarization sensitive or insensitive, thanks to the low birefringence of the material used (borosilicate glass), enabling polarization-encoded qubits and polarization-entangled photons for quantum computation protocols \cite{linda1,linda2}. As a consequence, integrated photonics gives us a starting point for implementing quantum simulation processes in a very stable configuration. This feature could pave the way to larger-scale experiments, where a higher number of photons and optical elements are involved.
Recently, it has been suggested that many-particle bosonic interference can be used as a testing tool for the computational power of quantum computers and quantum simulators. Despite the important constraints that must be satisfied to build a universal quantum computer and perform quantum computation in linear optics, bosonic statistics finds a new, simpler and promising application in pinpointing the ingredients for a quantum advantage. In this context, an interesting model was recently introduced: the Boson Sampling problem. This model exploits the evolution of indistinguishable bosons through an optical interferometer described by a unitary transformation, and it consists of sampling from its output distribution. The core of this model is many-body boson interference: although measuring the outcomes seems easy to perform, simulating the output of this device is believed to be intrinsically hard classically in terms of physical resources and time, even approximately. For this reason Boson Sampling has captured the interest of the optical community, which has concentrated its efforts on realizing such platforms experimentally. The phenomenon can be interpreted as a generalization of the Hong-Ou-Mandel effect to an n-photon state interfering in an m-mode interferometer. In principle, if we are able to reach large dimensions (in n and m), this method can provide the first evidence of quantum over classical advantage and, moreover, it could open the way to implementations of quantum computation based on quantum interference.
Although the path seems promising, this approach has non-trivial drawbacks. First, (a) we need to reach large-scale implementations in order to observe quantum advantage, so how can we scale them up? There are two roads we can follow: (a1) to scale the number of modes with the techniques developed in integrated photonics, trying to find the best implementation for our interferometers in terms of robustness against losses, or (a2) to scale up the number of photons, identifying appropriate sources for this task. Second, (b) in order to perform quantum protocols we should be able to "trust" that genuine photon interference, the protagonist of the phenomenon, actually occurs; for large-scale implementations, simulating the physical behaviour by classical means quickly becomes intractable. Here the road we chose is (1) to identify the transformations that are optimal for discriminating true photon interference and (2) to use classification protocols such as machine learning techniques and statistical tools to extract information and correlations from the output data.
Following these premises, the main goal of this thesis is to address these problems along the paths just outlined. We first give an overview of the theoretical and experimental tools used and then present the analyses we have carried out. Regarding point (a1), we performed several analyses under broad and realistic conditions. We studied quantitatively the differences between the three known architectures to identify which scheme is the most appropriate for realizing unitary transformations in our interferometers, in terms of scalability and robustness to losses and noise, and we compared our results with recent developments in integrated photonics. Regarding point (a2), we studied different experimental realizations which seem promising for scaling up both the number of photons and the performance of the quantum device. First, we used multiple SPDC sources to improve the generation rate of single photons. Second, we analysed the performance of on-demand single-photon sources using a 3-mode integrated photonic circuit and quantum dots as deterministic single-photon sources; this investigation was carried out in collaboration with the Optics of Semiconductor nanoStructures Group (GOSS) led by Prof. Pascale Senellart in Laboratoire de Photonique et de Nanostructures (C2N).
Finally, we focused on problem (b), trying to answer the question of how to validate genuine multi-photon interference in an efficient way. Using optical chips built with FLM, we performed several experiments based on protocols suited to the problem. We carried out an analysis aimed at finding the optimal transformations for identifying genuine quantum interference, employing different figures of merit such as the total variation distance (TVD) and Bayesian tests to exclude alternative hypotheses on the experimental data. The result of these analyses is the identification of two unitaries belonging to the class of Hadamard matrices, namely the Fourier and Sylvester transformations. Thanks to the unique properties associated with the symmetries of these unitaries, we are able to formalize rules for identifying real photon interference, the so-called zero-transmission laws, by looking at specific outputs of the interferometers which are efficiently predictable. Subsequently, we investigate the validation problem from a different perspective, exploiting two roads: retrieving signatures of quantum interference through machine-learning classification techniques, and extracting information from the experimental data by means of statistical tools. These approaches are based on choosing training samples from the data, which are used as a reference to classify the whole set of output data according to its physical behaviour. In this way we are able to rule out alternative hypotheses not based on true quantum interference.
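To make the connection between the HOM effect and Boson Sampling concrete, here is a minimal sketch using the standard permanent formula for multi-photon output probabilities; this is illustrative code of our own, not the analysis software used in the thesis.

    import numpy as np
    from math import factorial
    from itertools import permutations

    def permanent(a):
        """Brute-force permanent; adequate for the small matrices used here."""
        n = a.shape[0]
        return sum(np.prod([a[i, p[i]] for i in range(n)]) for p in permutations(range(n)))

    def output_probability(u, in_occ, out_occ):
        """P(out|in) = |Perm(U_sub)|^2 / (prod(in!) * prod(out!)), where U_sub repeats
        column j in_occ[j] times and row i out_occ[i] times (standard Boson Sampling rule)."""
        cols = [j for j, t in enumerate(in_occ) for _ in range(t)]
        rows = [i for i, s in enumerate(out_occ) for _ in range(s)]
        sub = u[np.ix_(rows, cols)]
        norm = np.prod([factorial(k) for k in tuple(in_occ) + tuple(out_occ)])
        return abs(permanent(sub)) ** 2 / norm

    # Balanced beam splitter: the 2-mode "interferometer" of the HOM experiment.
    bs = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)

    for out in [(1, 1), (2, 0), (0, 2)]:
        print(out, round(output_probability(bs, (1, 1), out), 6))
    # The coincidence pattern (1, 1) has probability 0 (the photons bunch),
    # while (2, 0) and (0, 2) each occur with probability 1/2.

The same formula, applied to an n-photon input and an m-mode unitary such as the Fourier or Sylvester matrices mentioned above, gives the distribution a Boson Sampling device samples from; the classical hardness of the task stems from the cost of evaluating these permanents, while the suppressed ("zero-transmission") outputs of those highly symmetric unitaries remain efficiently predictable and can be used to certify genuine interference.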