
    Dynamic Acoustic Control of Semiconductor Quantum Dot-Based Devices for Quantum Light Generation

    This thesis presents work on a series of devices for the generation of photonic quantum states based on self-assembled InAs quantum dots, which are among the most technologically mature candidates for practical quantum photonic applications due to their high internal quantum efficiency, narrow linewidth, tunability and straightforward integration with photonic and electric components. The primary results concern sources of multi-photon entangled states and single-photon sources with high repetition rate, both of which are crucial components for emerging photonic quantum technologies. First, we propose a scheme for the sequential generation of entangled photon chains by resonant scattering of a laser field on a single charged particle in a cavity-enhanced quantum dot. The charge has an associated spin that can determine the time bin of a photon, allowing information to be encoded in this degree of freedom. We demonstrate coherent operations on this spin and realize a proof-of-principle experiment of the proposed scheme by showing that the time bin of a single photon depends on the measured state of the trapped spin. The second main avenue of work investigates the effects of a surface acoustic wave, a mechanical displacement wave confined to the surface of a substrate, on the optical properties of quantum dots. In particular, we exploit the dynamic acoustically induced tuning of the emission energy to modulate the Purcell effect in a pillar microcavity. Under resonant optical excitation we demonstrate the conversion of the continuous-wave laser into a pulsed single-photon stream that inherits the 1 GHz acoustic frequency as its repetition rate. High-resolution spectroscopy reveals narrow sidebands in the emission spectrum, whose relative intensity can be controlled by the acoustic power and the laser detuning.
Furthermore, we develop a platform for analogous in-plane experiments by transferring GaAs membranes hosting quantum dots onto LiNbO3 substrates and patterning them into whispering-gallery-mode optical resonators. In addition to Purcell enhancement and acoustic tuning of the emission, the devices exhibit strong localized mechanical resonances. Finally, we perform initial experiments on the effects of a surface acoustic wave on the spin of a charge trapped in a quantum dot. We integrate acoustic transducers with charge-tunable diodes, where the charge state of the dot can be precisely controlled by an applied bias voltage, and demonstrate the frustration of optical spin pumping by the acoustic wave. This project has received funding from the European Union's Horizon 2020 research and innovation programme under the Marie Sklodowska-Curie grant agreement No 642688.

    Annual report / IFW, Leibniz-Institut für Festkörper- und Werkstoffforschung Dresden


    Understanding Quantum Technologies 2022

    Understanding Quantum Technologies 2022 is a creative-commons ebook that provides a unique 360-degree overview of quantum technologies, from science and technology to geopolitical and societal issues. It covers quantum physics history, quantum physics 101, gate-based quantum computing, quantum computing engineering (including quantum error correction and quantum computing energetics), quantum computing hardware (all qubit types, including quantum annealing and quantum simulation paradigms, history, science, research, implementation and vendors), quantum enabling technologies (cryogenics, control electronics, photonics, component fabs, raw materials), quantum computing algorithms, software development tools and use cases, unconventional computing (potential alternatives to quantum and classical computing), quantum telecommunications and cryptography, quantum sensing, quantum technologies around the world, the societal impact of quantum technologies and even quantum fake sciences. The main audience comprises computer science engineers, developers and IT specialists, as well as quantum scientists and students who want to acquire a global view of how quantum technologies work, particularly quantum computing. This version is an extensive update to the 2021 edition published in October 2021. Comment: 1132 pages, 920 figures, Letter format

    Single-photon detectors integrated in quantum photonic circuits

    Toward photonic circuits for quantum computers

    Certification of many-body bosonic interference in 3D photonic chips

    Quantum information and quantum optics have reached several milestones during the last two decades. Starting from the 1980s, when Feynman and others laid the foundations of quantum computation and information, recent years have seen significant progress in both theoretical and experimental aspects. A series of quantum algorithms has been proposed that promise a computational speed-up with respect to their classical counterparts. If fully exploited, quantum computers are expected to markedly outperform classical ones in several specific tasks. More generally, quantum computers would change the paradigm of what we currently consider efficiently computable, being based on a completely different way of encoding and processing data, one that relies on uniquely quantum-mechanical properties such as linear superposition and entanglement. The building block of quantum computation is the qubit, which incorporates in its definition the revolutionary aspects that would enable overcoming classical computation in terms of efficiency and security. Recent technological developments have claimed realizations of devices with hundreds of controllable qubits, provoking an important debate about what exactly constitutes a quantum computing process and how to unambiguously recognize the presence of a quantum speed-up. Indeed, the question of what exactly makes a quantum computer faster than a classical one currently has no clear answer. Applications could range from cryptography, with a significant enhancement in terms of security, to communication and the simulation of quantum systems. In the latter case in particular, Feynman showed that some problems in quantum mechanics are intractable by purely classical approaches, due to the exponential growth of the dimension of the Hilbert space.
The question of where quantum capabilities yield a significant computational advantage is still open, and the difficulty of answering it has driven the scientific community to focus its efforts on developing these kinds of systems. As a consequence, significant progress has been made with trapped ions, superconducting circuits, neutral atoms and linear optics, permitting the first implementations of such devices. Among the schemes introduced, the linear-optical approach uses photons to encode information and is believed to be promising for most tasks. For instance, photons are important for quantum communication and cryptography protocols because of their natural tendency to behave as "flying" qubits. Moreover, photons with identical properties (energy, polarization, spatial and temporal profiles) are indistinguishable and can interfere with each other due to their bosonic nature. These features have a direct application in performing quantum protocols: they are exploited in several recent schemes, such as graph- and cluster-state photonic quantum computation. In particular, it has been proved that universal quantum computation is possible using only simple optical elements, single-photon sources, number-resolving photodetectors and adaptive measurements, thus confirming the pivotal importance of these particles. Although the importance of linear optics has been confirmed over the last decades, its potential was anticipated years before, when (1) Burnham et al. discovered Spontaneous Parametric Down-Conversion (SPDC), (2) Hong, Ou and Mandel discovered the namesake (HOM) effect, and (3) Reck et al. showed how a particular combination of simple optical elements can reproduce any unitary transformation.
(1) SPDC consists of the generation of entangled photon pairs in a nonlinear crystal pumped by a strong laser; despite recent advances in other approaches, it has been the keystone of single-photon generation for many years, thanks to the possibility of creating entangled photon pairs with high spectral correlation. (2) The HOM effect demonstrated the tendency of indistinguishable photon pairs to "bunch" into the same output port of a balanced beam splitter, de facto providing a signature of quantum interference. Finally, (3) the capability to realize any unitary operation on the space of occupation modes led to the identification of interferometers as pivotal objects for quantum information protocols with linear optics. Once the importance of all these ingredients was recognized, the linear-optics community aimed at large implementations in order to perform protocols with a concrete quantum advantage. Unfortunately, bulk-optics methods suffer from strong mechanical instabilities, which prevent a transition to large-size experiments. The need for both stability and scalability has led to the miniaturization of such bulk optical devices. Several techniques have been employed to reach this goal, such as lithographic processes and implementations on silica materials. All these approaches are significant in terms of stability and ease of manipulation, but they remain expensive in cost and fabrication time and, moreover, they do not allow exploiting the third dimension to realize more complex platforms. A powerful approach for transferring linear optical elements onto an integrated photonic platform able to overcome these limitations is femtosecond laser micromachining (FLM). FLM, developed over the last two decades, exploits nonlinear absorption of focused femtosecond pulses in a medium to write arbitrary 3D structures inside an optical substrate.
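The HOM bunching in (2) can be checked numerically: for indistinguishable photons, the amplitude of an output configuration is the permanent of a submatrix of the interferometer unitary, while distinguishable particles evolve according to the permanent of the element-wise squared moduli. A minimal sketch for a lossless 50:50 beam splitter (the permanent helper and matrix convention are an illustration, not taken from the thesis):

```python
import numpy as np
from itertools import permutations

def permanent(M):
    """Permanent via explicit sum over permutations (fine for tiny matrices)."""
    n = M.shape[0]
    return sum(np.prod([M[i, p[i]] for i in range(n)]) for p in permutations(range(n)))

# Lossless 50:50 beam splitter
U = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)

# One photon in each input mode; a coincidence output (one photon per output
# port) corresponds to the submatrix that is U itself.
p_boson = abs(permanent(U)) ** 2     # indistinguishable photons: ~0 (HOM dip)
p_dist = permanent(np.abs(U) ** 2)   # distinguishable particles: ~0.5

print(p_boson, p_dist)
```

The vanishing coincidence probability for bosons is exactly the "bunching" signature described above.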
Miniaturized beam splitters and phase shifters are then realized by inducing a localized change in the refractive index of the medium. This technique allows writing complex 3D circuits by moving the sample along the desired path at constant velocity, perpendicular to the laser beam. The structures can be made either polarization sensitive or insensitive and, thanks to the low birefringence of the material used (borosilicate glass), polarization-encoded qubits and polarization-entangled photons can be employed to realize quantum computation protocols \cite{linda1,linda2}. As a consequence, integrated photonics gives us a starting point to implement quantum simulation processes in a very stable configuration. This feature could pave the way to larger-scale experiments, involving higher numbers of photons and optical elements. Recently, it has been suggested that many-particle bosonic interference can be used as a testing tool for the computational power of quantum computers and quantum simulators. Despite the demanding constraints that must be satisfied to build a universal quantum computer and perform quantum computation in linear optics, bosonic statistics finds a promising, simpler application in pinpointing the ingredients for a quantum advantage. In this context, an interesting model was recently introduced: the Boson Sampling problem. This model considers the evolution of indistinguishable bosons through an optical interferometer described by a unitary transformation, and it consists of sampling from the resulting output distribution. The core of this model is many-body boson interference: although measuring the outcomes is easy to perform, simulating the output of the device is believed to be intrinsically hard classically in terms of physical resources and time, even approximately.
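The sampling task can be made concrete in a few lines: for a collision-free input, the probability of an output configuration is |Perm(U_S)|^2 divided by the factorials of the output occupations, and these probabilities sum to one over all configurations. A small numerical sketch (the mode count, seed and random-unitary construction are illustrative choices, not the thesis's setup):

```python
import numpy as np
from itertools import permutations, combinations_with_replacement
from math import factorial
from collections import Counter

def permanent(M):
    """Permanent via explicit sum over permutations (fine for tiny matrices)."""
    n = M.shape[0]
    return sum(np.prod([M[i, p[i]] for i in range(n)]) for p in permutations(range(n)))

rng = np.random.default_rng(0)
m, inp = 4, (0, 1)                      # 4 modes, one photon each in modes 0 and 1
A = rng.normal(size=(m, m)) + 1j * rng.normal(size=(m, m))
U, _ = np.linalg.qr(A)                  # random unitary via QR of a complex Gaussian

probs = {}
for out in combinations_with_replacement(range(m), len(inp)):
    M = U[np.ix_(inp, out)]             # submatrix: input rows, output columns
    norm = np.prod([factorial(c) for c in Counter(out).values()])
    probs[out] = abs(permanent(M)) ** 2 / norm

print(round(sum(probs.values()), 6))    # output distribution is normalized
```

The explicit permanent evaluation is what becomes exponentially costly as the photon number grows, which is the hardness the text refers to.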
For this reason, Boson Sampling captured the interest of the optics community, which concentrated its efforts on realizing such platforms experimentally. The phenomenon can be interpreted as a generalization of the Hong-Ou-Mandel effect to an n-photon state interfering in an m-mode interferometer. In principle, if large dimensions (in n and m) can be reached, this method could provide the first evidence of quantum over classical advantage and, moreover, open the way to implementations of quantum computation based on quantum interference. Although the path seems promising, this approach has non-trivial drawbacks. First, (a) we need to reach large-scale implementations in order to observe a quantum advantage, so how can we scale them up? There are two roads we can follow: (a1) scale the number of modes with the techniques developed in integrated photonics, identifying the interferometer implementation that is most robust against losses, or (a2) scale the number of photons, identifying appropriate sources for this task. Second, (b) in order to perform quantum protocols we should be able to "trust" that genuine multi-photon interference, the protagonist of the phenomenon, actually occurs. For large-scale implementations, simulating the physical behaviour by classical means quickly becomes intractable. Here the road we chose is (1) to identify the transformations that are optimal for discriminating true photon interference and (2) to use classification protocols such as machine learning techniques and statistical tools to extract information and correlations from the output data. Following these premises, the main goal of this thesis is to address these problems along the suggested paths.
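The interferometer architectures of point (a1) are meshes of two-mode elements: each element is a 2x2 beam-splitter/phase-shifter block embedded in the identity, and cascading such blocks keeps the overall transfer matrix unitary. A minimal sketch with a hypothetical parametrization (theta for the splitting ratio, phi for the phase; the specific angles are arbitrary):

```python
import numpy as np

def two_mode_block(m, i, theta, phi):
    """Beam splitter + phase shifter acting on modes (i, i+1) of an m-mode
    circuit, embedded in the identity. The parametrization is illustrative."""
    T = np.eye(m, dtype=complex)
    T[i, i] = np.exp(1j * phi) * np.cos(theta)
    T[i, i + 1] = -np.sin(theta)
    T[i + 1, i] = np.exp(1j * phi) * np.sin(theta)
    T[i + 1, i + 1] = np.cos(theta)
    return T

# Cascade a few blocks on 3 modes, as in a triangular mesh
U = (two_mode_block(3, 1, 0.3, 0.7)
     @ two_mode_block(3, 0, 1.1, 0.2)
     @ two_mode_block(3, 1, 0.5, 1.9))

print(np.allclose(U @ U.conj().T, np.eye(3)))  # the composition stays unitary
```

With enough such blocks arranged in a mesh, any m-mode unitary can be reached, which is why the choice of mesh layout matters for loss robustness.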
Firstly, we give an overview of the theoretical and experimental tools used and, secondly, we present the analyses that we have carried out. Regarding point (a1), we performed several analyses under broad and realistic conditions. We studied quantitatively the differences between the three known architectures to identify which scheme is most appropriate for realizing unitary transformations in our interferometers, in terms of scalability and robustness to losses and noise, comparing our results with recent developments in integrated photonics. Regarding point (a2), we studied different experimental realizations that seem promising for scaling up both the number of photons and the performance of the quantum device. First, we used multiple SPDC sources to improve the generation rate of single photons. Second, we analyzed the performance of on-demand single-photon sources using a 3-mode integrated photonic circuit and quantum dots as deterministic single-photon sources. This investigation was carried out in collaboration with the Optic of Semiconductor nanoStructures Group (GOSS) led by Prof. Pascale Senellart at the Laboratoire de Photonique et de Nanostructures (C2N). Finally, we focused on problem (b), trying to answer the question of how to validate genuine multi-photon interference in an efficient way. Using optical chips built with FLM, we performed several experiments based on protocols suited to the problem. We analyzed how to find the optimal transformations for identifying genuine quantum interference, employing figures of merit such as the Total Variation Distance (TVD) and Bayesian tests to exclude alternative hypotheses on the experimental data. The result of these analyses is the identification of two unitaries belonging to the class of Hadamard matrices, namely the Fourier and Sylvester transformations.
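The Total Variation Distance used as a figure of merit is simply half the L1 distance between two output distributions: it is 0 for identical distributions and 1 for disjoint ones. A minimal sketch with made-up frequencies (the example numbers are hypothetical, not experimental data):

```python
def total_variation_distance(p, q):
    """TVD between two discrete distributions over the same set of outcomes."""
    assert abs(sum(p) - 1) < 1e-9 and abs(sum(q) - 1) < 1e-9
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

ideal = [0.25, 0.25, 0.25, 0.25]      # e.g. a uniform model of the output
observed = [0.40, 0.30, 0.20, 0.10]   # hypothetical measured frequencies

tvd = total_variation_distance(ideal, observed)
print(tvd)  # ~0.2
```

A small TVD between measured frequencies and the predicted distribution supports the targeted hypothesis; a large one flags an alternative model.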
Thanks to the unique properties associated with the symmetries of these unitaries, we are able to formalize rules that identify real photon interference, the so-called zero-transmission laws, by looking at specific outputs of the interferometers that are efficiently predictable. Subsequently, we investigate the validation problem from a different perspective, exploiting two roads: retrieving signatures of quantum interference through machine learning classification techniques, and extracting information from the experimental data by means of statistical tools. These approaches are based on choosing training samples from the data, which are used as a reference to classify the whole set of output data according to its physical behaviour. In this way we are able to rule out alternative hypotheses not based on true quantum interference.
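For the Fourier interferometer the zero-transmission laws can be checked directly at small size: with U[j,k] = exp(2*pi*1j*j*k/n)/sqrt(n) (modes labelled 1..n) and one photon injected into every input mode, any output configuration whose mode labels do not sum to a multiple of n is fully suppressed. A brute-force numerical check for n = 3 (the indexing convention is one common choice, assumed here for illustration):

```python
import numpy as np
from itertools import permutations, combinations_with_replacement

def permanent(M):
    """Permanent via explicit sum over permutations (fine for tiny matrices)."""
    n = M.shape[0]
    return sum(np.prod([M[i, p[i]] for i in range(n)]) for p in permutations(range(n)))

n = 3
modes = np.arange(1, n + 1)  # 1-indexed mode labels
U = np.exp(2j * np.pi * np.outer(modes, modes) / n) / np.sqrt(n)

# Input: one photon in each of the n modes; scan every output configuration.
for out in combinations_with_replacement(range(n), n):
    amp = permanent(U[:, out])          # rows = inputs, columns = outputs
    labels_sum = sum(k + 1 for k in out)
    if labels_sum % n != 0:             # suppression-law prediction
        assert abs(amp) < 1e-9, (out, amp)
```

Checking these efficiently predictable zeros in measured data is far cheaper than simulating the full output distribution, which is what makes the laws useful as a validation tool.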

    Report / Institute für Physik

    The 2015 Report of the Physics Institutes of the Universität Leipzig presents an interesting overview of our research activities in the past year. It is also testimony to our scientific interaction with colleagues and partners worldwide.