21 research outputs found
Certification of many-body bosonic interference in 3D photonic chips
Quantum information and quantum optics have reached several milestones during the last two decades. Since the 1980s, when Feynman and others laid the foundations of quantum computation and information, significant progress has been made on both the theoretical and the experimental side. A series of quantum algorithms has been proposed that promise a computational speed-up with respect to their classical counterparts. If fully exploited, quantum computers are expected to markedly outperform classical ones in several specific tasks. More generally, quantum computers would change the paradigm of what we currently consider efficiently computable, being based on a completely different way to encode and process data, which relies on uniquely quantum-mechanical properties such as linear superposition and entanglement. The building block of quantum computation is the qubit, which incorporates in its definition the revolutionary aspects that would enable overcoming classical computation in terms of efficiency and security. Recent technological developments have led to claims of devices with hundreds of controllable qubits, provoking an important debate on what exactly constitutes a quantum computing process and how to unambiguously recognize the presence of a quantum speed-up. Indeed, the question of what exactly makes a quantum computer faster than a classical one currently has no clear answer. Its applications could range from cryptography, with a significant enhancement in terms of security, to communication and the simulation of quantum systems. In the latter case, Feynman showed that some problems in quantum mechanics are intractable by purely classical means, due to the exponential growth of the Hilbert-space dimension. The question of where quantum capabilities in computation become significant is thus still open, and the difficulty of answering it has led the scientific community to focus its efforts on developing these kinds of systems.
As a consequence, significant progress has been made with trapped ions, superconducting circuits, neutral atoms and linear optics, permitting the first implementations of such devices.
Among all the schemes introduced, the linear-optical approach uses photons to encode information and is believed to be promising for most tasks. For instance, photons are important for quantum communication and cryptography protocols because of their natural tendency to behave as "flying" qubits. Moreover, when they share identical properties (energy, polarization, spatial and temporal profiles), indistinguishable photons can interfere with each other due to their bosonic nature. These features have a direct application in the task of performing quantum protocols. In fact, they are suitable for several recent schemes, such as graph- and cluster-state photonic quantum computation. In particular, it has been proved that universal quantum computation is possible using only simple optical elements, single-photon sources, number-resolving photo-detectors and adaptive measurements, thus confirming the pivotal importance of these particles.
Although the importance of linear optics has been confirmed in the last decades, its potential was already anticipated years before, when (1) Burnham et al. discovered Spontaneous Parametric Down-Conversion (SPDC), (2) Hong, Ou and Mandel discovered the namesake effect (HOM) and (3) Reck et al. showed how a particular combination of simple optical elements can reproduce any unitary transformation. (1) SPDC consists in the generation of entangled photon pairs through a nonlinear crystal pumped with a strong laser; despite recent advancements in other approaches, it has been the keystone of single-photon generation for several years, due to the possibility of creating entangled photon pairs with high spectral correlation. (2) The HOM effect demonstrated the tendency of indistinguishable photon pairs to "bunch" in the same output port of a balanced beam splitter, de facto showing a signature of quantum interference. Finally, (3) the capability to realize any unitary operation in the space of the occupation modes led to the identification of interferometers as pivotal objects for quantum information protocols with linear optics.
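As an illustration of point (2), the following minimal numerical sketch (not from the thesis) computes the two-photon output statistics of a balanced beam splitter from matrix permanents, reproducing the HOM coincidence suppression for indistinguishable photons; the beam-splitter convention and all numbers are illustrative assumptions.

```python
import numpy as np
from itertools import permutations

def permanent(M):
    """Naive permanent via a sum over permutations (fine for small matrices)."""
    n = M.shape[0]
    return sum(np.prod([M[i, p[i]] for i in range(n)]) for p in permutations(range(n)))

# Balanced beam splitter (one common Hadamard-like convention, chosen for illustration).
U = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# Two indistinguishable photons, one per input port.
# The coincidence outcome (one photon per output) has amplitude perm(U).
p_coinc_quantum = abs(permanent(U))**2
# Distinguishable photons: permanent of the element-wise |U|^2.
p_coinc_classical = permanent(abs(U)**2)

print(p_coinc_quantum)    # ~0.0 -> photons bunch (HOM effect)
print(p_coinc_classical)  # 0.5 -> classical coincidence rate
```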
Once the importance of all these ingredients was recognized, linear optics aimed at large-scale implementations to perform protocols with a concrete quantum advantage.
Unfortunately, the methods based on bulk optics suffer from strong mechanical instabilities, which prevent a transition to large-size experiments. The need for both stability and scalability has led to the miniaturization of such bulk optical devices. Several techniques have been employed to reach this goal, such as lithographic processes and implementations on silica substrates. All these approaches are significant in terms of stability and ease of manipulation, but they remain expensive in terms of costs and fabrication time and, moreover, they do not allow exploiting the third dimension to realize more complex platforms. A powerful approach to transfer linear optical elements onto an integrated photonic platform able to overcome these limitations has been recognized in femtosecond laser micromachining (FLM). FLM, developed in the last two decades, exploits the nonlinear absorption of focused femtosecond pulses in a medium to design arbitrary 3D structures inside an optical substrate. Miniaturized beam splitters and phase shifters are then realized by inducing a localized change in the refractive index of the medium. This technique allows writing complex 3D circuits by moving the sample along the desired path at constant velocity, perpendicularly to the laser beam. 3D structures can also be realized either polarization sensitive or insensitive, thanks to the low birefringence of the material used (borosilicate glass), enabling polarization-encoded qubits and polarization-entangled photons for quantum computation protocols \cite{linda1,linda2}. As a consequence, integrated photonics gives us a starting point to implement quantum simulation processes in a very stable configuration. This feature could pave the way to larger-scale experiments, involving a higher number of photons and optical elements.
Recently, it has been suggested that many-particle bosonic interference can be used as a testing tool for the computational power of quantum computers and quantum simulators. Despite the important constraints that need to be satisfied to build a universal quantum computer and perform quantum computation in linear optics, bosonic statistics finds a new, simpler and promising application in pinpointing the ingredients for a quantum advantage. In this context, an interesting model was recently introduced: the Boson Sampling problem. This model exploits the evolution of indistinguishable bosons through an optical interferometer described by a unitary transformation, and it consists in sampling from its output distribution. The core of this model is many-body boson interference: although measuring the outcomes seems easy to perform, simulating the output of such a device is believed to be intrinsically hard classically, in terms of physical resources and time, even approximately. For this reason Boson Sampling captured the interest of the optical community, which concentrated its efforts on realizing this kind of platform experimentally. The phenomenon can be interpreted as a generalization of the Hong-Ou-Mandel effect to an n-photon state that interferes in an m-mode interferometer. In principle, if we are able to reach large dimensions (in n and m), this method can provide the first evidence of a quantum over classical advantage and, moreover, it could open the way to the implementation of quantum computation based on quantum interference. Although the path seems promising, this approach has non-trivial drawbacks. First, (a) we need to reach large-scale implementations in order to observe a quantum advantage, so how can we scale them up? There are two roads that we can follow: (a1) scale the number of modes with the techniques developed in integrated photonics, identifying the interferometer implementation that is most robust against losses, or (a2) scale the number of photons, identifying appropriate sources for this task. Second, (b) in order to perform quantum protocols we should be able to "trust" that genuine interference, the protagonist of the phenomenon, actually occurs. For large-scale implementations, simulating the physical behaviour by classical means quickly becomes intractable. In this case the road we chose is (1) to identify the transformations that are optimal in discriminating true photon interference and (2) to use classification protocols such as machine learning techniques and statistical tools to extract information and correlations from the output data.
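To make the sampling task concrete, here is a hedged sketch (not part of the original text) of how collision-free Boson Sampling probabilities are usually computed from permanents of n-by-n submatrices of the interferometer unitary; the use of scipy.stats.unitary_group for a Haar-random matrix and the chosen mode and photon numbers are assumptions for illustration only.

```python
import numpy as np
from itertools import permutations, combinations
from scipy.stats import unitary_group  # any Haar-random unitary generator would do

def permanent(M):
    """Naive permanent, adequate for few-photon examples."""
    n = M.shape[0]
    return sum(np.prod([M[i, p[i]] for i in range(n)]) for p in permutations(range(n)))

def output_probability(U, inputs, outputs):
    """Probability of a collision-free output pattern for indistinguishable photons:
    |Perm(U_sub)|^2, where U_sub collects the input columns and output rows of U."""
    sub = U[np.ix_(outputs, inputs)]
    return abs(permanent(sub))**2

m, n = 8, 3                      # modes, photons (illustrative sizes)
U = unitary_group.rvs(m)         # Haar-random interferometer
inputs = (0, 1, 2)               # photons injected in the first three modes

probs = {out: output_probability(U, inputs, out)
         for out in combinations(range(m), n)}
print(sum(probs.values()))       # < 1: the remaining weight sits in bunched outcomes
```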
Following these premises, the main goal of this thesis is to address these problems along the paths outlined above. Firstly, we will give an overview of the theoretical and experimental tools used and, secondly, we will present the analyses that we have carried out. Regarding point \textbf{(a1)}, we performed several analyses under broad and realistic conditions. We studied quantitatively the differences between the three known architectures, to identify which scheme is more appropriate for the realization of unitary transformations in our interferometers in terms of scalability and robustness to losses and noise. We also compared our results to recent developments in integrated photonics. Regarding point \textbf{(a2)}, we studied different experimental realizations which seem promising for scaling up both the number of photons and the performance of the quantum device. First, we used multiple SPDC sources to improve the generation rate of single photons. Second, we analysed the performance of on-demand single-photon sources using a 3-mode integrated photonic circuit and quantum dots as deterministic single-photon sources. This investigation has been carried out in collaboration with the Optic of Semiconductor nanoStructures Group (GOSS) led by Prof. Pascale Senellart at the Laboratoire de Photonique et de Nanostructures (C2N).
Finally, we focused on problem \textbf{(b)}, trying to answer the question of how to validate genuine multi-photon interference in an efficient way. Using optical chips built with FLM, we performed several experiments based on protocols suitable for this problem. We carried out an analysis aimed at finding the optimal transformations for identifying genuine quantum interference. For this purpose, we employed different figures of merit, such as the Total Variation Distance (TVD) and Bayesian tests, to exclude alternative hypotheses on the experimental data. The result of these analyses is the identification of two unitaries belonging to the class of Hadamard matrices, namely the Fourier and Sylvester transformations. Thanks to the unique properties associated with the symmetries of these unitaries, we are able to formulate rules to identify real photon interference, the so-called zero-transmission laws, by looking at specific outputs of the interferometers which are efficiently predictable. Subsequently, we further investigate the validation problem by looking at the target from a different perspective. We exploit two roads: retrieving signatures of quantum interference through machine learning classification techniques, and extracting information from the experimental data by means of statistical tools. These approaches are based on choosing training samples from the data, which are used as a reference to classify the whole set of output data according to its physical behaviour. In this way we are able to rule out alternative hypotheses not based on true quantum interference.
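As a rough illustration of the two figures of merit mentioned above, the following sketch (with invented toy distributions, not experimental data) computes a total variation distance and a simple Bayesian posterior for the "indistinguishable photons" hypothesis against a single alternative; the actual validation protocols are more elaborate.

```python
import numpy as np

def total_variation_distance(p, q):
    """TVD between two probability distributions over the same set of outcomes."""
    return 0.5 * np.sum(np.abs(np.asarray(p) - np.asarray(q)))

def bayesian_confidence(samples, p_quantum, p_classical, prior=0.5):
    """Posterior probability of the 'indistinguishable photons' hypothesis after
    observing a sequence of outcome indices (uniform prior between the two models)."""
    log_l_q = np.sum(np.log([p_quantum[s] for s in samples]))
    log_l_c = np.sum(np.log([p_classical[s] for s in samples]))
    ratio = np.exp(log_l_c - log_l_q) * (1 - prior) / prior  # log-space to avoid underflow
    return 1.0 / (1.0 + ratio)

# Toy example with three outcomes (illustrative numbers only).
p_q = np.array([0.7, 0.2, 0.1])   # hypothetical quantum-interference distribution
p_c = np.array([0.4, 0.3, 0.3])   # hypothetical distinguishable-photon distribution
samples = np.random.choice(3, size=50, p=p_q)

print(total_variation_distance(p_q, p_c))
print(bayesian_confidence(samples, p_q, p_c))  # approaches 1 as data accumulates
```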
Benchmarking integrated photonic architectures
Photonic platforms represent a promising technology for the realization of
several quantum communication protocols and for experiments of quantum
simulation. Moreover, large-scale integrated interferometers have recently
gained a relevant role for restricted models of quantum computing, specifically
with Boson Sampling devices. Indeed, various linear optical schemes have been
proposed for the implementation of unitary transformations, each one suitable
for a specific task. Nevertheless, a comprehensive analysis of the state of
the art under broad and realistic conditions is still lacking. In
the present work we address this gap, providing in a unified framework a
quantitative comparison of the three main photonic architectures, namely the
ones with triangular and square designs and the so-called fast transformations.
All layouts have been analyzed in presence of losses and imperfect control over
the reflectivities and phases of the inner structure. Our results represent a
further step ahead towards the implementation of quantum information protocols
on large-scale integrated photonic devices.
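For intuition only, here is a minimal sketch of how circuit depth translates into transmission for the three layouts; the quoted depths (2m-3 layers for the triangular design, m for the square design, log2 m for the fast transformations, which cover only a restricted family of unitaries) and the per-layer loss figure are assumptions commonly quoted in the literature, not values from this work.

```python
import numpy as np

def transmission(depth, eta_layer):
    """Overall transmission if every layer of two-mode elements has transmission eta_layer."""
    return eta_layer ** depth

m = 16        # number of optical modes (illustrative)
eta = 0.98    # assumed transmission per layer (illustrative)

depth_triangular = 2 * m - 3      # Reck-style triangular mesh (assumed depth)
depth_square = m                  # Clements-style rectangular mesh (assumed depth)
depth_fast = int(np.log2(m))      # FFT-like layout, restricted set of unitaries

for name, d in [("triangular", depth_triangular),
                ("square", depth_square),
                ("fast", depth_fast)]:
    print(f"{name:10s} depth={d:2d} transmission={transmission(d, eta):.3f}")
```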
Optimal photonic indistinguishability tests in multimode networks
Particle indistinguishability is at the heart of quantum statistics that
regulates fundamental phenomena such as the electronic band structure of
solids, Bose-Einstein condensation and superconductivity. Moreover, it is
necessary in practical applications such as linear optical quantum computation
and simulation, in particular for Boson Sampling devices. It is thus crucial to
develop tools to certify genuine multiphoton interference between multiple
sources. Here we show that so-called Sylvester interferometers are near-optimal
for the task of discriminating the behaviors of distinguishable and
indistinguishable photons. We report the first implementations of integrated
Sylvester interferometers with 4 and 8 modes with an efficient, scalable and
reliable 3D-architecture. We perform two-photon interference experiments
capable of identifying indistinguishable photon behaviour with a Bayesian
approach using very small data sets. Furthermore, we employ experimentally this
new device for the assessment of scattershot Boson Sampling. These results open
the way to the application of Sylvester interferometers for the optimal
assessment of multiphoton interference experiments.
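For reference, Sylvester interferometers implement (up to mode ordering and normalization) Sylvester-Hadamard matrices, which can be generated by the standard recursive doubling; a minimal sketch for the 4- and 8-mode cases follows, assuming this conventional construction.

```python
import numpy as np

def sylvester(m):
    """Unitary Sylvester(-Hadamard) matrix of size m (m must be a power of two)."""
    if m == 1:
        return np.array([[1.0]])
    s = sylvester(m // 2)
    return np.block([[s, s], [s, -s]]) / np.sqrt(2)

for m in (4, 8):
    S = sylvester(m)
    print(m, np.allclose(S @ S.conj().T, np.eye(m)))  # unitarity check
```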
Experimental generalized quantum suppression law in Sylvester interferometers
Photonic interference is a key quantum resource for optical quantum
computation, and in particular for so-called boson sampling machines. In
interferometers with certain symmetries, genuine multiphoton quantum
interference effectively suppresses certain sets of events, as in the original
Hong-Ou-Mandel effect. Recently, it was shown that some classical and
semi-classical models could be ruled out by identifying such suppressions in
Fourier interferometers. Here we propose a suppression law suitable for
random-input experiments in multimode Sylvester interferometers, and verify it
experimentally using 4- and 8-mode integrated interferometers. The observed
suppression is stronger than what is observed in Fourier interferometers of the
same size, and could be relevant to certification of boson sampling machines
and other experiments relying on bosonic interference.
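A brute-force numerical check of such suppression, assuming the conventional Sylvester-Hadamard construction: rather than stating the analytic suppression law, the sketch below simply enumerates which collision-free two-photon outputs have vanishing permanent for one illustrative input choice.

```python
import numpy as np
from itertools import permutations, combinations

def permanent(M):
    n = M.shape[0]
    return sum(np.prod([M[i, p[i]] for i in range(n)]) for p in permutations(range(n)))

def sylvester(m):
    if m == 1:
        return np.array([[1.0]])
    s = sylvester(m // 2)
    return np.block([[s, s], [s, -s]]) / np.sqrt(2)

m, inputs = 8, (0, 1)            # two photons in the first two modes (illustrative choice)
U = sylvester(m)

suppressed = [out for out in combinations(range(m), len(inputs))
              if abs(permanent(U[np.ix_(out, inputs)]))**2 < 1e-12]
print(len(suppressed), "of", len(list(combinations(range(m), len(inputs)))),
      "collision-free outputs are suppressed")
```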
Experimental Scattershot Boson Sampling
Boson Sampling is a computational task strongly believed to be hard for
classical computers, but efficiently solvable by orchestrated bosonic
interference in a specialised quantum computer. Current experimental schemes,
however, are still insufficient for a convincing demonstration of the advantage
of quantum over classical computation. A new variation of this task,
Scattershot Boson Sampling, leads to an exponential increase in speed of the
quantum device, using a larger number of photon sources based on parametric
downconversion. This is achieved by having multiple heralded single photons
being sent, shot by shot, into different random input ports of the
interferometer. Here we report the first Scattershot Boson Sampling
experiments, where six different photon-pair sources are coupled to integrated
photonic circuits. We employ recently proposed statistical tools to analyse our
experimental data, providing strong evidence that our photonic quantum
simulator works as expected. This approach represents an important leap toward
a convincing experimental demonstration of quantum computational supremacy.
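A back-of-the-envelope sketch of the scattershot rate enhancement, under the simplifying assumption of k identical heralded sources firing independently with pair probability p per pulse; the numbers are illustrative, not experimental parameters.

```python
import numpy as np
from math import comb

rng = np.random.default_rng(0)

k, n, p = 6, 3, 0.05      # sources, photons per sampling event, pair probability per pulse
pulses = 200_000

fires = rng.random((pulses, k)) < p                    # which heralds click on each pulse
scattershot_events = np.sum(fires.sum(axis=1) == n)    # any n of the k sources fire
fixed_input_events = np.sum(fires[:, :n].all(axis=1))  # n designated sources all fire

print(scattershot_events, fixed_input_events)
# Analytic ratio: choosing which n of the k sources fire gives a combinatorial boost.
print(comb(k, n) * (1 - p)**(k - n))                   # expected enhancement factor
```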
Experimental quantification of genuine four-photon indistinguishability
Photon indistinguishability plays a fundamental role in information
processing, with applications such as linear-optical quantum computation and
metrology. It is then necessary to develop appropriate tools to quantify the
amount of this resource in a multiparticle scenario. Here we report a
four-photon experiment in a linear-optical interferometer designed to
simultaneously estimate the degree of indistinguishability between three pairs
of photons. The interferometer design dispenses with the need of heralding for
parametric down-conversion sources, resulting in an efficient and reliable
optical scheme. We then use a recently proposed theoretical framework to
quantify genuine four-photon indistinguishability, as well as to obtain bounds
on three unmeasured two-photon overlaps. Our findings are in high agreement
with the theory, and represent a new resource-effective technique for the
characterization of multiphoton interference.
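For context, pairwise indistinguishability is commonly estimated from two-photon interference via the ideal relation between coincidence rates measured with and without temporal overlap of the photons; a minimal estimator is sketched below, with invented counts and ignoring higher-order emission and imperfect beam-splitter reflectivity.

```python
def overlap_from_hom(c_dist, c_indist):
    """Estimate the pairwise wavepacket overlap from coincidence counts measured with
    distinguishable (e.g. delayed) and indistinguishable photons on a balanced beam
    splitter, assuming the ideal relation C_indist = C_dist * (1 - overlap)."""
    return 1.0 - c_indist / c_dist

# Illustrative numbers only (not experimental data).
print(overlap_from_hom(c_dist=1000, c_indist=80))   # ~0.92
```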
Suppression law of quantum states in a 3D photonic fast Fourier transform chip
The identification of phenomena able to pinpoint quantum interference is attracting considerable interest. Indeed, a generalization of the Hong-Ou-Mandel effect valid for any number of photons and optical modes would represent an important leap ahead both from a fundamental perspective and for practical applications, such as the certification of photonic quantum devices, whose computational speedup is expected to depend critically on multi-particle interference. Distinctive quantum features have been predicted for many particles injected into multimode interferometers implementing the Fourier transform over the optical modes. Here we develop a scalable approach for the implementation of the fast Fourier transform algorithm using three-dimensional photonic integrated interferometers, fabricated via the femtosecond laser writing technique. We observe the suppression law for a large number of output states with four- and eight-mode optical circuits: the experimental results demonstrate genuine quantum interference between the injected photons, thus offering a powerful tool for the diagnostics of photonic platforms.
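The "fast" layout rests on the radix-2 Cooley-Tukey factorization of the unitary discrete Fourier transform into log2(m) layers of two-mode operations interleaved with mode permutations; the sketch below (a generic numerical check, not the fabricated circuit layout) verifies this factorization.

```python
import numpy as np

def dft(m):
    """Unitary discrete Fourier transform matrix on m modes."""
    j, k = np.meshgrid(np.arange(m), np.arange(m), indexing="ij")
    return np.exp(-2j * np.pi * j * k / m) / np.sqrt(m)

def fft_factorization(m):
    """Radix-2 Cooley-Tukey step: F_m = B_m (F_{m/2} (+) F_{m/2}) P_m, with P_m the
    even/odd mode sort and B_m one layer of two-mode 'butterfly' operations."""
    if m == 1:
        return np.array([[1.0 + 0j]])
    half = fft_factorization(m // 2)
    P = np.zeros((m, m))
    P[np.arange(m), np.r_[np.arange(0, m, 2), np.arange(1, m, 2)]] = 1  # even/odd sort
    D = np.diag(np.exp(-2j * np.pi * np.arange(m // 2) / m))
    B = np.block([[np.eye(m // 2), D], [np.eye(m // 2), -D]]) / np.sqrt(2)
    return B @ np.block([[half, np.zeros_like(half)],
                         [np.zeros_like(half), half]]) @ P

m = 8
print(np.allclose(fft_factorization(m), dft(m)))  # True: same unitary, log2(m) butterfly layers
```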
Witnessing genuine multiphoton indistinguishability
Bosonic interference is a fundamental physical phenomenon, and it is believed to lie at the heart of quantum computational advantage. It is thus necessary to develop practical tools to witness its presence, both for a reliable assessment of a quantum source and for fundamental investigations. Here we describe how linear interferometers can be used to unambiguously witness genuine n-boson indistinguishability. The amount of violation of the proposed witnesses bounds the degree of multiboson indistinguishability, for which we also provide a novel intuitive model using set theory. We experimentally implement this test to bound the degree of three-photon indistinguishability in states we prepare using parametric down-conversion. Our approach results in a convenient tool for practical photonic applications, and may inspire further fundamental advances based on the operational framework we adopt.