
    Entanglement criteria for microscopic-macroscopic systems

    We discuss the conclusions that can be drawn from a recent experimental micro-macro entanglement test [F. De Martini, F. Sciarrino, and C. Vitelli, Phys. Rev. Lett. 100, 253601 (2008)]. The system under investigation is generated through optical parametric amplification of one photon belonging to an entangled pair. The adopted entanglement criterion makes it possible to infer the presence of entanglement before the losses that occur on the macrostate, under a specific assumption. In particular, a priori knowledge of the system that generates the micro-macro pair is necessary to exclude a class of separable states that can reproduce the obtained experimental results. Finally, we discuss the feasibility of a "genuine" micro-macro entanglement test on the analyzed system by considering different strategies, which show that in principle a fraction epsilon of the original entanglement, proportional to the number of photons that survive the lossy process, persists in any loss regime. Comment: 11 pages, 10 figures

    Certification of Gaussian Boson Sampling via graphs feature vectors and kernels

    Gaussian Boson Sampling (GBS) is a non-universal model of quantum computing inspired by the original formulation of the Boson Sampling (BS) problem. Nowadays, it represents a paradigmatic quantum platform for reaching the quantum advantage regime in a specific computational model. Indeed, thanks to implementations in photonics-based processors, the latest GBS experiments have reached a level of complexity at which the quantum apparatus solves the task faster than the best currently available classical strategies. In addition, recent studies have identified possible applications beyond the inherent sampling task. In particular, a direct connection has been established between the photon counting of a genuine GBS device and the number of perfect matchings in a graph. In this work, we propose to exploit this connection to benchmark GBS experiments. We interpret the properties of the feature vectors of the graph encoded in the device as a signature of correct sampling from the true input state. Within this framework, two approaches are presented. The first method exploits the distributions of graph feature vectors and classification via neural networks. The second approach investigates the distributions of graph kernels. Our results provide a novel approach to the pressing need for tailored algorithms to benchmark large-scale Gaussian Boson Samplers.
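
    As an illustration of the kind of sample-based benchmarking described above, the sketch below builds a coarse feature vector from photon-count samples and compares two sample sets with a Gaussian kernel. The feature (a histogram of total photon number), the toy data, and all names are illustrative assumptions, not the graph feature vectors or kernels actually used in the paper.

```python
import numpy as np

def feature_vector(samples, max_n=8):
    """Normalized histogram of total photon number per sample.

    `samples` is an (S, M) array of photon counts over M modes; this
    coarse feature is only an illustrative stand-in for the graph
    feature vectors discussed in the paper.
    """
    totals = samples.sum(axis=1)
    hist = np.bincount(totals, minlength=max_n + 1)[: max_n + 1]
    return hist / hist.sum()

def rbf_kernel(f1, f2, sigma=0.1):
    """Gaussian (RBF) kernel between two feature vectors."""
    return np.exp(-np.sum((f1 - f2) ** 2) / (2 * sigma ** 2))

# Toy comparison: samples from a "device" vs. a classical mock-up.
rng = np.random.default_rng(0)
device_samples = rng.poisson(0.3, size=(1000, 8))
mockup_samples = rng.poisson(0.5, size=(1000, 8))
print(rbf_kernel(feature_vector(device_samples), feature_vector(mockup_samples)))
```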

    Deep reinforcement learning for quantum multiparameter estimation

    Estimation of physical quantities is at the core of most scientific research, and the use of quantum devices promises to enhance its performance. In real scenarios, it is fundamental to consider that resources are limited, and Bayesian adaptive estimation represents a powerful approach to efficiently allocating all the available resources during the estimation process. However, this framework relies on precise knowledge of the system model, retrieved through a fine calibration that is often computationally and experimentally demanding. We introduce a model-free, deep-learning-based approach that efficiently implements realistic Bayesian quantum metrology tasks, accomplishing all the relevant challenges without relying on any a priori knowledge of the system. To overcome this need, a neural network is trained directly on experimental data to learn the multiparameter Bayesian update. Then the system is set at its optimal working point through feedback provided by a reinforcement learning algorithm trained to reconstruct and enhance the experiment heuristics of the investigated quantum sensor. Notably, we experimentally demonstrate higher estimation performance than standard methods, showing the strength of the combination of these two black-box algorithms on an integrated photonic circuit. Our work represents an important step toward fully artificial-intelligence-based quantum metrology.
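
    The following sketch only illustrates the general structure of a learned Bayesian update of the kind described above: a small neural network maps a posterior summary, the latest measurement outcome, and the current control setting to an updated posterior summary. The architecture, input encoding, sizes, and names are hypothetical placeholders, not the network used in the experiment.

```python
import torch
import torch.nn as nn

class BayesianUpdateNet(nn.Module):
    """Hypothetical update network: (posterior summary, outcome, control)
    -> updated posterior summary (means and variances per parameter)."""

    def __init__(self, n_params=3, hidden=64):
        super().__init__()
        in_dim = 2 * n_params + 2        # means, variances, outcome, control
        out_dim = 2 * n_params           # updated means and variances
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, posterior, outcome, control):
        x = torch.cat([posterior, outcome, control], dim=-1)
        return self.net(x)

# One (untrained) update step on dummy data, batch of 1.
net = BayesianUpdateNet()
posterior = torch.zeros(1, 6)            # flat prior summary
outcome = torch.tensor([[1.0]])          # e.g. a detector click
control = torch.tensor([[0.3]])          # e.g. a phase-shifter setting
print(net(posterior, outcome, control).shape)   # torch.Size([1, 6])
```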

    Quantifying n-Photon Indistinguishability with a Cyclic Integrated Interferometer

    We report on a universal method to measure the genuine indistinguishability of n photons, a crucial parameter that determines the accuracy of optical quantum computing. Our approach relies on a low-depth cyclic multiport interferometer with N=2n modes, leading to a quantum interference fringe whose visibility is a direct measurement of the genuine n-photon indistinguishability. We experimentally demonstrate this technique for an eight-mode integrated interferometer fabricated using femtosecond laser micromachining and four photons from a quantum dot single-photon source. We measure a four-photon indistinguishability of up to 0.81 ± 0.03. This value decreases as we intentionally alter the pairwise photon indistinguishability. The low-depth and low-loss multiport interferometer design provides an original path to evaluating the genuine indistinguishability of resource states of increasing photon number.
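
    As a minimal illustration of reading a value off an interference fringe, the sketch below estimates fringe visibility from coincidence counts versus phase by fitting a sinusoid with linear least squares. It is a generic visibility estimator applied to assumed toy data, not the analysis pipeline of the paper.

```python
import numpy as np

def fringe_visibility(phases, counts):
    """Estimate fringe visibility from coincidence counts.

    Fits counts(phi) ~ a + b*cos(phi) + c*sin(phi) by linear least
    squares and returns sqrt(b^2 + c^2) / a, i.e. the fringe contrast.
    """
    A = np.column_stack([np.ones_like(phases), np.cos(phases), np.sin(phases)])
    (a, b, c), *_ = np.linalg.lstsq(A, counts, rcond=None)
    return np.hypot(b, c) / a

# Toy data: an 81%-visibility fringe with Poissonian noise.
rng = np.random.default_rng(1)
phi = np.linspace(0, 2 * np.pi, 40)
counts = rng.poisson(200 * (1 + 0.81 * np.cos(phi)))
print(round(fringe_visibility(phi, counts), 2))
```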

    Differential cross section measurements for the production of a W boson in association with jets in proton–proton collisions at √s = 7 TeV

    Measurements are reported of differential cross sections for the production of a W boson, which decays into a muon and a neutrino, in association with jets, as a function of several variables, including the transverse momenta (pT) and pseudorapidities of the four leading jets, the scalar sum of jet transverse momenta (HT), and the difference in azimuthal angle between the directions of each jet and the muon. The data sample of pp collisions at a centre-of-mass energy of 7 TeV was collected with the CMS detector at the LHC and corresponds to an integrated luminosity of 5.0 fb⁻¹. The measured cross sections are compared to predictions from Monte Carlo generators, MadGraph + pythia and sherpa, and to next-to-leading-order calculations from BlackHat + sherpa. The differential cross sections are found to be in agreement with the predictions, apart from the pT distributions of the leading jets at high pT values, the HT distributions at high HT and low jet multiplicity, and the distribution of the difference in azimuthal angle between the leading jet and the muon at low values. Funding: United States Department of Energy; National Science Foundation (U.S.); Alfred P. Sloan Foundation.
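
    For readers unfamiliar with the observables named above, the sketch below shows how HT (the scalar sum of jet pT) and the azimuthal separation between each jet and the muon can be computed from reconstructed kinematics. The input values are illustrative; this is a toy, not CMS analysis code.

```python
import numpy as np

def delta_phi(phi1, phi2):
    """Azimuthal separation wrapped into [0, pi]."""
    dphi = np.abs(phi1 - phi2)
    return np.where(dphi > np.pi, 2 * np.pi - dphi, dphi)

def event_observables(jet_pt, jet_phi, muon_phi):
    """Return HT (scalar sum of jet pT) and per-jet |Delta phi(jet, muon)|."""
    ht = np.sum(jet_pt)
    dphis = delta_phi(np.asarray(jet_phi), muon_phi)
    return ht, dphis

# Example event with four leading jets (pT in GeV).
ht, dphis = event_observables([120.0, 80.0, 45.0, 30.0],
                              [0.2, 2.9, -1.4, 1.0], muon_phi=-2.8)
print(ht, dphis)
```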

    Risk Portfolio Optimization Using the Markowitz MVO Model in Relation to Human Limitations in Predicting the Future, from the Perspective of the Al-Qur'an

    Risk portfolio management in modern finance has become increasingly technical, requiring the use of sophisticated mathematical tools in both research and practice. Since companies cannot insure themselves completely against risk, given the human inability to predict the future precisely, as stated in Al-Quran surah Luqman verse 34, they have to manage it to obtain an optimal portfolio. The objective here is to minimize the variance among all portfolios that achieve at least a certain expected return or, alternatively, to maximize the expected return among all portfolios whose variance does not exceed a given bound. Furthermore, this study focuses on risk-portfolio optimization using the so-called Markowitz MVO (Mean-Variance Optimization) model. The theoretical frameworks used for the analysis are the arithmetic mean, geometric mean, variance, covariance, linear programming, and quadratic programming. Moreover, finding a minimum-variance portfolio leads to a convex quadratic program: minimize the objective function x^T Q x subject to the constraints mu^T x >= r and Ax = b. The outcome of this research is the optimal risk portfolio over a set of investments, computed with the MATLAB R2007b software together with its graphical analysis.
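
    As an illustrative counterpart to the MATLAB solution mentioned above, the sketch below solves the same kind of mean-variance problem (minimize portfolio variance subject to a target expected return and full investment) with a generic numerical optimizer. The toy expected returns, covariance matrix, and the no-short-selling bound are assumptions, not data from the study.

```python
import numpy as np
from scipy.optimize import minimize

def markowitz_min_variance(mu, sigma, target_return):
    """Minimum-variance portfolio with a target expected return.

    Solves  min_x  x^T Sigma x
            s.t.   mu^T x >= target_return,  sum(x) = 1,  x >= 0.
    """
    n = len(mu)
    constraints = [
        {"type": "ineq", "fun": lambda x: mu @ x - target_return},
        {"type": "eq", "fun": lambda x: np.sum(x) - 1.0},
    ]
    result = minimize(lambda x: x @ sigma @ x,
                      x0=np.full(n, 1.0 / n),
                      bounds=[(0.0, 1.0)] * n,
                      constraints=constraints,
                      method="SLSQP")
    return result.x

# Toy data: expected returns and covariance matrix of three assets.
mu = np.array([0.08, 0.12, 0.15])
sigma = np.array([[0.10, 0.02, 0.01],
                  [0.02, 0.15, 0.03],
                  [0.01, 0.03, 0.25]])
print(markowitz_min_variance(mu, sigma, target_return=0.10))
```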

    Juxtaposing BTE and ATE – on the role of the European insurance industry in funding civil litigation

    One of the ways in which legal services are financed, and indeed shaped, is through private insurance arrangements. Two contrasting types of legal expenses insurance (LEI) contracts seem to dominate in Europe: before-the-event (BTE) and after-the-event (ATE) legal expenses insurance. Notwithstanding institutional differences between legal systems, BTE and ATE insurance arrangements may be instrumental if government policy is geared towards strengthening a market-oriented system of financing access to justice for individuals and businesses. At the same time, emphasizing the role of a private industry as a keeper of the gates to justice raises issues of accountability and transparency that are not readily reconcilable with the demands of competition. Moreover, multiple actors (clients, lawyers, courts, insurers) are involved, causing behavioural dynamics that are not easily predicted or influenced. Against this background, this paper looks into BTE and ATE arrangements by analysing the particularities of the BTE and ATE arrangements currently available in some European jurisdictions and by painting a picture of their respective markets and legal contexts. This allows for some reflection on the performance of BTE and ATE providers as both financiers and gatekeepers. Two issues emerge from the analysis that are worthy of further reflection. Firstly, there is the problematic long-term sustainability of some ATE products. Secondly, there are the challenges faced by policymakers who would like to nudge consumers into voluntarily taking out BTE LEI.

    Financial Performance Assessment of Cooperatives in Pelalawan Regency

    This paper describes the development and financial performance of cooperatives in Pelalawan Regency during 2007-2008. The study covers primary and secondary cooperatives in 12 sub-districts. The method measures cooperative performance in terms of productivity, efficiency, growth, liquidity, and solvency. The productivity of cooperatives in Pelalawan was high, but their efficiency was still low. Profit and income were high, liquidity was very high, and solvency was good.

    Search for stop and higgsino production using diphoton Higgs boson decays

    Results are presented of a search for a "natural" supersymmetry scenario with gauge-mediated supersymmetry breaking. It is assumed that only the supersymmetric partners of the top quark (the stop) and of the Higgs boson (the higgsino) are accessible. Events are examined in which there are two photons forming a Higgs boson candidate and at least two b-quark jets. In 19.7 inverse femtobarns of proton-proton collision data at sqrt(s) = 8 TeV, recorded with the CMS experiment, no evidence of a signal is found, and lower limits at the 95% confidence level are set, excluding stop masses below 360 to 410 GeV, depending on the higgsino mass.
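
    As a toy illustration of how a diphoton Higgs boson candidate is formed, the sketch below computes the invariant mass of two photons from their transverse momenta, pseudorapidities, and azimuthal angles, and applies an illustrative mass window. The window and example values are assumptions; this is not the selection used in the analysis.

```python
import math

def diphoton_mass(pt1, eta1, phi1, pt2, eta2, phi2):
    """Invariant mass of two massless photons:
    m = sqrt(2 * pT1 * pT2 * (cosh(deta) - cos(dphi)))."""
    return math.sqrt(2.0 * pt1 * pt2 *
                     (math.cosh(eta1 - eta2) - math.cos(phi1 - phi2)))

def is_higgs_candidate(m_gg, window=(120.0, 130.0)):
    """Keep diphoton pairs inside an illustrative mass window (GeV)."""
    return window[0] <= m_gg <= window[1]

# Example photon pair (pT in GeV).
m = diphoton_mass(70.0, 0.4, 1.2, 55.0, -0.3, -1.9)
print(round(m, 1), is_higgs_candidate(m))
```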