
    Approximating Likelihood Ratios with Calibrated Discriminative Classifiers

    In many fields of science, generalized likelihood ratio tests are established tools for statistical inference. At the same time, it has become increasingly common that a simulator (or generative model) is used to describe complex processes that tie parameters $\theta$ of an underlying theory and measurement apparatus to high-dimensional observations $\mathbf{x} \in \mathbb{R}^p$. However, simulators often do not provide a way to evaluate the likelihood function for a given observation $\mathbf{x}$, which motivates a new class of likelihood-free inference algorithms. In this paper, we show that likelihood ratios are invariant under a specific class of dimensionality reduction maps $\mathbb{R}^p \mapsto \mathbb{R}$. As a direct consequence, we show that discriminative classifiers can be used to approximate the generalized likelihood ratio statistic when only a generative model for the data is available. This leads to a new machine learning-based approach to likelihood-free inference that is complementary to Approximate Bayesian Computation, and which does not require a prior on the model parameters. Experimental results on artificial problems with known exact likelihoods illustrate the potential of the proposed method.
    Comment: 35 pages, 5 figures
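
    As a hedged illustration of the classifier-based approach described above (not the paper's own implementation), the sketch below trains a calibrated classifier to separate samples simulated at two parameter values and converts its calibrated output s(x) into a likelihood-ratio estimate s(x)/(1-s(x)); the toy Gaussian "simulator", parameter values, and scikit-learn components are illustrative assumptions.

        # Minimal sketch of the likelihood-ratio trick; the toy Gaussian simulator,
        # parameter values and calibration settings are assumptions for illustration.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.calibration import CalibratedClassifierCV

        rng = np.random.default_rng(0)

        def simulate(theta, n):
            """Toy stand-in for an intractable simulator: Gaussian with mean theta."""
            return rng.normal(loc=theta, scale=1.0, size=(n, 1))

        theta0, theta1, n = 0.0, 1.0, 50_000
        x0 = simulate(theta0, n)                       # samples from p(x | theta0)
        x1 = simulate(theta1, n)                       # samples from p(x | theta1)
        X = np.vstack([x0, x1])
        y = np.concatenate([np.zeros(n), np.ones(n)])

        # Calibrated discriminative classifier: s(x) approximates p(y = 1 | x)
        clf = CalibratedClassifierCV(LogisticRegression(), method="isotonic", cv=5)
        clf.fit(X, y)

        # With balanced classes, p(x | theta1) / p(x | theta0) is approximated by s / (1 - s)
        x_test = np.linspace(-3.0, 4.0, 8).reshape(-1, 1)
        s = clf.predict_proba(x_test)[:, 1]
        r_hat = s / (1.0 - s)

        # Exact ratio for this toy Gaussian case, for comparison
        r_true = np.exp(x_test[:, 0] * (theta1 - theta0) - 0.5 * (theta1**2 - theta0**2))
        print(np.column_stack([x_test[:, 0], r_hat, r_true]))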

    Testing quantum mechanics: a statistical approach

    As experiments continue to push the quantum-classical boundary using increasingly complex dynamical systems, the interpretation of experimental data becomes more and more challenging: when the observations are noisy, indirect, and limited, how can we be sure that we are observing quantum behavior? This tutorial highlights some of the difficulties in such experimental tests of quantum mechanics, using optomechanics as the central example, and discusses how the issues can be resolved using techniques from statistics and insights from quantum information theory.
    Comment: v1: 2 pages; v2: invited tutorial for Quantum Measurements and Quantum Metrology, substantial expansion of v1, 19 pages; v3: accepted; v4: corrected some errors, published
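
    A toy, hedged example of the statistical framing the tutorial advocates: compare the likelihood of a noisy measurement record under two candidate models and report the log-likelihood ratio. The "classical" and "quantum" noise models and their parameters below are invented placeholders, not taken from the tutorial.

        # Toy sketch (not from the tutorial): likelihood-ratio comparison of two
        # candidate noise models for a noisy measurement record.
        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(1)

        sigma_classical = 1.0        # spread predicted by a "classical" model (assumed)
        sigma_quantum = 1.3          # larger spread predicted by a "quantum" model (assumed)

        # Simulated experimental record, drawn here from the "quantum" model
        data = rng.normal(loc=0.0, scale=sigma_quantum, size=1_000)

        logL_classical = norm.logpdf(data, loc=0.0, scale=sigma_classical).sum()
        logL_quantum = norm.logpdf(data, loc=0.0, scale=sigma_quantum).sum()

        # Positive values favour the quantum model for this record
        print("log likelihood ratio (quantum - classical):", logL_quantum - logL_classical)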

    The Emergence of Gravitational Wave Science: 100 Years of Development of Mathematical Theory, Detectors, Numerical Algorithms, and Data Analysis Tools

    On September 14, 2015, the newly upgraded Laser Interferometer Gravitational-wave Observatory (LIGO) recorded a loud gravitational-wave (GW) signal, emitted a billion light-years away by a coalescing binary of two stellar-mass black holes. The detection was announced in February 2016, in time for the hundredth anniversary of Einstein's prediction of GWs within the theory of general relativity (GR). The signal represents the first direct detection of GWs, the first observation of a black-hole binary, and the first test of GR in its strong-field, high-velocity, nonlinear regime. In the remainder of its first observing run, LIGO observed two more signals from black-hole binaries, one moderately loud, another at the boundary of statistical significance. The detections mark the end of a decades-long quest, and the beginning of GW astronomy: finally, we are able to probe the unseen, electromagnetically dark Universe by listening to it. In this article, we present a short historical overview of GW science: this young discipline combines GR, arguably the crowning achievement of classical physics, with record-setting, ultra-low-noise laser interferometry, and with some of the most powerful developments in the theory of differential geometry, partial differential equations, high-performance computation, numerical analysis, signal processing, statistical inference, and data science. Our emphasis is on the synergy between these disciplines, and how mathematics, broadly understood, has historically played, and continues to play, a crucial role in the development of GW science. We focus on black holes, which are very pure mathematical solutions of Einstein's gravitational-field equations that are nevertheless realized in Nature, and that provided the first observed signals.
    Comment: 41 pages, 5 figures. To appear in Bulletin of the American Mathematical Society

    Maximal adaptive-decision speedups in quantum-state readout

    The average time $T$ required for high-fidelity readout of quantum states can be significantly reduced via a real-time adaptive decision rule. An adaptive decision rule stops the readout as soon as a desired level of confidence has been achieved, as opposed to setting a fixed readout time $t_f$. The performance of the adaptive decision is characterized by the "adaptive-decision speedup," $t_f/T$. In this work, we reformulate this readout problem in terms of the first-passage time of a particle undergoing stochastic motion. This formalism allows us to theoretically establish the maximum achievable adaptive-decision speedups for several physical two-state readout implementations. We show that for two common readout schemes (the Gaussian latching readout and a readout relying on state-dependent decay), the speedup is bounded by 4 and 2, respectively, in the limit of high single-shot readout fidelity. We experimentally study the achievable speedup in a real-world scenario by applying the adaptive decision rule to a readout of the nitrogen-vacancy-center (NV-center) charge state. We find a speedup of $\approx 2$ with our experimental parameters. In addition, we propose a simple readout scheme for which the speedup can, in principle, be increased without bound as the fidelity is increased. Our results should lead to immediate improvements in nanoscale magnetometry based on spin-to-charge conversion of the NV-center spin, and provide a theoretical framework for further optimization of the bandwidth of quantum measurements.
    Comment: 18 pages, 11 figures. This version is close to the published version
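
    A minimal Monte Carlo sketch of the adaptive rule described above, assuming a Gaussian latching readout in which the accumulated log-likelihood ratio is monitored and the readout stops at a fixed confidence threshold. The per-sample signal-to-noise ratio, threshold, and sample cap are illustrative assumptions, not the paper's values, so the printed speedup is only indicative.

        # Sketch of an adaptive stopping rule for a Gaussian latching readout.
        # All parameters are illustrative; "speedup" here uses the sample cap as t_f.
        import numpy as np

        rng = np.random.default_rng(2)

        snr_per_sample = 0.05        # mean shift per sample in units of the noise (assumed)
        threshold = np.log(99.0)     # stop once one state is 99x more likely (assumed)
        n_max = 10_000               # cap on samples, standing in for the fixed time t_f
        n_trials = 2_000

        stop_times = []
        for _ in range(n_trials):
            state = rng.integers(2)                      # true state: 0 or 1
            mean = snr_per_sample if state else -snr_per_sample
            llr = 0.0
            for t in range(1, n_max + 1):
                x = rng.normal(mean, 1.0)
                # log p(x|1) - log p(x|0) for unit-variance Gaussians with means +/- snr
                llr += 2.0 * snr_per_sample * x
                if abs(llr) >= threshold:
                    break
            stop_times.append(t)

        T_adaptive = np.mean(stop_times)
        print("mean adaptive readout time:", T_adaptive)
        print("speedup relative to the fixed-time cap:", n_max / T_adaptive)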

    Quantum Tomography

    This is the draft version of a review paper which is going to appear in "Advances in Imaging and Electron Physics".
    Comment: To appear in "Advances in Imaging and Electron Physics". Some figures with low resolution

    Quantum metrology with nonclassical states of atomic ensembles

    Quantum technologies exploit entanglement to revolutionize computing, measurements, and communications. This has stimulated the research in different areas of physics to engineer and manipulate fragile many-particle entangled states. Progress has been particularly rapid for atoms. Thanks to the large and tunable nonlinearities and the well-developed techniques for trapping, controlling and counting, many groundbreaking experiments have demonstrated the generation of entangled states of trapped ions, cold and ultracold gases of neutral atoms. Moreover, atoms can couple strongly to external forces and light fields, which makes them ideal for ultra-precise sensing and time keeping. All these factors call for generating non-classical atomic states designed for phase estimation in atomic clocks and atom interferometers, exploiting many-body entanglement to increase the sensitivity of precision measurements. The goal of this article is to review and illustrate the theory and the experiments with atomic ensembles that have demonstrated many-particle entanglement and quantum-enhanced metrology.
    Comment: 76 pages, 40 figures, 1 table, 603 references. Some figures bitmapped at 300 dpi to reduce file size
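
    As a hedged illustration of why entanglement matters here (a toy example, not from the review): with N uncorrelated atoms in a Ramsey sequence, quantum projection noise limits the phase uncertainty to the standard quantum limit Delta(phi) ~ 1/sqrt(N), which is the scaling that the nonclassical states discussed above are designed to beat. The sketch below checks that scaling numerically for independent atoms.

        # Toy phase estimation with N uncorrelated atoms: the spread of the
        # estimates follows the standard quantum limit 1/sqrt(N).
        import numpy as np

        rng = np.random.default_rng(3)
        phi_true = 0.3                      # phase to estimate (arbitrary test value)

        for n_atoms in (100, 1_000, 10_000):
            estimates = []
            for _ in range(2_000):
                # Each atom is detected "up" with probability (1 + cos(phi)) / 2
                p_up = 0.5 * (1.0 + np.cos(phi_true))
                n_up = rng.binomial(n_atoms, p_up)
                estimates.append(np.arccos(2.0 * n_up / n_atoms - 1.0))
            dphi = np.std(estimates)
            print(f"N = {n_atoms:6d}: measured Delta(phi) = {dphi:.4f}, "
                  f"SQL 1/sqrt(N) = {1.0 / np.sqrt(n_atoms):.4f}")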

    PeX 1. Multi-spectral expansion of residual speckles for planet detection

    The detection of exoplanets in coronagraphic images is severely limited by residual starlight speckles. Dedicated post-processing can drastically reduce this "stellar leakage" and thereby increase the faintness of detectable exoplanets. Based on a multi-spectral series expansion of the diffraction pattern, we derive a multi-mode model of the residuals which can be exploited to estimate and thus remove the residual speckles in multi-spectral coronagraphic images. Compared to other multi-spectral processing methods, our model is physically grounded and is suitable for use in an (optimal) inverse approach. We demonstrate the ability of our model to correctly estimate the speckles in simulated data and show that very high contrasts can be achieved. We further apply our method to removing speckles from a real data cube obtained with the SPHERE IFS instrument.
    Comment: accepted for publication in MNRAS on 25th of August 2017, 17 pages, 15 figures
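
    A much-simplified sketch of the general idea (not the paper's actual multi-spectral expansion or inverse approach): at each spatial position of an IFS cube, fit the residual speckle intensity with a low-order polynomial in wavelength and subtract it, so that spectrally smooth starlight residuals are removed while structure that does not follow the smooth model survives. The function name, polynomial degree, and cube dimensions are invented for illustration.

        # Simplified stand-in for multi-spectral speckle suppression: per-pixel
        # low-order polynomial fit in wavelength, subtracted from the data cube.
        import numpy as np

        def remove_smooth_speckles(cube, wavelengths, degree=2):
            """cube: (n_lambda, ny, nx) coronagraphic images; returns the residual cube."""
            n_lambda, ny, nx = cube.shape
            flat = cube.reshape(n_lambda, -1)                     # (n_lambda, ny*nx)
            # Least-squares polynomial fit in wavelength, independently for every pixel
            coeffs = np.polynomial.polynomial.polyfit(wavelengths, flat, deg=degree)
            model = np.polynomial.polynomial.polyval(wavelengths, coeffs).T
            return (flat - model).reshape(n_lambda, ny, nx)

        # Toy usage with random data standing in for an IFS cube
        rng = np.random.default_rng(4)
        wavelengths = np.linspace(0.95, 1.35, 39)                 # microns, illustrative
        cube = rng.normal(size=(39, 64, 64))
        residual = remove_smooth_speckles(cube, wavelengths)
        print(residual.shape)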