
    The best Fisher is upstream: data processing inequalities for quantum metrology

    We apply the classical data processing inequality to quantum metrology to show that manipulating the classical information from a quantum measurement cannot aid in the estimation of parameters encoded in quantum states. We further derive a quantum data processing inequality to show that coherent manipulation of quantum data also cannot improve the precision of estimation. In addition, we comment on the assumptions necessary to arrive at these inequalities and on how they might be avoided, providing insight into enhancement procedures which are not provably wrong.
    Comment: Comments encouraged.
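
    The classical statement underlying the first result has a compact standard form (stated here for context, not quoted from the paper): if an estimate is computed from processed data, so that the parameter, the raw data, and the processed data form a Markov chain, then the Fisher information can only decrease:

        \[
          \theta \;\longrightarrow\; X \;\longrightarrow\; Y
          \quad\Longrightarrow\quad
          F_Y(\theta) \;\le\; F_X(\theta),
          \qquad
          F_X(\theta) = \mathbb{E}\!\left[\bigl(\partial_\theta \log p(X \mid \theta)\bigr)^{2}\right].
        \]

    Combined with the Cramér–Rao bound, $\operatorname{Var}(\hat\theta) \ge 1/\bigl(n\,F(\theta)\bigr)$ for $n$ repetitions, this means no classical post-processing of the measurement record can lower the achievable variance.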

    Self-guided quantum tomography

    We introduce a self-learning tomographic technique in which the experiment guides itself to an estimate of its own state. Self-guided quantum tomography (SGQT) uses measurements to directly test hypotheses in an iterative algorithm which converges to the true state. We demonstrate through simulations on many qubits that SGQT is a more efficient and robust alternative to the usual paradigm of taking a large amount of informationally complete data and then solving the inverse problem of post-processed state estimation.
    Comment: v2: published version.
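
    For intuition, here is a minimal single-qubit sketch of the iterative idea, using simultaneous perturbation stochastic approximation (SPSA) to climb a fidelity estimated from measurement data; the gain schedules, shot count, and simulated fidelity oracle are illustrative assumptions, not the paper's exact protocol.

        import numpy as np

        rng = np.random.default_rng(0)

        def normalize(psi):
            return psi / np.linalg.norm(psi)

        # Unknown true state that the "experiment" is trying to find (one qubit).
        true_state = normalize(np.array([1.0, 1.0 + 1.0j]))

        def estimated_fidelity(psi, shots=100):
            """Stand-in for the experiment: estimate |<psi|true>|^2 from finite shots."""
            f = np.clip(abs(np.vdot(psi, true_state))**2, 0.0, 1.0)
            return rng.binomial(shots, f) / shots

        # SPSA: perturb the current guess along a random direction and step
        # in the direction that increases the estimated fidelity.
        psi = normalize(rng.standard_normal(2) + 1j * rng.standard_normal(2))
        for k in range(1, 301):
            a_k = 3.0 / k**0.602          # step-size schedule (illustrative values)
            c_k = 1.0 / k**0.101          # perturbation-size schedule
            delta = rng.standard_normal(2) + 1j * rng.standard_normal(2)
            grad = (estimated_fidelity(normalize(psi + c_k * delta))
                    - estimated_fidelity(normalize(psi - c_k * delta))) / (2 * c_k)
            psi = normalize(psi + a_k * grad * delta)

        print("final fidelity:", abs(np.vdot(psi, true_state))**2)

    The appeal of the scheme is visible in the sketch: the estimate is refined online from each new measurement, with no informationally complete data set or inverse problem in sight.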

    Quantum Model Averaging

    Standard tomographic analyses ignore model uncertainty: it is assumed that a given model generated the data, and the task is then to estimate the quantum state, or a subset of parameters within that model. Here we apply a model averaging technique to mitigate the risk of overconfident estimates of model parameters in two examples: (1) selecting the rank of the state in tomography and (2) selecting the model for the fidelity decay curve in randomized benchmarking.
    Comment: For a summary, see http://i.imgur.com/nMJxANo.pn
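
    The averaging step itself is standard Bayesian machinery; schematically (a textbook statement, given here for context rather than quoted from the paper), with candidate models $M_k$ and data $D$:

        \[
          \Pr(M_k \mid D)
          \;=\;
          \frac{\Pr(D \mid M_k)\,\Pr(M_k)}{\sum_j \Pr(D \mid M_j)\,\Pr(M_j)},
          \qquad
          \hat\theta_{\mathrm{avg}}
          \;=\;
          \sum_k \Pr(M_k \mid D)\,\mathbb{E}[\theta \mid D, M_k],
        \]

    where $\Pr(D \mid M_k)$ is the model evidence (the likelihood marginalized over each model's parameters). Averaging over models in this way widens the reported uncertainty to account for not knowing which model is correct.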

    High posterior density ellipsoids of quantum states

    Regions of quantum states generalize the classical notion of error bars. High posterior density (HPD) credible regions are the most powerful of region estimators; however, they are intractably hard to construct in general. This paper reports on a numerical approximation to HPD regions for the purpose of testing a much more computationally and conceptually convenient class of regions: posterior covariance ellipsoids (PCEs), defined via the covariance matrix of the posterior probability distribution of states. Here it is shown that PCEs are near optimal for the example of Pauli measurements on multiple qubits. Moreover, the algorithm is capable of producing accurate PCE regions even when there is uncertainty in the model.
    Comment: TL;DR version: computationally feasible region estimator
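
    Concretely, a PCE can be built directly from posterior samples, as in the minimal sketch below; the chi-squared calibration of the radius is exact only for an approximately Gaussian posterior, which is an assumption of this sketch rather than a claim from the paper.

        import numpy as np
        from scipy.stats import chi2

        def posterior_covariance_ellipsoid(samples, credibility=0.95):
            """Fit {x : (x - mu)^T S^{-1} (x - mu) <= r^2} to posterior samples."""
            mu = samples.mean(axis=0)
            S = np.cov(samples, rowvar=False)
            r2 = chi2.ppf(credibility, df=samples.shape[1])  # Gaussian calibration
            return mu, S, r2

        def in_ellipsoid(x, mu, S, r2):
            d = x - mu
            return d @ np.linalg.solve(S, d) <= r2

        # Toy check on synthetic "posterior" samples.
        rng = np.random.default_rng(1)
        samples = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.3], [0.3, 0.5]], size=5000)
        mu, S, r2 = posterior_covariance_ellipsoid(samples)
        coverage = np.mean([in_ellipsoid(x, mu, S, r2) for x in samples])
        print("empirical coverage:", coverage)   # ~0.95 for Gaussian samples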

    Weak value amplification is suboptimal for estimation and detection

    We show, using statistically rigorous arguments, that the technique of weak value amplification (WVA) does not perform better than standard statistical techniques for the tasks of single-parameter estimation and signal detection. Specifically, we prove that post-selection, a necessary ingredient for WVA, decreases estimation accuracy and, moreover, that arranging for anomalously large weak values is a suboptimal strategy. In doing so, we explicitly provide the optimal estimator, which in turn allows us to identify the optimal experimental arrangement as the one in which all outcomes have equal weak values (all as small as possible) and the initial state of the meter is the eigenstate corresponding to the maximal eigenvalue of the square of the system observable. Finally, we give precise quantitative conditions for when weak measurements (measurements without post-selection or anomalously large weak values) can mitigate the effect of uncharacterized technical noise in estimation.
    Comment: This is a significant revision which is closer to the published version.
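
    One standard way to phrase the core obstruction (a textbook-style inequality consistent with the abstract, not copied from the paper): if post-selection succeeds with probability $p_{\mathrm{ps}}(\theta)$, then

        \[
          p_{\mathrm{ps}}(\theta)\, F_{\mathrm{ps}}(\theta) \;\le\; F(\theta),
        \]

    where $F$ is the Fisher information of the full experiment and $F_{\mathrm{ps}}$ that of the post-selected data alone. Discarding the failed trials is itself a form of data processing, so over $N$ input trials the Cramér–Rao bound $\operatorname{Var}(\hat\theta) \ge 1/\bigl(N F(\theta)\bigr)$ cannot be beaten by post-selecting.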

    How the result of a single coin toss can turn out to be 100 heads

    We show that the phenomenon of anomalous weak values is not limited to quantum theory. In particular, we show that the same features occur in a simple model of a coin subject to a form of classical backaction with pre- and post-selection. This provides evidence that weak values are not inherently quantum, but rather a purely statistical feature of pre- and post-selection with disturbance.
    Comment: published version.
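
    To see the statistical mechanism at work, here is a toy simulation in which a noisy readout disturbs a classical coin and post-selection then conditions on that disturbance; this is an illustrative model of disturbance-plus-post-selection, not the specific coin construction used in the paper.

        import numpy as np

        rng = np.random.default_rng(2)
        eps = 0.05        # weak-measurement strength
        n = 200_000

        # Pre-select: the coin starts at x = +1 (the observable's values are +/-1).
        x = np.ones(n)

        # Weak readout: y is a faint signal about x buried in noise.
        y = eps * x + rng.standard_normal(n)

        # Backaction: the coin flips with a probability that grows with the
        # readout, so the disturbance is correlated with the noise (toy choice).
        flip_prob = 1.0 / (1.0 + np.exp(-4.0 * y))
        flipped = rng.random(n) < flip_prob
        x_final = np.where(flipped, -x, x)

        # Post-select on final coin = -1 and compute the conditioned "weak value".
        post = x_final == -1
        print("classical weak value:", y[post].mean() / eps)   # far outside [-1, 1]

    Post-selection picks out trials where the noise happened to be large, so the conditioned average, rescaled by the weak coupling, lands far outside the observable's range, with no quantum mechanics anywhere in the model.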

    Accelerated Randomized Benchmarking

    Quantum information processing offers promising advances for a wide range of fields and applications, provided that we can efficiently assess the performance of the control applied in candidate systems. That is, we must be able to determine whether we have implemented a desired gate and to refine the implementation accordingly. Randomized benchmarking reduces the difficulty of this task by exploiting symmetries in quantum operations. Here, we bound the resources required for benchmarking and show that, with prior information, we can achieve several orders of magnitude better accuracy than traditional approaches to benchmarking. Moreover, by building on state-of-the-art classical algorithms, we reach these accuracies with near-optimal resources: our approach requires an order of magnitude less data to achieve the same accuracies, and provides online estimates of the errors in the reported fidelities. We also show that our approach is useful for physical devices by comparing to simulations. Our results thus enable the application of randomized benchmarking in new regimes and dramatically reduce the experimental effort required to assess control fidelities in quantum systems. Finally, our work is based on open-source scientific libraries and can readily be applied in systems of interest.
    Comment: 10 pages, full source code at https://github.com/cgranade/accelerated-randomized-benchmarking #quantuminfo #benchmarking
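
    The estimation task underneath is fitting the decay model $F(m) = A p^m + B$ to survival counts at sequence lengths $m$. The paper's approach is Bayesian (sequential Monte Carlo); the grid-based update below is a deliberately simplified sketch of that idea, with $A$ and $B$ held fixed as an illustrative assumption rather than marginalized as in the full problem.

        import numpy as np

        rng = np.random.default_rng(3)

        # True decay model F(m) = A p^m + B for the survival probability.
        A, B, p_true = 0.5, 0.5, 0.995
        lengths = np.array([1, 2, 4, 8, 16, 32, 64, 128, 256, 512])
        shots = 100
        counts = rng.binomial(shots, A * p_true**lengths + B)

        # Grid-based Bayesian update over p with a binomial likelihood.
        p_grid = np.linspace(0.9, 1.0, 2001)
        log_post = np.zeros_like(p_grid)
        for m, k in zip(lengths, counts):
            f = np.clip(A * p_grid**m + B, 1e-9, 1 - 1e-9)
            log_post += k * np.log(f) + (shots - k) * np.log1p(-f)
        post = np.exp(log_post - log_post.max())
        post /= post.sum()

        # Posterior mean and an online error bar on the decay parameter.
        p_mean = np.sum(p_grid * post)
        p_std = np.sqrt(np.sum((p_grid - p_mean)**2 * post))
        print(f"p = {p_mean:.5f} +/- {p_std:.5f}  (true {p_true})")

    Because the posterior is carried along as data arrive, the error bar on the reported fidelity comes for free at every step, which is the "online estimates" feature the abstract refers to.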