2,814 research outputs found

    The best Fisher is upstream: data processing inequalities for quantum metrology

    Full text link
    We apply the classical data processing inequality to quantum metrology to show that manipulating the classical information from a quantum measurement cannot aid in the estimation of parameters encoded in quantum states. We further derive a quantum data processing inequality to show that coherent manipulation of quantum data also cannot improve the precision of estimation. In addition, we comment on the assumptions necessary to arrive at these inequalities and how they might be avoided, providing insights into enhancement procedures that are not provably wrong.
    Comment: Comments encouraged
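
    For context, the classical inequality invoked above has a compact standard statement for Fisher information; the Markov-chain notation below is conventional and is not quoted from the paper.

        % Classical data processing inequality for Fisher information: if
        % \theta \to X \to Y is a Markov chain (Y is obtained from X by a
        % \theta-independent, possibly stochastic, map), then post-processing
        % cannot increase the Fisher information about \theta:
        \[
          F_Y(\theta) \;\le\; F_X(\theta),
          \qquad
          F_X(\theta) \;=\; \mathbb{E}\!\left[\bigl(\partial_\theta \log p(X \mid \theta)\bigr)^{2}\right],
        \]
        % and hence, via the Cram\'er--Rao bound, cannot improve the attainable
        % estimation precision.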

    Self-guided quantum tomography

    Full text link
    We introduce a self-learning tomographic technique in which the experiment guides itself to an estimate of its own state. Self-guided quantum tomography (SGQT) uses measurements to directly test hypotheses in an iterative algorithm which converges to the true state. We demonstrate through simulation on many qubits that SGQT is a more efficient and robust alternative to the usual paradigm of taking a large amount of informationally complete data and solving the inverse problem of post-processed state estimation.
    Comment: v2: published version
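
    The iterative, hypothesis-testing idea can be sketched as a stochastic-approximation ascent on measured fidelity for a single qubit; the fidelity oracle, gain sequences, and constants below are illustrative stand-ins, not the paper's protocol.

        import numpy as np

        # Minimal SPSA-style sketch of self-guided state estimation on one qubit.
        # The "experiment" is a simulated fidelity oracle; in SGQT this value
        # would come from measurement counts. All constants are illustrative.

        rng = np.random.default_rng(0)

        def bloch_state(theta, phi):
            """Pure qubit state parameterized by Bloch angles."""
            return np.array([np.cos(theta / 2), np.exp(1j * phi) * np.sin(theta / 2)])

        true_state = bloch_state(1.0, 2.0)

        def fidelity(params):
            """Stand-in for an experimental estimate of |<psi_true|psi(params)>|^2."""
            guess = bloch_state(*params)
            return abs(np.vdot(true_state, guess)) ** 2

        params = np.array([0.3, 0.3])           # initial hypothesis
        for k in range(1, 201):
            a_k = 0.3 / k ** 0.602              # SPSA gain sequences (typical choices)
            c_k = 0.1 / k ** 0.101
            delta = rng.choice([-1.0, 1.0], size=2)
            # Two "experiments" per iteration: fidelity at two perturbed hypotheses.
            g = (fidelity(params + c_k * delta) - fidelity(params - c_k * delta)) / (2 * c_k * delta)
            params = params + a_k * g           # ascend the estimated fidelity gradient

        print("final fidelity:", fidelity(params))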

    Quantum Model Averaging

    Full text link
    Standard tomographic analyses ignore model uncertainty. It is assumed that a given model generated the data, and the task is to estimate the quantum state, or a subset of parameters, within that model. Here we apply a model averaging technique to mitigate the risk of overconfident estimates of model parameters in two examples: (1) selecting the rank of the state in tomography and (2) selecting the model for the fidelity decay curve in randomized benchmarking.
    Comment: For a summary, see http://i.imgur.com/nMJxANo.pn
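
    The averaging step referred to here is ordinary Bayesian model averaging; a compact statement follows, with notation chosen for illustration rather than taken from the paper.

        % Bayesian model averaging over candidate models M_k given data D:
        \[
          \Pr(M_k \mid D) \;=\; \frac{\Pr(D \mid M_k)\,\Pr(M_k)}{\sum_j \Pr(D \mid M_j)\,\Pr(M_j)},
          \qquad
          \hat{\rho}_{\mathrm{avg}} \;=\; \sum_k \Pr(M_k \mid D)\,\hat{\rho}_{M_k},
        \]
        % so estimates from models of different rank (or different fidelity-decay
        % forms) are weighted by their evidence rather than a single model being
        % assumed outright.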

    High posterior density ellipsoids of quantum states

    Full text link
    Regions of quantum states generalize the classical notion of error bars. High posterior density (HPD) credible regions are the most powerful region estimators. However, they are intractably hard to construct in general. This paper reports on a numerical approximation to HPD regions for the purpose of testing a much more computationally and conceptually convenient class of regions: posterior covariance ellipsoids (PCEs). The PCEs are defined via the covariance matrix of the posterior probability distribution of states. Here it is shown that PCEs are near optimal for the example of Pauli measurements on multiple qubits. Moreover, the algorithm is capable of producing accurate PCE regions even when there is uncertainty in the model.
    Comment: TL;DR version: computationally feasible region estimator
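
    A covariance ellipsoid of the kind described can be built directly from posterior samples; the sketch below uses synthetic Gaussian samples and an illustrative 95% coverage rule, neither of which is taken from the paper.

        import numpy as np

        # Sketch: build a posterior covariance ellipsoid (PCE) from posterior samples.
        # The samples are synthetic; in practice they would come from a sampler over
        # the state-space parameters. Coverage target and scaling are illustrative.

        rng = np.random.default_rng(1)
        samples = rng.normal(size=(5000, 3))           # stand-in posterior samples

        mu = samples.mean(axis=0)                      # ellipsoid centre
        cov = np.cov(samples, rowvar=False)            # posterior covariance matrix
        cov_inv = np.linalg.inv(cov)

        # Mahalanobis radius chosen so the ellipsoid holds ~95% of the posterior mass.
        d2 = np.einsum("ij,jk,ik->i", samples - mu, cov_inv, samples - mu)
        r2 = np.quantile(d2, 0.95)

        def in_pce(x):
            """Membership test: (x - mu)^T cov^{-1} (x - mu) <= r2."""
            diff = x - mu
            return diff @ cov_inv @ diff <= r2

        print(in_pce(mu), in_pce(mu + 10 * np.sqrt(np.diag(cov))))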

    Editor's Column

    Get PDF
    As the editor of the Journal, I find it challenging to oversee publication in many different areas of psychiatry. From cognitive therapy to consultation for the treatment of burned children, I know, or quickly learn, the more intricate details of the field. In compiling this issue, I was particularly struck by the number of articles focusing on Child and Adolescent Psychiatry, an area of our field to which I have had limited exposure in my first two years of training.

    Weak value amplification is suboptimal for estimation and detection

    Full text link
    We show, using statistically rigorous arguments, that the technique of weak value amplification (WVA) does not perform better than standard statistical techniques for the tasks of single parameter estimation and signal detection. Specifically, we prove that post-selection, a necessary ingredient for WVA, decreases estimation accuracy and, moreover, that arranging for anomalously large weak values is a suboptimal strategy. In doing so, we explicitly provide the optimal estimator, which in turn allows us to identify the optimal experimental arrangement to be the one in which all outcomes have equal weak values (all as small as possible) and the initial state of the meter is the eigenvector corresponding to the maximal eigenvalue of the square of the system observable. Finally, we give precise quantitative conditions for when weak measurement (measurements without post-selection or anomalously large weak values) can mitigate the effect of uncharacterized technical noise in estimation.
    Comment: This is a significant revision which is closer to the published version
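
    For reference, the weak value that WVA relies on has the standard textbook form below; the notation is conventional and not quoted from the paper.

        % Standard weak value of an observable A with pre-selected state |\psi>
        % and post-selected state |\phi>; the weak pointer shift is proportional
        % to \mathrm{Re}\,A_w:
        \[
          A_w \;=\; \frac{\langle \phi | A | \psi \rangle}{\langle \phi | \psi \rangle},
        \]
        % which can be made anomalously large when |\langle\phi|\psi\rangle| is
        % small; but the post-selection then succeeds only with probability
        % |\langle\phi|\psi\rangle|^2, the trade-off that the Fisher-information
        % argument above makes precise.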

    The Poor and the Dead: Socioeconomic Status and Mortality in the U.S., 1850-1860

    Get PDF
    Despite the significant research on aggregate trends in mortality and physical stature in the middle of the nineteenth century, little evidence on the individual-level characteristics associated with premature mortality has been presented. This essay describes a new project that links individuals from the mortality schedules to the population schedules of the 1850 and 1860 federal population censuses. This makes it possible to assess the link between individual and household characteristics and the probability of dying. The results reveal a strong and negative relationship between household wealth and mortality in 1850 and 1860 and a somewhat weaker negative relationship between occupational status and mortality in 1850. The findings suggest that even when the U.S. population was largely rural and agricultural, changes in the distribution of income and wealth would have had a large impact on mortality rates and life expectancies. Urbanization merely exacerbated already existing disparities in mortality by socioeconomic status.

    How the result of a single coin toss can turn out to be 100 heads

    Full text link
    We show that the phenomenon of anomalous weak values is not limited to quantum theory. In particular, we show that the same features occur in a simple model of a coin subject to a form of classical backaction with pre- and post-selection. This provides evidence that weak values are not inherently quantum, but rather a purely statistical feature of pre- and post-selection with disturbance.
    Comment: published version
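
    A toy classical simulation in the spirit of this claim is sketched below: a noisy readout of a pre-selected coin, a readout-correlated disturbance, and post-selection push the conditioned mean readout outside the coin's value range. This is an illustrative construction, not the paper's specific model; the flip rule and constants are assumptions.

        import numpy as np

        # Toy classical analogue: coin pre-selected as heads (+1), read out weakly
        # (very noisy readout), disturbed by a readout-dependent flip, and
        # post-selected on tails (-1). Flip rule and constants are illustrative.

        rng = np.random.default_rng(0)
        n = 2_000_000
        sigma, eps = 10.0, 0.009

        x0 = np.ones(n)                               # pre-selection: all heads (+1)
        m = x0 + sigma * rng.normal(size=n)           # weak (noisy) readout of the coin
        p_flip = np.clip(0.5 + eps * m, 0.0, 1.0)     # backaction correlated with readout
        flipped = rng.random(n) < p_flip
        x1 = np.where(flipped, -x0, x0)               # disturbed coin state

        post_selected = x1 == -1                      # post-select on tails
        # Conditioned mean readout is ~2.8, well outside the coin's range [-1, 1].
        print("conditioned mean readout:", m[post_selected].mean())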