
    Quantifying dependencies for sensitivity analysis with multivariate input sample data

    We present a novel method for quantifying dependencies in multivariate datasets, based on estimating the Rényi entropy by minimum spanning trees (MSTs). The length of the MST can be used to order pairs of variables from strongly to weakly dependent, making it a useful tool for sensitivity analysis with dependent input variables. It is well suited to cases where the input distribution is unknown and only a sample of the inputs is available. We introduce an estimator that quantifies dependency from the MST length and investigate its properties with several numerical examples. To reduce the computational cost of constructing the exact MST for large datasets, we explore methods for computing approximations to the exact MST, and find the multilevel approach recently introduced by Zhong et al. (2015) to be the most accurate. We apply the proposed method to an artificial test case based on the Ishigami function, as well as to a real-world test case involving sediment transport in the North Sea. The results are consistent with prior knowledge and heuristic understanding, as well as with variance-based analysis using Sobol indices in cases where those indices can be computed.
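    The core idea of the abstract above can be illustrated with a minimal sketch (not the authors' estimator): for a pair of variables, build the Euclidean MST of the rank-normalized sample points; a shorter MST indicates that the points concentrate near a lower-dimensional set, i.e. stronger dependence. All names and parameters below are illustrative assumptions.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.sparse.csgraph import minimum_spanning_tree

def mst_length(x, y):
    """Total edge length of the Euclidean MST of the 2-D point cloud (x, y)."""
    pts = np.column_stack([x, y])
    dist = squareform(pdist(pts))          # dense pairwise distance matrix
    return minimum_spanning_tree(dist).sum()

rng = np.random.default_rng(0)
n = 500
u = rng.uniform(size=n)
strong = u + 0.05 * rng.normal(size=n)     # strongly dependent on u
weak = rng.uniform(size=n)                 # independent of u

# Shorter MST => points cluster near a curve => stronger dependence,
# so pairs can be ordered from strongly to weakly dependent by MST length.
ordered = sorted([("strong", mst_length(u, strong)),
                  ("weak", mst_length(u, weak))], key=lambda t: t[1])
```

    In practice the sample would be normalized (e.g. by ranks) first so that MST lengths of different variable pairs are comparable.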

    Compression of quantum measurement operations

    We generalize recent work of Massar and Popescu dealing with the amount of classical data that is produced by a quantum measurement on a quantum state ensemble. In the previous work it was shown how spurious randomness generally contained in the outcomes can be eliminated without decreasing the amount of knowledge, to achieve an amount of data equal to the von Neumann entropy of the ensemble. Here we extend this result by giving a more refined description of what constitutes equivalent measurements (that is, measurements which provide the same knowledge about the quantum state) and also by considering incomplete measurements. In particular, we show that one can always associate with a POVM with elements a_j an equivalent POVM acting on many independent copies of the system, which produces an amount of data asymptotically equal to the entropy defect of an ensemble canonically associated to the ensemble average state and the initial measurement (a_j). In the case where the measurement is not maximally refined, this amount of data is strictly less than the von Neumann entropy obtained in the previous work. We also show that this is the best achievable, i.e. it is impossible to devise a measurement equivalent to the initial measurement (a_j) that produces less data. We discuss the interpretation of these results. In particular, we show how they can be used to provide a precise and model-independent measure of the amount of knowledge that is obtained about a quantum state by a quantum measurement. We also discuss in detail the relation between our results and Holevo's bound, at the same time providing a new proof of this fundamental inequality.

    Large deviations of cascade processes on graphs

    Simple models of irreversible dynamical processes such as bootstrap percolation have been successfully applied to describe cascade processes in a large variety of different contexts. However, the problem of analyzing non-typical trajectories, which can be crucial for the understanding of out-of-equilibrium phenomena, is still considered intractable in most cases. Here we introduce an efficient method to find and analyze optimized trajectories of cascade processes. We show that for a wide class of irreversible dynamical rules, this problem can be solved efficiently on large-scale systems.
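    For context, the irreversible dynamics the abstract refers to can be sketched in a few lines; this is standard bootstrap percolation (a node activates irreversibly once enough neighbours are active), not the authors' large-deviation method, and the graph and threshold below are illustrative assumptions.

```python
import numpy as np

def bootstrap_percolation(adj, seeds, theta=2):
    """Iterate the irreversible rule to its fixed point: an inactive node
    activates once at least `theta` of its neighbours are active."""
    active = seeds.copy()
    while True:
        newly_active = (~active) & (adj @ active >= theta)
        if not newly_active.any():
            return active                  # no further activations possible
        active |= newly_active

rng = np.random.default_rng(1)
n = 200
upper = np.triu(rng.random((n, n)) < 0.05, k=1).astype(int)
adj = upper + upper.T                      # symmetric adjacency, no self-loops
seeds = np.zeros(n, dtype=bool)
seeds[rng.choice(n, size=30, replace=False)] = True

final = bootstrap_percolation(adj, seeds, theta=2)
```

    Typical-trajectory behaviour is easy to sample this way; the hard problem addressed in the paper is characterizing the rare, non-typical trajectories of such dynamics.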

    Entropy-power uncertainty relations: towards a tight inequality for all Gaussian pure states

    We show that a proper expression of the uncertainty relation for a pair of canonically-conjugate continuous variables relies on entropy power, a standard notion in Shannon information theory for real-valued signals. The resulting entropy-power uncertainty relation is equivalent to the entropic formulation of the uncertainty relation due to Bialynicki-Birula and Mycielski, but can be further extended to rotated variables. Hence, based on a reasonable assumption, we give a partial proof of a tighter form of the entropy-power uncertainty relation taking correlations into account and provide extensive numerical evidence of its validity. Interestingly, it implies the generalized (rotation-invariant) Schrödinger-Robertson uncertainty relation exactly as the original entropy-power uncertainty relation implies the Heisenberg relation. It is saturated for all Gaussian pure states, in contrast with hitherto known entropic formulations of the uncertainty principle.
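    For reference, using the standard Shannon-theoretic definitions (not taken from this abstract), the implication mentioned above can be seen directly. With entropy power defined from the differential entropy $h(X)$ as

```latex
N_X = \frac{1}{2\pi e}\, e^{2 h(X)},
```

    the entropy-power uncertainty relation for conjugate quadratures reads

```latex
N_x \, N_p \;\ge\; \frac{\hbar^2}{4},
```

    which is equivalent to the Bialynicki-Birula–Mycielski relation $h(x) + h(p) \ge \ln(\pi e \hbar)$. Since the Gaussian maximizes entropy at fixed variance, $N_x \le \sigma_x^2$ and $N_p \le \sigma_p^2$, so the relation implies the Heisenberg relation $\sigma_x \sigma_p \ge \hbar/2$.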