50 research outputs found

    Quantum concentration inequalities

    We establish transportation cost inequalities (TCI) with respect to the quantum Wasserstein distance by introducing quantum extensions of well-known classical methods: first, using a non-commutative version of Ollivier's coarse Ricci curvature, we prove that high temperature Gibbs states of commuting Hamiltonians on arbitrary hypergraphs H = (V, E) satisfy a TCI with constant scaling as O(|V|). Second, we argue that the temperature range for which the TCI holds can be enlarged by relating it to recently established modified logarithmic Sobolev inequalities. Third, we prove that the inequality still holds for fixed points of arbitrary reversible local quantum Markov semigroups on regular lattices, albeit with slightly worsened constants, under a seemingly weaker condition of local indistinguishability of the fixed points. Finally, we use our framework to prove Gaussian concentration bounds for the distribution of eigenvalues of quasi-local observables and argue the usefulness of the TCI in proving the equivalence of the canonical and microcanonical ensembles and an exponential improvement over the weak Eigenstate Thermalization Hypothesis. Comment: 31 pages, one figure
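    For orientation, and not as a reproduction of the paper's actual theorem, a transportation cost inequality of order 2 with constant c for a reference state σ (for instance a Gibbs state) takes roughly the following shape; the abstract's O(|V|) statement refers to how such a constant scales with the number of vertices of the hypergraph.

```latex
% Sketch only: generic form of a TCI of order 2 with constant c for a fixed
% state \sigma; W denotes the (quantum) Wasserstein distance and D the
% relative entropy. The exact distance and constants used in the paper are
% not reproduced here.
\[
  W(\rho,\sigma) \;\le\; \sqrt{\,2\,c\,D(\rho\,\|\,\sigma)\,}
  \quad \text{for all states } \rho,
  \qquad c = O(|V|).
\]
```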

    The Sphere Packing Bound via Augustin's Method

    A sphere packing bound (SPB) with a prefactor that is polynomial in the block length n is established for codes on a length-n product channel W_{[1,n]}, assuming that the maximum order-1/2 Rényi capacity among the component channels, i.e. max_{t∈[1,n]} C_{1/2,W_t}, is O(ln n). The reliability function of the discrete stationary product channels with feedback is bounded from above by the sphere packing exponent. Both results are proved by first establishing a non-asymptotic SPB. The latter result continues to hold under a milder stationarity hypothesis. Comment: 30 pages. An error in the statement of Lemma 2 is corrected. The change is inconsequential for the rest of the paper.
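    As background rather than a statement from the paper, a sphere packing bound with a polynomial prefactor for block length n and rate R is usually written in the following shape, with the sphere packing exponent expressible through Rényi capacities; the precise prefactor, the Augustin information measures, and the feedback variant are the paper's contribution and are not captured by this sketch.

```latex
% Sketch only: a generic sphere packing bound with a polynomial prefactor
% (kappa a constant), the standard expression of the sphere packing exponent
% via the Gallager function E_0, and the classical identity relating E_0 to
% the order-1/(1+rho) Renyi capacity of the channel W.
\[
  P_e(n,R) \;\ge\; n^{-\kappa}\, e^{-n\,E_{sp}(R)},
  \qquad
  E_{sp}(R) \;=\; \sup_{\rho \ge 0}\big[\,E_0(\rho) - \rho R\,\big],
  \qquad
  E_0(\rho) \;=\; \rho\, C_{\frac{1}{1+\rho},\,W}.
\]
```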

    A generalized risk approach to path inference based on hidden Markov models

    Motivated by the unceasing interest in hidden Markov models (HMMs), this paper re-examines hidden path inference in these models, using primarily a risk-based framework. While the most common maximum a posteriori (MAP), or Viterbi, path estimator and the minimum error, or Posterior Decoder (PD), have long been around, other path estimators, or decoders, have been either only hinted at or applied more recently and in dedicated applications generally unfamiliar to the statistical learning community. Over a decade ago, however, a family of algorithmically defined decoders aiming to hybridize the two standard ones was proposed (Brushe et al., 1998). The present paper gives a careful analysis of this hybridization approach, identifies several problems and issues with it and other previously proposed approaches, and proposes practical resolutions for them. Furthermore, simple modifications of the classical criteria for hidden path recognition are shown to lead to a new class of decoders. Dynamic programming algorithms to compute these decoders in the usual forward-backward manner are presented. A particularly interesting subclass of such estimators can also be viewed as hybrids of the MAP and PD estimators. Similar to previously proposed MAP-PD hybrids, the new class is parameterized by a small number of tunable parameters. Unlike their algorithmic predecessors, the new risk-based decoders are more clearly interpretable and, most importantly, work "out of the box" in practice, which is demonstrated on some real bioinformatics tasks and data. Some further generalizations and applications are discussed in conclusion. Comment: Section 5: corrected denominators of the scaled beta variables (pp. 27-30), => corrections in claims 1, 3, Prop. 12, bottom of Table 1. Decoder (49), Corol. 14 are generalized to handle 0 probabilities. Notation is more closely aligned with (Bishop, 2006). Details are inserted in eqn-s (43); the positivity assumption in Prop. 11 is explicit. Fixed typing errors in equation (41), Example
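    Since the abstract contrasts MAP/Viterbi decoding, posterior decoding, and risk-based hybrids computed by forward-backward-style dynamic programming, the following minimal sketch may help fix ideas. It is not the decoder family defined in the paper: it is an illustrative interpolation between the Viterbi criterion (lam = 1) and pointwise posterior decoding (lam = 0) for a discrete-emission HMM, with the parameter names pi, A, B, and lam chosen here for exposition.

```python
import numpy as np

def forward_backward(pi, A, B, obs):
    """Posterior state marginals gamma[t, i] = P(X_t = i | obs) for a discrete HMM.
    pi: (K,) initial distribution, A: (K, K) transitions, B: (K, M) emissions."""
    T, K = len(obs), len(pi)
    alpha = np.zeros((T, K)); beta = np.zeros((T, K))
    alpha[0] = pi * B[:, obs[0]]
    alpha[0] /= alpha[0].sum()
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        alpha[t] /= alpha[t].sum()          # rescale for numerical stability
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
        beta[t] /= beta[t].sum()
    gamma = alpha * beta
    return gamma / gamma.sum(axis=1, keepdims=True)

def hybrid_decode(pi, A, B, obs, lam=0.5, eps=1e-12):
    """Viterbi-style dynamic program whose local score interpolates between the
    MAP path criterion (lam = 1) and pointwise posterior decoding (lam = 0)."""
    T, K = len(obs), len(pi)
    gamma = forward_backward(pi, A, B, obs)
    logA, logB = np.log(A + eps), np.log(B + eps)
    score = np.zeros((T, K)); back = np.zeros((T, K), dtype=int)
    score[0] = lam * (np.log(pi + eps) + logB[:, obs[0]]) + (1 - lam) * np.log(gamma[0] + eps)
    for t in range(1, T):
        cand = score[t - 1][:, None] + lam * (logA + logB[None, :, obs[t]])
        back[t] = cand.argmax(axis=0)
        score[t] = cand.max(axis=0) + (1 - lam) * np.log(gamma[t] + eps)
    path = [int(score[-1].argmax())]
    for t in range(T - 1, 0, -1):           # backtrack through the stored argmaxes
        path.append(int(back[t][path[-1]]))
    return path[::-1]
```

    For lam strictly between 0 and 1 the decoded path trades pointwise posterior fidelity against global path plausibility, which is the intuitive role the tunable parameters play in the hybrid decoders discussed above.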

    Essays in international finance

    This Ph.D. thesis contains three essays in international finance with a focus on the foreign exchange market, from the perspectives of empirical asset pricing (Chapters 2 and 3) and forecasting and market microstructure (Chapter 4). In Chapter 2, I derive the position-unwinding likelihood indicator for currency carry trade portfolios in the option pricing model, and show that it represents the systematic crash risk associated with global liquidity imbalances and is also able to price the cross-section of global currency, sovereign bond, and equity portfolios; I also explore the currency option-implied sovereign default risk in Merton's framework, and link the sovereign CDS-implied credit risk premia to currency excess returns, showing that it prices the cross-section of currency carry, momentum, and volatility risk premium portfolios. In Chapter 3, I investigate the factor structure in the currency market and identify three important properties of global currencies: overvalued (undervalued) currencies with respect to equilibrium exchange rates tend to be crash sensitive (insensitive) as measured by copula lower tail dependence, relatively cheap (expensive) to hedge in terms of the volatility risk premium, and exposed to high (low) speculative propensity as gauged by the skew risk premium. I further reveal that these three characteristics have rich asset pricing and asset allocation implications, e.g. striking crash-neutral and diversification benefits for portfolio optimization and risk management purposes. In Chapter 4, I examine the term structure of exchange rate predictability by return decomposition, incorporate common latent factors across a range of investment horizons into the exchange rate dynamics with a broad set of predictors, and handle both parameter uncertainty and model uncertainty. I demonstrate the time-varying term-structural effect and model disagreement effect of exchange rate determinants and the projections of predictive information over the term structure, and utilize the time-variation in the probability weighting from dynamic model averaging to identify the scapegoat drivers of customer order flows. I further comprehensively evaluate both the statistical and economic significance of the model, allowing for a full spectrum of currency investment management, and find that the model generates substantial performance fees.
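    The final chapter's use of time-varying probability weights from dynamic model averaging can be illustrated with a generic forgetting-factor recursion in the spirit of Raftery-style DMA; this sketch is not the thesis's specification, and the array layout and the forgetting parameter alpha are assumptions made here for illustration only.

```python
import numpy as np

def dma_weights(pred_densities, alpha=0.99):
    """Time-varying model probabilities under a forgetting-factor update.
    pred_densities: (T, M) array, entry [t, m] = predictive density of model m
    for the observation at time t. Returns a (T, M) array of model weights."""
    T, M = pred_densities.shape
    w = np.full(M, 1.0 / M)                  # flat prior over candidate models
    out = np.zeros((T, M))
    for t in range(T):
        prior = w ** alpha                   # forgetting: flatten past evidence
        prior /= prior.sum()
        post = prior * pred_densities[t]     # one-step Bayesian update
        w = post / post.sum()
        out[t] = w
    return out
```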

    Quantum Concentration Inequalities

    We establish Transportation Cost Inequalities (TCIs) with respect to the quantum Wasserstein distance by introducing quantum extensions of well-known classical methods: First, we generalize the Dobrushin uniqueness condition to prove that Gibbs states of 1D commuting Hamiltonians satisfy a TCI at any positive temperature and provide conditions under which this first result can be extended to non-commuting Hamiltonians. Next, using a non-commutative version of Ollivier's coarse Ricci curvature, we prove that high temperature Gibbs states of commuting Hamiltonians on arbitrary hypergraphs H = (V, E) satisfy a TCI with constant scaling as O(|V|). Third, we argue that the temperature range for which the TCI holds can be enlarged by relating it to recently established modified logarithmic Sobolev inequalities. Fourth, we prove that the inequality still holds for fixed points of arbitrary reversible local quantum Markov semigroups on regular lattices, albeit with slightly worsened constants, under a seemingly weaker condition of local indistinguishability of the fixed points. Finally, we use our framework to prove Gaussian concentration bounds for the distribution of eigenvalues of quasi-local observables and argue the usefulness of the TCI in proving the equivalence of the canonical and microcanonical ensembles and an exponential improvement over the weak Eigenstate Thermalization Hypothesis.
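    As a pointer to how a TCI of this kind leads to the Gaussian concentration bounds mentioned at the end of the abstract, the classical Bobkov-Götze-style implication reads roughly as follows; the quantum statement in the paper concerns the spectral distribution of quasi-local observables and carries its own constants, so this is a sketch only.

```latex
% Sketch only: a TCI with constant c implies Gaussian concentration for
% Lipschitz observables O, where ||O||_Lip is the Lipschitz constant with
% respect to the metric underlying W and the probability refers to measuring
% O in the state \sigma.
\[
  W(\rho,\sigma) \le \sqrt{2c\,D(\rho\,\|\,\sigma)} \ \ \forall \rho
  \quad\Longrightarrow\quad
  \Pr_{\sigma}\!\big[\,|O - \mathrm{tr}(\sigma O)| \ge r\,\big]
  \;\le\; 2\,e^{-\,r^2/(2c\,\|O\|_{\mathrm{Lip}}^2)}.
\]
```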

    Information Theoretic Resources in Quantum Theory

    Resource identification and quantification is an essential element of both classical and quantum information theory. Entanglement is one of these resources, arising when quantum communication and nonlocal operations are expensive to perform. In the first part of this thesis we quantify the effective entanglement when operations are additionally restricted. For an important class of errors we find a linear relationship between the usual and the effective higher-dimensional generalization of concurrence, a measure of entanglement. In the second chapter we focus on nonlocality in the presence of superselection rules, where we propose a scheme that may be used to activate nongenuinely multipartite nonlocality with multiple copies of the state. We show that whenever the number of particles is insufficient, the genuinely multipartite nonlocality is degraded to nongenuinely multipartite nonlocality. While in the first few chapters we focus on understanding the resources present in quantum states, in the final part we turn the picture around and instead treat operations themselves as a resource. We provide our observers with free access to classical operations, i.e. those that cannot detect or generate quantum coherence. We show that the operation of interest can then be used to either generate or detect quantum coherence if and only if it violates a particular commutation relation. Using the relative entropy, the commutation relation provides us with a measure of the nonclassicality of operations. We show that the measure is a sum of two contributions, the generating power and the distinguishing power, each of which is separately an essential ingredient in quantum communication and information processing. The measure also sheds light on the operational meaning of quantum discord, which we show can be interpreted as the difference in superdense coding capacity between a quantum state and a classical state. Comment: Thesis, 109 pages
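    The characterization of operations that can neither detect nor generate coherence is often phrased as commutation with the completely dephasing channel in a fixed basis; assuming that reading (the thesis may state the relation differently), a small numerical check of such a commutation relation for a channel given by Kraus operators could look like the following sketch, with all function names being illustrative.

```python
import numpy as np

def superoperator(kraus_ops):
    """Matrix of the channel rho -> sum_k K rho K^dagger acting on vec(rho)
    (column-stacking convention: vec(A X B) = (B^T kron A) vec(X))."""
    d = kraus_ops[0].shape[0]
    S = np.zeros((d * d, d * d), dtype=complex)
    for K in kraus_ops:
        S += np.kron(K.conj(), K)
    return S

def dephasing_superoperator(d):
    """Completely dephasing channel Delta in the computational basis."""
    kraus = [np.diag((np.arange(d) == i).astype(complex)) for i in range(d)]
    return superoperator(kraus)

def commutes_with_dephasing(kraus_ops, tol=1e-10):
    """True if the channel commutes with Delta, i.e. (under this illustrative
    criterion) it can neither create nor detect coherence."""
    S = superoperator(kraus_ops)
    D = dephasing_superoperator(kraus_ops[0].shape[0])
    return np.linalg.norm(S @ D - D @ S) < tol

# Example: the Hadamard unitary creates coherence, so the check should fail.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
print(commutes_with_dephasing([H]))   # expected: False
```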