
    Postquantum Brègman relative entropies and nonlinear resource theories

    We introduce the family of postquantum Brègman relative entropies, based on nonlinear embeddings into reflexive Banach spaces (with examples given by reflexive noncommutative Orlicz spaces over semi-finite W*-algebras, nonassociative L_p spaces over semi-finite JBW-algebras, and noncommutative L_p spaces over arbitrary W*-algebras). This allows us to define a class of geometric categories for nonlinear postquantum inference theory (providing an extension of Chencov's approach to the foundations of statistical inference), with constrained maximisations of Brègman relative entropies as morphisms and nonlinear images of closed convex sets as objects. A further generalisation to a framework for nonlinear convex operational theories is developed using a larger class of morphisms, determined by Brègman nonexpansive operations (which provide a well-behaved family of Mielnik's nonlinear transmitters). As an application, we derive a range of nonlinear postquantum resource theories determined in terms of this class of operations.
    Comment: v2: several corrections and improvements, including an extension to the postquantum (generally) and JBW-algebraic (specifically) cases, a section on nonlinear resource theories, and a more informative paper title
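    For orientation, the classical finite-dimensional Brègman divergence that these relative entropies generalise (here stated for a differentiable convex function Ψ; the paper's nonlinear Banach-space setting is far more general) takes the form:

    ```latex
    D_{\Psi}(x, y) = \Psi(x) - \Psi(y) - \langle \nabla\Psi(y),\, x - y \rangle,
    \qquad \Psi \text{ convex}.
    ```

    For Ψ(x) = Σᵢ xᵢ log xᵢ this recovers the ordinary (classical) relative entropy, which is the prototype the postquantum construction extends.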

    Scalable Hash-Based Estimation of Divergence Measures

    We propose a scalable divergence estimation method based on hashing. Consider two continuous random variables X and Y whose densities have bounded support. We consider a particular locality-sensitive random hashing, and consider the ratio of samples in each hash bin having non-zero numbers of Y samples. We prove that the weighted average of these ratios over all of the hash bins converges to f-divergences between the two sample sets. We show that the proposed estimator is optimal in terms of both MSE rate and computational complexity. We derive the MSE rates for two families of smooth functions: the Hölder smoothness class and differentiable functions. In particular, it is proved that if the density functions have bounded derivatives up to order d/2, where d is the dimension of the samples, the optimal parametric MSE rate of O(1/N) can be achieved. The computational complexity is shown to be O(N), which is optimal. To the best of our knowledge, this is the first empirical divergence estimator that has optimal computational complexity and achieves the optimal parametric MSE estimation rate.
    Comment: 11 pages, Proceedings of the 21st International Conference on Artificial Intelligence and Statistics (AISTATS) 2018, Lanzarote, Spain
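    The bin-and-average idea can be sketched in a few lines. The following is a toy illustration only, not the authors' exact estimator: it uses a simple grid hash as a stand-in for their locality-sensitive hash family, applies a plug-in f to the per-bin density ratios, and omits the bias corrections the paper needs to reach the O(1/N) rate. The names `hash_bin`, `f_divergence_estimate`, and `eps` are hypothetical.

    ```python
    import math
    from collections import Counter

    def hash_bin(x, eps=0.1):
        # Grid-based hash: points in the same eps-cell collide.
        # A stand-in for the paper's locality-sensitive random hashing.
        return tuple(math.floor(xi / eps) for xi in x)

    def f_divergence_estimate(xs, ys, f, eps=0.1):
        """Toy hash-based plug-in estimate of D_f(P || Q) from samples
        xs ~ P and ys ~ Q. Only bins with a non-zero number of Y samples
        contribute, mirroring the abstract's restriction."""
        n, m = len(xs), len(ys)
        nx = Counter(hash_bin(x, eps) for x in xs)  # X counts per bin
        ny = Counter(hash_bin(y, eps) for y in ys)  # Y counts per bin
        total = 0.0
        for b, my in ny.items():
            ratio = (nx.get(b, 0) / n) / (my / m)   # empirical density ratio
            total += my * f(ratio)                  # weighted by Y mass in bin
        return total / m

    # KL divergence corresponds to the convex generator f(t) = t log t,
    # with the usual convention f(0) = 0.
    kl = lambda t: t * math.log(t) if t > 0 else 0.0
    ```

    With identical sample sets every per-bin ratio is 1 and f(1) = 0, so the estimate is exactly zero, as a divergence should be.
    
    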