
    Joint Mixability of Elliptical Distributions and Related Families

    In this paper, we further develop the theory of complete mixability and joint mixability for several distribution families. We generalize a result of Rüschendorf and Uckelmann (2002) on the complete mixability of continuous distribution functions with a symmetric and unimodal density. Two different proofs are presented for a result of Wang and Wang (2016) on the joint mixability of elliptical distributions with the same characteristic generator. We solve Open Problem 7 in Wang (2015) by constructing a bimodal-symmetric distribution. The joint mixability of slash-elliptical and skew-elliptical distributions is studied, and the extension to multivariate distributions is also investigated. Comment: 15 pages
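    For reference, the notion of complete mixability discussed in this abstract is standard in this literature and can be stated as follows (the symbols $n$, $F$, $X_i$, $k$ below are the usual ones from the mixability literature, not taken from the abstract itself):

    ```latex
    % A distribution function F is n-completely mixable (n-CM) if there
    % exist random variables X_1, ..., X_n, each distributed as F, whose
    % sum is almost surely constant:
    \[
      X_1, \dots, X_n \sim F
      \quad\text{with}\quad
      \mathbb{P}\bigl(X_1 + \cdots + X_n = nk\bigr) = 1
    \]
    % for some constant k (the center of F).  Joint mixability relaxes
    % this by allowing the X_i to have different marginals F_1, ..., F_n.
    ```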

    Bounding Stochastic Dependence, Complete Mixability of Matrices, and Multidimensional Bottleneck Assignment Problems

    We call a matrix completely mixable if the entries in its columns can be permuted so that all row sums are equal. If it is not completely mixable, we want to determine the smallest maximal and largest minimal row sum attainable. These values provide a discrete approximation of minimum-variance problems for discrete distributions, motivated by the question of how to estimate the α-quantile of an aggregate random variable with unknown dependence structure, given the marginals of the constituent random variables. We relate this problem to the multidimensional bottleneck assignment problem and show that there exists a polynomial 2-approximation algorithm if the matrix has only 3 columns. In general, deciding complete mixability is NP-complete. In particular, the swapping algorithm of Puccetti et al. is not an exact method unless NP ⊆ ZPP. For a fixed number of columns the problem remains NP-complete, but there exists a PTAS. The problem can be solved in pseudopolynomial time for a fixed number of rows, and even in polynomial time if, furthermore, all columns contain entries from the same multiset.
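    The opening definition translates directly into a brute-force check for small instances. The sketch below is hypothetical (the function name `is_completely_mixable` is mine, not from the paper), and its exponential search over column permutations is only viable for tiny matrices, consistent with the NP-completeness result stated in the abstract:

    ```python
    from itertools import permutations

    def is_completely_mixable(matrix):
        """Check whether the columns of `matrix` (a list of rows) can be
        permuted so that all row sums are equal.  Brute force: fixes the
        first column and tries every permutation of each remaining column,
        so the cost grows factorially with the number of rows."""
        if not matrix:
            return True
        n_rows = len(matrix)
        cols = list(zip(*matrix))  # column-wise view of the matrix
        # If mixable, every row sum must equal the grand total / n_rows.
        target = sum(sum(col) for col in cols) / n_rows

        def search(col_idx, row_sums):
            if col_idx == len(cols):
                return all(abs(s - target) < 1e-9 for s in row_sums)
            for perm in set(permutations(cols[col_idx])):
                new_sums = [s + v for s, v in zip(row_sums, perm)]
                if search(col_idx + 1, new_sums):
                    return True
            return False

        return search(1, list(cols[0]))
    ```

    For example, the matrix with rows (1, 3), (2, 2), (3, 1) is completely mixable (every row already sums to 4), while rows (0, 0), (0, 1) are not, since the single entry 1 makes equal row sums impossible.
    
    
    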

    Fast rates in statistical and online learning

    The speed with which a learning algorithm converges as it is presented with more data is a central problem in machine learning: a fast rate of convergence means that less data is needed for the same level of performance. The pursuit of fast rates in online and statistical learning has led to the discovery of many conditions in learning theory under which fast learning is possible. We show that most of these conditions are special cases of a single, unifying condition that comes in two forms: the central condition for 'proper' learning algorithms that always output a hypothesis in the given model, and stochastic mixability for online algorithms that may make predictions outside of the model. We show that under surprisingly weak assumptions both conditions are, in a certain sense, equivalent. The central condition has a reinterpretation in terms of convexity of a set of pseudoprobabilities, linking it to density estimation under misspecification. For bounded losses, we show how the central condition enables a direct proof of fast rates, and we prove its equivalence to the Bernstein condition, itself a generalization of the Tsybakov margin condition, both of which have played a central role in obtaining fast rates in statistical learning. Yet, while the Bernstein condition is two-sided, the central condition is one-sided, making it more suitable for dealing with unbounded losses. In its stochastic mixability form, our condition generalizes both a stochastic exp-concavity condition identified by Juditsky, Rigollet and Tsybakov and Vovk's notion of mixability. Our unifying conditions thus provide a substantial step towards a characterization of fast rates in statistical learning, similar to how classical mixability characterizes constant regret in the setting of sequential prediction with expert advice. Comment: 69 pages, 3 figures
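    The "classical mixability" of Vovk mentioned at the end of this abstract has a standard formulation, which may help situate the generalization claimed here (the symbols $\eta$, $\ell$, $\pi$, $p$, $y$ below are the conventional ones from the prediction-with-expert-advice literature, not taken from the abstract):

    ```latex
    % A loss function \ell is \eta-mixable for some \eta > 0 if, for every
    % probability distribution \pi over predictions, there exists a single
    % prediction p_\pi such that for every outcome y:
    \[
      \ell(p_\pi, y) \;\le\; -\frac{1}{\eta}
      \ln \mathbb{E}_{f \sim \pi}\!\left[ e^{-\eta\, \ell(f, y)} \right]
    \]
    % i.e. one prediction is at least as good as the "mix loss" of the
    % whole distribution; this is what yields constant regret in the
    % expert-advice setting.
    ```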

    Current Open Questions in Complete Mixability

    Complete and joint mixability have raised considerable interest in recent years, both in the theory of distributions with given margins and in applications in discrete optimization and quantitative risk management. We list various open questions in the theory of complete and joint mixability which are mathematically concrete, yet accessible to a broad range of researchers without specific background knowledge. In addition to the discussions of open questions, some results contained in this paper are new.