
    Joint mixability of some integer matrices

    We study the problem of permuting each column of a given matrix so as to minimize the maximal row sum or maximize the minimal row sum, a problem of interest in probability theory and quantitative finance, where one estimates quantiles of a random variable expressed as the sum of several random variables with unknown dependence structure. If the minimum maximal row sum equals the maximum minimal row sum, the matrix is termed jointly mixable (see, e.g., Haus (2015), Wang and Wang (2015), Wang et al. (2013)). We show that the lack of joint mixability (the joint mixability gap) is not significant, i.e., the gap between the minimum maximal row sum and the maximum minimal row sum is either zero or one, for a class of integer matrices including binary matrices and complete consecutive-integer matrices. For integer matrices whose entries are drawn from a given set of discrete values, we show that the gap can be as large as the difference between the maximal and minimal elements of that set. This result also leads to a polynomial-time approximation algorithm for matrices with restricted domain. Computing the gap for a {0,1,2}-matrix is proved to be equivalent to finding column permutations minimizing the difference between the maximum and minimum row sums. A polynomial-time procedure for computing this optimal difference by solving a maximum flow problem on an appropriate graph is given.
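
    A minimal sketch of the quantity studied in this abstract, assuming a small matrix and a hypothetical helper mixability_gap: it brute-forces all within-column permutations to compute the minimum maximal row sum, the maximum minimal row sum, and hence the joint mixability gap. It is not the paper's algorithm (which covers restricted domains and a polynomial-time maximum-flow formulation).

    from itertools import permutations

    def mixability_gap(matrix):
        # Brute-force sketch: enumerate all within-column permutations of a
        # small integer matrix, tracking the smallest achievable maximal row
        # sum and the largest achievable minimal row sum.  A gap of zero means
        # the matrix is jointly mixable.  Exponential cost: illustration only,
        # not the paper's polynomial procedures.
        columns = list(zip(*matrix))
        n_rows = len(matrix)
        min_max_row_sum = float("inf")
        max_min_row_sum = float("-inf")

        def search(col_idx, row_sums):
            nonlocal min_max_row_sum, max_min_row_sum
            if col_idx == len(columns):
                min_max_row_sum = min(min_max_row_sum, max(row_sums))
                max_min_row_sum = max(max_min_row_sum, min(row_sums))
                return
            for perm in set(permutations(columns[col_idx])):
                search(col_idx + 1, [s + v for s, v in zip(row_sums, perm)])

        search(0, [0] * n_rows)
        return min_max_row_sum - max_min_row_sum

    # A binary example: by the paper's result the gap is zero or one (here it is 1).
    print(mixability_gap([[0, 1, 1],
                          [1, 0, 0],
                          [1, 1, 0]]))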

    Convolution Bounds on Quantile Aggregation

    Quantile aggregation with dependence uncertainty has a long history in probability theory, with wide applications in finance, risk management, statistics, and operations research. Using a recent result on the inf-convolution of quantile-based risk measures, we establish new analytical bounds for quantile aggregation, which we call convolution bounds. Convolution bounds unify every analytical result available on quantile aggregation and contribute further to its theory; in this sense, they are genuinely the best analytical bounds available. Moreover, convolution bounds are easy to compute, and we show that they are sharp in many relevant cases. They enjoy several other advantages, including interpretability of the extremal dependence structure, tractability, and appealing theoretical properties. The results directly lead to bounds on the distribution of the sum of random variables with arbitrary dependence, and we illustrate a few applications in operations research.
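
    The following brute-force sketch is not the paper's convolution bounds; it only gives a concrete numerical picture of the object being bounded, namely the range the p-quantile of a sum can span when only the marginals are known. It discretizes each marginal and enumerates column permutations, exactly the matrix formulation of the first abstract above; the Pareto marginals and the helper name aggregate_quantile_range are illustrative assumptions.

    import math
    from itertools import permutations

    def aggregate_quantile_range(quantile_fns, p, n=6):
        # Discretize each marginal into n equiprobable quantile values (one
        # column per variable) and enumerate column permutations, i.e. the
        # dependence scenarios representable on this grid.  The smallest and
        # largest empirical p-quantile of the row sums give a rough numerical
        # range for the aggregate quantile under dependence uncertainty.
        # Exponential cost: a brute-force illustration, not the analytical
        # convolution bounds of the paper.
        cols = [[qf((i + 0.5) / n) for i in range(n)] for qf in quantile_fns]
        idx = max(0, math.ceil(p * n) - 1)  # index of the empirical p-quantile
        lo, hi = float("inf"), float("-inf")

        def search(k, sums):
            nonlocal lo, hi
            if k == len(cols):
                q = sorted(sums)[idx]
                lo, hi = min(lo, q), max(hi, q)
                return
            for perm in set(permutations(cols[k])):
                search(k + 1, [s + v for s, v in zip(sums, perm)])

        # The first column can stay fixed: relabelling rows together does not
        # change the multiset of row sums.
        search(1, list(cols[0]))
        return lo, hi

    # Two Pareto(2) marginals at the 95% level (an assumed example).
    def pareto_q(u):
        return (1.0 - u) ** -0.5 - 1.0

    print(aggregate_quantile_range([pareto_q, pareto_q], p=0.95))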