
    On the Shannon entropy of the number of vertices with zero in-degree in randomly oriented hypergraphs

    Suppose that you have $n$ colours and $m$ mutually independent dice, each of which has $r$ sides. Each die lands on any of its sides with equal probability. You may colour the sides of each die in any way you wish, but there is one restriction: you are not allowed to use the same colour more than once on the sides of a die. Any other colouring is allowed. Let $X$ be the number of different colours that you see after rolling the dice. How should you colour the sides of the dice in order to maximize the Shannon entropy of $X$? In this article we investigate this question. We show that the entropy of $X$ is at most $\frac{1}{2}\log(n) + \frac{1}{2}\log(\pi e)$ and that the bound is tight, up to a constant additive factor, when there are equally many dice and colours. Our proof employs the differential entropy bound on discrete entropy, along with a lower bound on the entropy of binomial random variables whose outcome is conditioned to be an even integer. We conjecture that the entropy is maximized when the colours are distributed over the sides of the dice as evenly as possible. Comment: 11 pages
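    The setup in the abstract is easy to compute exactly for small parameters. The sketch below (a hypothetical helper, not code from the paper) enumerates all equally likely outcomes of the dice and returns the Shannon entropy of $X$, the number of distinct colours shown:

```python
import itertools
import math

def entropy_of_distinct_colours(dice):
    """Exact Shannon entropy (in bits) of X, the number of distinct
    colours seen when every die lands on a uniformly random side.
    `dice` is a list of dice, each given as a list of pairwise
    distinct colours (one per side)."""
    outcomes = list(itertools.product(*dice))  # all side combinations
    counts = {}
    for outcome in outcomes:
        k = len(set(outcome))  # number of distinct colours shown
        counts[k] = counts.get(k, 0) + 1
    total = len(outcomes)
    # H(X) = -sum_k P(X = k) log2 P(X = k)
    return sum(-(c / total) * math.log2(c / total) for c in counts.values())

# Two colourings with n = 4 colours, m = 2 dice, r = 2 sides:
# disjoint colour sets make X = 2 deterministic (entropy 0 bits),
# while reusing the same pair on both dice makes X uniform on
# {1, 2} (entropy 1 bit).
print(entropy_of_distinct_colours([["a", "b"], ["c", "d"]]))
print(entropy_of_distinct_colours([["a", "b"], ["a", "b"]]))
```

    The brute-force enumeration costs $r^m$ outcomes, so it only illustrates the question for toy parameters; the paper's bounds concern the asymptotic regime.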

    An Optimal Algorithm for Strict Circular Seriation

    We study the problem of circular seriation, where we are given a matrix of pairwise dissimilarities between $n$ objects, and the goal is to find a {\em circular order} of the objects in a manner that is consistent with their dissimilarity. This problem is a generalization of the classical {\em linear seriation} problem, where the goal is to find a {\em linear order}, and for which optimal $\mathcal{O}(n^2)$ algorithms are known. Our contributions can be summarized as follows. First, we introduce {\em circular Robinson matrices} as the natural class of dissimilarity matrices for the circular seriation problem. Second, for the case of {\em strict circular Robinson dissimilarity matrices}, we provide an optimal $\mathcal{O}(n^2)$ algorithm for the circular seriation problem. Finally, we propose a statistical model to analyze the well-posedness of the circular seriation problem for large $n$. In particular, we establish $\mathcal{O}(\log(n)/n)$ rates, in the Kendall-tau metric, on the distance between any circular ordering found by solving the circular seriation problem and the underlying order of the model. Comment: 27 pages, 5 figures
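    The notion of a circular Robinson matrix can be illustrated with the canonical example of points on a circle under arc distance. The sketch below is an assumption-laden simplification, not the paper's definition or algorithm: it treats a matrix as circular Robinson when each row, read circularly starting from the diagonal, rises to a single peak and then falls, and checks that property for an ordered versus a permuted set of angles.

```python
import numpy as np

def circular_arc_dissimilarity(thetas):
    """Pairwise arc distance between angles (radians) on the unit circle."""
    d = np.abs(thetas[:, None] - thetas[None, :])
    return np.minimum(d, 2 * np.pi - d)

def is_circular_robinson(D, tol=1e-12):
    """Hedged unimodality check: each row, read circularly from the
    diagonal entry d(i, i) = 0, must increase to one peak, then
    decrease. (The paper's formal definition may differ in details.)"""
    n = D.shape[0]
    for i in range(n):
        row = np.roll(D[i], -i)  # rotate so the row starts at d(i, i)
        decreasing = False
        for step in np.diff(row):
            if step < -tol:
                decreasing = True
            elif step > tol and decreasing:
                return False  # row rose again after its peak
    return True

angles = np.array([0.0, 1.0, 2.5, 4.0, 5.5])  # already in circular order
D = circular_arc_dissimilarity(angles)
print(is_circular_robinson(D))  # True: order is consistent

perm = np.array([0, 2, 1, 3, 4])  # swap two objects out of order
print(is_circular_robinson(circular_arc_dissimilarity(angles[perm])))  # False
```

    Seriation goes in the opposite direction: given only a matrix like `D` with its rows and columns scrambled, recover a circular order that restores this row-wise unimodal structure, which the paper shows can be done in optimal $\mathcal{O}(n^2)$ time for strict circular Robinson matrices.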