Information Theoretic cutting of a cake
Cutting a cake is a metaphor for the problem of dividing a resource (cake)
among several agents. The problem becomes non-trivial when the agents have
different valuations for different parts of the cake (e.g. one agent may like
chocolate while the other may like cream). A fair division of the cake is one
that takes into account the individual valuations of the agents and partitions
the cake based on some fairness criterion. Fair division may be accomplished
in a distributed or centralized way. Due to its natural and practical appeal,
it has been a subject of study in economics. To the best of our knowledge, the
role of partial information in fair division has not so far been studied from
an information-theoretic perspective. In this paper we study two important
algorithms in fair division, namely "divide and choose" and "adjusted winner",
for the case of two agents. We quantify the benefit of negotiation in the
divide and choose algorithm, and its use in tricking the adjusted winner
algorithm. We also analyze the role of implicit information transmission
through actions in the repeated divide and choose problem by finding a
trembling-hand perfect equilibrium for a specific setup. Lastly, we consider a
centralized algorithm for maximizing the overall welfare of the agents under
the Nash collective utility function (CUF). This corresponds to a clustering
problem of the type traditionally studied in data mining and machine learning.
Drawing a conceptual link between this problem and the portfolio selection
problem in stock markets, we prove an upper bound on the increase of the Nash
CUF under a clustering refinement.

Comment: Submitted to IEEE Transactions on Information Theory
Problems on Large Sparse Graphs
In this thesis, we study two categories of problems involving large sparse graphs: the compression of graphical data, and load balancing in networks. We approach both through the framework of local weak convergence, also known as the objective method. This framework provides a viewpoint that makes it possible to define a notion of stationary stochastic processes for sparse graphs.

Employing the local weak convergence framework, we introduce a notion of entropy for probability distributions on rooted graphs. This generalizes the notion of entropy introduced by Bordenave and Caputo to graphs that carry marks on their vertices and edges; such marks can represent information found in real-world data. This notion of entropy can be considered a natural counterpart of the Shannon entropy rate in the world of sparse graphical data. We illustrate this by introducing a universal compression scheme for sparse marked graphs. Furthermore, we study distributed compression of graphical data; in particular, we introduce a version of the Slepian-Wolf theorem for sparse marked graphs.

In addition to compression, we study the problem of load balancing in networks. We model the problem as a hypergraph in which each hyperedge represents a task carrying one unit of load and each vertex represents a server; an allocation is a way of distributing this load. We study balanced allocations, which are, roughly speaking, those allocations in which no task desires to change its allocation. Employing an extension of the local weak convergence theory to hypergraphs, we study certain asymptotic behaviors of balanced allocations, such as the asymptotic empirical load distribution at a typical server, as well as the asymptotics of the maximum load.

The problems studied in this thesis should be considered examples showing the wide-ranging applicability of the local weak convergence theory and of the above-mentioned notion of entropy.
In fact, this framework provides a viewpoint of stationary stochastic processes for sparse marked graphs. The theory of time series is the engine driving an enormous range of applications in areas such as control theory, communications, information theory, and signal processing. It is to be expected that a theory of stationary stochastic processes for combinatorial structures, in particular graphs, would eventually have a similarly wide-ranging impact.
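To make the load-balancing model concrete, the following is a minimal sketch of computing a balanced allocation on a small hypergraph: each task (hyperedge) carries one unit of load that it may split among its servers (vertices), and the sketch iterates best responses, with each task shifting load from its most loaded server to its least loaded one until no task wants to move. The function names, step size, and stopping rule are illustrative assumptions, not the construction used in the thesis.

```python
def balance(tasks, n_servers, rounds=200, step=0.01):
    # alloc[t][s] = load of task t placed on server s; start by splitting
    # each task's unit of load evenly across its servers.
    alloc = [{s: 1.0 / len(edge) for s in edge} for edge in tasks]
    for _ in range(rounds):
        load = [0.0] * n_servers
        for a in alloc:
            for s, x in a.items():
                load[s] += x
        moved = False
        for a in alloc:
            hi = max(a, key=lambda s: load[s])  # task's most loaded server
            lo = min(a, key=lambda s: load[s])  # task's least loaded server
            if load[hi] - load[lo] > step and a[hi] > 0:
                shift = min(step, a[hi])  # move a small amount of load
                a[hi] -= shift
                a[lo] += shift
                load[hi] -= shift
                load[lo] += shift
                moved = True
        if not moved:  # no task wants to deviate: allocation is balanced
            break
    return alloc
```

For instance, with tasks `[(0, 1), (1, 2)]` on three servers, the even initial split overloads the shared server 1; iterating best responses drives the three server loads toward the balanced value of 2/3 each, up to the step-size tolerance.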