Optimal Deterministic Polynomial-Time Data Exchange for Omniscience
We study the problem of constructing a deterministic polynomial-time
algorithm that achieves omniscience, in a rate-optimal manner, among a set of
users that are interested in a common file but each has only partial knowledge
about it as side-information. Assuming that the collective information among
all the users is sufficient to allow the reconstruction of the entire file, the
goal is to minimize the (possibly weighted) amount of bits that these users
need to exchange over a noiseless public channel in order for all of them to
learn the entire file. Using established connections to the multi-terminal
secrecy problem, our algorithm also implies a polynomial-time method for
constructing a maximum size secret shared key in the presence of an
eavesdropper. We consider the following types of side-information settings: (i)
side information in the form of uncoded fragments/packets of the file, where
the users' side-information consists of subsets of the file; (ii) side
information in the form of linearly correlated packets, where the users have
access to linear combinations of the file packets; and (iii) the general
setting where the users' side-information has an arbitrary (i.i.d.)
correlation structure. Building on results from combinatorial optimization, we
provide a polynomial-time algorithm (in the number of users) that first finds
the optimal rate allocations among these users and then determines an explicit
transmission scheme (i.e., a description of which user should transmit what
information) for cases (i) and (ii).
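For the uncoded-fragment setting (i), the feasible rate allocations are characterized by cut-set conditions: for every proper subset of users, the remaining users must transmit at least as many packets as that subset is collectively missing. A minimal brute-force sketch of these conditions (for illustration only; the paper's algorithm uses combinatorial optimization, not enumeration, and the helper name is hypothetical):

```python
from itertools import chain, combinations, product

def min_sum_rate(side_info, n_packets):
    """Brute-force the cut-set conditions for omniscience: for every
    proper, nonempty user subset S, the users outside S must transmit
    at least as many packets as S is collectively missing."""
    n = len(side_info)
    users = range(n)
    proper_subsets = list(chain.from_iterable(
        combinations(users, r) for r in range(1, n)))
    best = None
    for rates in product(range(n_packets + 1), repeat=n):
        feasible = all(
            sum(rates[i] for i in users if i not in S)
            >= n_packets - len(set().union(*(side_info[i] for i in S)))
            for S in proper_subsets)
        if feasible and (best is None or sum(rates) < sum(best)):
            best = rates
    return best, sum(best)

# Three users, three packets; each user holds two of the three.
rates, total = min_sum_rate([{0, 1}, {1, 2}, {0, 2}], 3)
print(total)  # 2: two integer transmissions suffice
```

In this example every single user is missing one packet, so each pair of the other users must jointly transmit at least one packet, giving an integer optimum of 2.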
Cooperative Data Exchange based on MDS Codes
The cooperative data exchange problem is studied for the fully connected
network. In this problem, each node initially only possesses a subset of the
packets making up the file. Nodes make broadcast transmissions that are
received by all other nodes. The goal is for each node to recover the full
file. In this paper, we present a polynomial-time deterministic algorithm to
compute the optimal (i.e., minimal) number of required broadcast transmissions
and to determine the precise transmissions to be made by the nodes. A
particular feature of our approach is that {\it each} of the
transmissions is a linear combination of {\it exactly} the same fixed number
of packets, and we show how to optimally choose this number. We also show how the
coefficients of these linear combinations can be chosen by leveraging a
connection to Maximum Distance Separable (MDS) codes. Moreover, we show that
our method can be used to solve cooperative data exchange problems with
weighted cost as well as the so-called successive local omniscience problem.
Comment: 21 pages, 1 figure
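The MDS connection can be illustrated with a Vandermonde matrix over a prime field: with distinct evaluation points, every maximal square submatrix is invertible, so any k received combinations suffice to decode the k source packets. A toy verification of this property (not the paper's construction; field size and points are chosen arbitrarily):

```python
from itertools import combinations

def det_mod(rows, p):
    """Determinant of a square matrix over GF(p), by Gaussian elimination."""
    m = [r[:] for r in rows]
    n, det = len(m), 1
    for c in range(n):
        pivot = next((r for r in range(c, n) if m[r][c] % p), None)
        if pivot is None:
            return 0
        if pivot != c:
            m[c], m[pivot] = m[pivot], m[c]
            det = -det
        det = det * m[c][c] % p
        inv = pow(m[c][c], p - 2, p)  # modular inverse via Fermat's little theorem
        for r in range(c + 1, n):
            f = m[r][c] * inv % p
            m[r] = [(m[r][j] - f * m[c][j]) % p for j in range(n)]
    return det % p

p, k, points = 13, 3, [1, 2, 3, 4, 5]        # GF(13), 3 source packets, 5 coded combos
V = [[pow(x, i, p) for x in points] for i in range(k)]  # k x n Vandermonde matrix
# MDS property: any k of the n coded combinations are jointly decodable.
assert all(det_mod([[row[j] for j in cols] for row in V], p)
           for cols in combinations(range(len(points)), k))
print("every", k, "columns invertible -> MDS")
```

The assertion holds because a Vandermonde determinant is a product of differences of the evaluation points, all nonzero modulo 13 here.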
Coded Cooperative Data Exchange for a Secret Key
We consider a coded cooperative data exchange problem with the goal of
generating a secret key. Specifically, we investigate the number of public
transmissions required for a set of clients to agree on a secret key with
probability one, subject to the constraint that it remains private from an
eavesdropper.
Although the problems are closely related, we prove that secret key
generation with the fewest number of linear transmissions is NP-hard, while it is
known that the analogous problem in traditional cooperative data exchange can
be solved in polynomial time. In doing this, we completely characterize the
best possible performance of linear coding schemes, and also prove that linear
codes can be strictly suboptimal. Finally, we extend the single-key results to
characterize the minimum number of public transmissions required to generate a
desired integer number of statistically independent secret keys.
Comment: Full version of a paper that appeared at ISIT 2014. 19 pages, 2
figures
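The interplay between exchange and secrecy shows up already in the simplest two-client toy case (a standard one-time-pad argument, not the paper's general scheme): each client holds one uniformly random packet, a single XOR broadcast attains omniscience, and either packet then serves as a key that is independent of everything the eavesdropper observed.

```python
import secrets

a = secrets.token_bytes(16)                      # client 1's random packet
b = secrets.token_bytes(16)                      # client 2's random packet
broadcast = bytes(x ^ y for x, y in zip(a, b))   # the one public transmission

# Client 2 recovers client 1's packet from the broadcast; symmetrically,
# client 1 recovers b. One transmission achieves omniscience.
a_recovered = bytes(x ^ y for x, y in zip(broadcast, b))
assert a_recovered == a

# Take the key to be packet a: since b is uniform and independent of a,
# the broadcast a XOR b reveals nothing about a (one-time-pad argument).
key = a
```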
A Practical Approach for Successive Omniscience
The system that we study in this paper contains a set of users that observe a
discrete memoryless multiple source and communicate via noise-free channels
with the aim of attaining omniscience, the state that all users recover the
entire multiple source. We adopt the concept of successive omniscience (SO),
i.e., letting the local omniscience in some user subset be attained before the
global omniscience in the entire system, and consider the problem of how to
efficiently attain omniscience in a successive manner. Based on the existing
results on SO, we propose a CompSetSO algorithm for determining a complementary
set: a user subset in which local omniscience can be attained first without
increasing the sum-rate (the total amount of communication) required for global
omniscience. We also derive a sufficient condition for a user subset to be
complementary, so that running the CompSetSO algorithm only requires a lower
bound, instead of the exact value, of the minimum sum-rate for attaining global
omniscience. The CompSetSO algorithm returns a complementary user subset in
polynomial time. We show by example how to recursively apply the CompSetSO
algorithm so that global omniscience can be attained by multiple stages of SO.
On the Optimality of Secret Key Agreement via Omniscience
For the multiterminal secret key agreement problem under a private source
model, it is known that the maximum key rate, i.e., the secrecy capacity, can
be achieved through communication for omniscience, but the omniscience strategy
can be strictly suboptimal in terms of minimizing the public discussion rate.
While a single-letter characterization is not known for the minimum discussion
rate needed for achieving the secrecy capacity, we derive single-letter lower
and upper bounds that yield some simple conditions for omniscience to be
discussion-rate optimal. These conditions turn out to be enough to deduce the
optimality of omniscience for a large class of sources including the
hypergraphical sources. Through conjectures and examples, we explore other
source models to which our methods do not easily extend.
A Monetary Mechanism for Stabilizing Cooperative Data Exchange with Selfish Users
In this research, we address the stability issues in Cooperative Data Exchange (CDE),
one of the central problems in wireless network coding. We consider a setting in which the
users are selfish, i.e., would like to maximize their own utility. More specifically, we
consider a setting where each user has a subset of packets in the ground set X, and wants
all other packets in X. The users can exchange data by broadcasting coded or uncoded
packets over a lossless channel, and monetary transactions are allowed between any pair
of users. We define the utility of each user as the sum of two sub-utility functions: (i)
the difference between the total payment received by the user and the total transmission
rate of the user, and (ii) the difference between the total number of required packets by
the user and the total payment made by the user. A rate-vector and payment-matrix
pair (r, p) is said to stabilize the grand coalition (i.e., the set of all users) if (r, p) is
Pareto-optimal over all minor coalitions (i.e., all proper subsets of users who collectively
know all packets in X). Our goal is to design algorithms that compute a stabilizing
rate-payment pair with minimum total sum-rate and minimum total sum-payment for any
given instance of the problem. In this work, we propose two algorithms that maximize the
sum of utility of all users (over all solutions), and one of the algorithms also maximizes
the minimum utility among all users (over all solutions). The second algorithm requires a
broker, where each user has to trust the broker and use the broker to exchange payments,
whereas in the first algorithm there is no such requirement. In the first algorithm, the users
directly compensate the user broadcasting the packet in that particular round. Our scheme
minimizes the total number of transmitted packets, as well as the total amount of
payments. We also perform an extensive simulation study to evaluate the performance of
our scheme in a practical setting.
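The two sub-utilities defined above can be written out directly (a minimal sketch with hypothetical variable names: r[i] is user i's transmission rate, pay[i][j] the payment from user i to user j, and missing[i] the number of packets user i still requires):

```python
def utility(i, r, pay, missing):
    """Utility of user i per the definition above: (payments received
    minus transmission rate) plus (packets required minus payments made)."""
    n = len(r)
    received = sum(pay[j][i] for j in range(n) if j != i)
    made = sum(pay[i][j] for j in range(n) if j != i)
    return (received - r[i]) + (missing[i] - made)

# Two users: user 0 transmits 2 packets and is paid 2 by user 1,
# who required those 2 packets and pays for them.
r = [2, 0]
pay = [[0, 0], [2, 0]]
missing = [0, 2]
print(utility(0, r, pay, missing))  # (2 - 2) + (0 - 0) = 0
print(utility(1, r, pay, missing))  # (0 - 0) + (2 - 2) = 0
```

In this balanced example each user's payment exactly offsets its cost, so both utilities are zero; a stabilizing pair must keep every minor coalition from improving on such an outcome.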
Why Philosophers Should Care About Computational Complexity
One might think that, once we know something is computable, how efficiently
it can be computed is a practical question with little further philosophical
importance. In this essay, I offer a detailed case that one would be wrong. In
particular, I argue that computational complexity theory---the field that
studies the resources (such as time, space, and randomness) needed to solve
computational problems---leads to new perspectives on the nature of
mathematical knowledge, the strong AI debate, computationalism, the problem of
logical omniscience, Hume's problem of induction, Goodman's grue riddle, the
foundations of quantum mechanics, economic rationality, closed timelike curves,
and several other topics of philosophical interest. I end by discussing aspects
of complexity theory itself that could benefit from philosophical analysis.Comment: 58 pages, to appear in "Computability: G\"odel, Turing, Church, and
beyond," MIT Press, 2012. Some minor clarifications and corrections; new
references added
Efficient Algorithms for the Data Exchange Problem
In this paper we study the data exchange problem where a set of users is
interested in gaining access to a common file, but where each has only partial
knowledge about it as side-information. Assuming that the file is broken into
packets, the side-information considered is in the form of linear combinations
of the file packets. Given that the collective information of all the users is
sufficient to allow recovery of the entire file, the goal is for each user to
gain access to the file while minimizing some communication cost. We assume
that users can communicate over a noiseless broadcast channel, and that the
communication cost is a sum of each user's cost function over the number of
bits it transmits. For instance, the communication cost could simply be the
total number of bits that needs to be transmitted. In the most general case
studied in this paper, each user can have any arbitrary convex cost function.
We provide deterministic, polynomial-time algorithms (in the number of users
and packets) which find an optimal communication scheme that minimizes the
communication cost. To further lower the complexity, we also propose a simple
randomized algorithm inspired by our deterministic algorithm which is based on
a random linear network coding scheme.
Comment: submitted to Transactions on Information Theory
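The randomized approach can be sketched as follows: each transmitting user broadcasts a random linear combination (over a prime field) of the packets it knows, and every user attains omniscience once its collected coefficient matrix has full rank. A toy simulation under these assumptions (not the paper's algorithm; user names and rates are illustrative):

```python
import random

def rank_mod(rows, p):
    """Rank of a matrix over GF(p), by Gaussian elimination."""
    m = [r[:] for r in rows]
    rank, ncols = 0, len(m[0])
    for c in range(ncols):
        pivot = next((r for r in range(rank, len(m)) if m[r][c] % p), None)
        if pivot is None:
            continue
        m[rank], m[pivot] = m[pivot], m[rank]
        inv = pow(m[rank][c], p - 2, p)
        for r in range(len(m)):
            if r != rank and m[r][c] % p:
                f = m[r][c] * inv % p
                m[r] = [(m[r][j] - f * m[rank][j]) % p for j in range(ncols)]
        rank += 1
    return rank

p, n_packets = 101, 3
side_info = [{0, 1}, {1, 2}, {0, 2}]   # which unit vectors each user holds
known = [[[1 if j == k else 0 for j in range(n_packets)] for k in s]
         for s in side_info]
random.seed(0)
# Users 0 and 1 each broadcast one random combination of their packets.
for sender in (0, 1):
    coeffs = [random.randrange(1, p) for _ in known[sender]]
    combo = [sum(c * row[j] for c, row in zip(coeffs, known[sender])) % p
             for j in range(n_packets)]
    for receiver in range(3):
        if receiver != sender:
            known[receiver].append(combo)
# Omniscience: every user's coefficient matrix now has full rank.
print(all(rank_mod(rows, p) == n_packets for rows in known))  # True
```

Here two transmissions match the cut-set optimum for this instance, and the nonzero random coefficients make every user's system full rank; over a larger field, random coefficients succeed with high probability.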