State Information in Bayesian Games
Two-player zero-sum repeated games are well understood. Computing the value
of such a game is straightforward. Additionally, if the payoffs are dependent
on a random state of the game known to one, both, or neither of the players,
the resulting value of the game has been analyzed under the framework of
Bayesian games. This investigation considers the optimal performance in a game
when a helper is transmitting state information to one of the players.
Encoding information for an adversarial setting (game) requires a different
result than rate-distortion theory provides. Game theory has accentuated the
importance of randomization (mixed strategy), which does not find a significant
role in most communication modems and source coding codecs. Higher rates of
communication, used in the right way, allow the message to include the
necessary random component useful in games.
Comment: Presented at Allerton 2009, 6 pages, 5 eps figures, uses IEEEtran.cls
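The abstract notes that computing the value of a two-player zero-sum game, and the optimal mixed strategy achieving it, is straightforward. As a hedged illustration (a toy 2x2 example, not taken from the paper), the closed-form solution for a 2x2 game without a pure-strategy saddle point:

```python
# Closed-form value and optimal mixed strategy for a 2x2 zero-sum game.
# A hypothetical toy sketch, not the paper's method.
def solve_2x2_zero_sum(a, b, c, d):
    """Row player's payoff matrix [[a, b], [c, d]].

    Returns (game value, probability the row player plays row 0).
    Assumes no pure-strategy saddle point, so the denominator is nonzero.
    """
    denom = a + d - b - c
    p = (d - c) / denom              # optimal weight on row 0
    value = (a * d - b * c) / denom  # value of the game to the row player
    return value, p

# Matching pennies: payoffs [[1, -1], [-1, 1]].
value, p = solve_2x2_zero_sum(1, -1, -1, 1)
print(value, p)  # 0.0 0.5 -- uniform randomization, value zero
```

Matching pennies makes the role of randomization concrete: any deterministic strategy is exploitable, and only the uniform mix secures the value.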
Distributed Channel Synthesis
Two familiar notions of correlation are rediscovered as the extreme operating
points for distributed synthesis of a discrete memoryless channel, in which a
stochastic channel output is generated based on a compressed description of the
channel input. Wyner's common information is the minimum description rate
needed. However, when common randomness independent of the input is available,
the necessary description rate reduces to Shannon's mutual information. This
work characterizes the optimal trade-off between the amount of common
randomness used and the required rate of description. We also include a number
of related derivations, including the effect of limited local randomness, rate
requirements for secrecy, applications to game theory, and new insights into
common information duality.
Our proof makes use of a soft covering lemma, known in the literature for its
role in quantifying the resolvability of a channel. The direct proof
(achievability) constructs a feasible joint distribution over all parts of the
system using a soft covering, from which the behavior of the encoder and
decoder is inferred, with no explicit reference to joint typicality or binning.
Of auxiliary interest, this work also generalizes and strengthens this soft
covering tool.
Comment: To appear in IEEE Trans. on Information Theory (submitted Aug. 2012, accepted July 2013), 26 pages, using IEEEtran.cls
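The two extreme operating points can be made numerically concrete. A minimal sketch (hypothetical binary example, not from the paper) computing Shannon's mutual information, the description rate that suffices when unlimited common randomness is available:

```python
import math

def mutual_information(p_x, channel):
    """I(X;Y) in bits, for input pmf p_x and channel rows p(y|x)."""
    p_y = [sum(p_x[x] * channel[x][y] for x in range(len(p_x)))
           for y in range(len(channel[0]))]
    info = 0.0
    for x, px in enumerate(p_x):
        for y, pygx in enumerate(channel[x]):
            if px > 0 and pygx > 0:
                info += px * pygx * math.log2(pygx / p_y[y])
    return info

# Binary symmetric channel with crossover 0.1 and uniform input:
# I(X;Y) = 1 - H(0.1) ~ 0.531 bits.
i_xy = mutual_information([0.5, 0.5], [[0.9, 0.1], [0.1, 0.9]])
```

Wyner's common information for the same pair is at least this value, which is the gap the rate/common-randomness trade-off interpolates across.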
A Stronger Soft-Covering Lemma and Applications
Wyner's soft-covering lemma is a valuable tool for achievability proofs of
information theoretic security, resolvability, channel synthesis, and source
coding. The result herein sharpens the claim of soft-covering by moving away
from an expected value analysis. Instead, a random codebook is shown to achieve
the soft-covering phenomenon with high probability. The probability of failure
is doubly-exponentially small in the block-length, enabling more powerful
applications through the union bound.
Comment: IEEE CNS 2015, 2nd Workshop on Physical-layer Methods for Wireless Security, 4 pages
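The power of the doubly-exponential decay lies in how it interacts with the union bound. As a sketch in generic notation (the symbols $R$ and $\gamma$ are illustrative, not necessarily the paper's): with at most $2^{nR}$ codebook-related events, each failing with probability at most $2^{-2^{n\gamma}}$,

```latex
\Pr\Bigl(\,\bigcup_{i \le 2^{nR}} \mathcal{E}_i\Bigr)
  \;\le\; 2^{nR} \cdot 2^{-2^{n\gamma}}
  \;\longrightarrow\; 0 \quad (n \to \infty),
```

so exponentially many soft-covering events hold simultaneously with high probability, which an expected-value analysis alone cannot deliver.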
Optimal Equivocation in Secrecy Systems: a Special Case of Distortion-based Characterization
Recent work characterizing the optimal performance of secrecy systems has
made use of a distortion-like metric for partial secrecy as a replacement for
the more traditional metric of equivocation. In this work we use the log-loss
function to show that the optimal performance limits characterized by
equivocation are, in fact, special cases of distortion-based counterparts. This
observation illuminates why equivocation does not tell the whole story of
secrecy. It also justifies the causal-disclosure framework for secrecy (past
source symbols and actions revealed to the eavesdropper).
Comment: Invited to ITA 2013, 3 pages, no figures, using IEEEtran.cls
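The log-loss connection can be checked numerically. A minimal sketch (toy joint distribution with illustrative numbers, not from the paper): when the eavesdropper's reconstruction is a distribution q over the source alphabet and distortion is log-loss d(x, q) = -log2 q(x), the best achievable expected distortion given the observation Z equals the equivocation H(X|Z):

```python
import math

# Toy joint pmf p(x, z) over binary X and Z (hypothetical numbers).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

p_z = {z: sum(p for (x, zz), p in joint.items() if zz == z) for z in (0, 1)}

# Equivocation H(X|Z).
h_x_given_z = -sum(p * math.log2(p / p_z[z]) for (x, z), p in joint.items())

# Expected log-loss when the eavesdropper reports the posterior p(x|z),
# which is the minimizer of expected log-loss.
exp_log_loss = sum(p * -math.log2(joint[(x, z)] / p_z[z])
                   for (x, z), p in joint.items())

# The two sums coincide term by term, which is why equivocation is a
# special case of distortion under the log-loss function.
```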
Rate-Distortion Theory for Secrecy Systems
Secrecy in communication systems is measured herein by the distortion that an
adversary incurs. The transmitter and receiver share secret key, which they use
to encrypt communication and ensure distortion at an adversary. A model is
considered in which an adversary not only intercepts the communication from the
transmitter to the receiver, but also potentially has side information.
Specifically, the adversary may have causal or noncausal access to a signal
that is correlated with the source sequence or the receiver's reconstruction
sequence. The main contribution is the characterization of the optimal tradeoff
among communication rate, secret key rate, distortion at the adversary, and
distortion at the legitimate receiver. It is demonstrated that causal side
information at the adversary plays a pivotal role in this tradeoff. It is also
shown that measures of secrecy based on normalized equivocation are a special
case of the framework.
Comment: Updated version, to appear in IEEE Transactions on Information Theory
Gaussian Secure Source Coding and Wyner's Common Information
We study secure source-coding with causal disclosure, under the Gaussian
distribution. The optimality of Gaussian auxiliary random variables is shown in
various scenarios. We explicitly characterize the tradeoff between the rates of
communication and secret key. This tradeoff is the result of a mutual
information optimization under Markov constraints. As a corollary, we deduce a
general formula for Wyner's Common Information in the Gaussian setting.
Comment: ISIT 2015, 5 pages, uses IEEEtran.cls
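For a single bivariate Gaussian pair the two quantities have well-known closed forms, which makes the gap between them easy to see. A hedged sketch (standard bivariate formulas, with illustrative function names; the paper's general result covers vector sources): Wyner's common information is C(X;Y) = (1/2) log2((1+|rho|)/(1-|rho|)), versus mutual information I(X;Y) = -(1/2) log2(1 - rho^2):

```python
import math

def gaussian_mutual_information(rho):
    """I(X;Y) in bits for a bivariate Gaussian with correlation rho."""
    return -0.5 * math.log2(1 - rho ** 2)

def gaussian_wyner_common_information(rho):
    """Wyner's common information in bits for the same Gaussian pair."""
    r = abs(rho)
    return 0.5 * math.log2((1 + r) / (1 - r))

rho = 0.5
mi = gaussian_mutual_information(rho)        # ~0.208 bits
ci = gaussian_wyner_common_information(rho)  # ~0.792 bits
# Common information always dominates mutual information.
```

The strict gap for every 0 < |rho| < 1 illustrates why distributed synthesis without common randomness costs more rate than Shannon's mutual information.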
Secure Cascade Channel Synthesis
We consider the problem of generating correlated random variables in a
distributed fashion, where communication is constrained to a cascade network.
The first node in the cascade observes an i.i.d. sequence locally before
initiating communication along the cascade. All nodes share bits of common
randomness that are independent of the source sequence. We consider secure
synthesis: random
variables produced by the system appear to be appropriately correlated and
i.i.d. even to an eavesdropper who is cognizant of the communication
transmissions. We characterize the optimal tradeoff between the amount of
common randomness used and the required rates of communication. We find that
not only does common randomness help, its usage exceeds the communication rate
requirements. The most efficient scheme is based on a superposition codebook,
with the first node selecting messages for all downstream nodes. We also
provide a fleeting view of related problems, demonstrating how the optimal rate
region may shrink or expand.
Comment: Submitted to IEEE Transactions on Information Theory