Secrecy Through Synchronization Errors
In this paper, we propose a transmission scheme that achieves
information-theoretic security without making assumptions about the
eavesdropper's channel.
This is achieved by a transmitter that deliberately introduces synchronization
errors (insertions and/or deletions) based on a shared source of randomness.
The intended receiver, having access to the same shared source of randomness as
the transmitter, can resynchronize the received sequence. On the other hand,
the eavesdropper's channel remains a synchronization error channel. We prove a
secrecy capacity theorem, provide a lower bound on the secrecy capacity, and
propose numerical methods to evaluate it.
Comment: 5 pages, 6 figures, submitted to ISIT 201
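As a toy illustration of the idea (my own sketch with made-up parameters, not the paper's actual coding scheme, and covering only insertions), shared randomness lets the legitimate receiver strip deliberately inserted symbols, while an eavesdropper without the seed faces a genuine insertion channel:

```python
import random

def transmit(bits, seed, p_insert=0.3):
    """Insert chaff bits at positions drawn from a PRNG seeded by the shared randomness."""
    rng = random.Random(seed)
    out = []
    for b in bits:
        # Possibly insert one or more chaff bits before each data bit.
        while rng.random() < p_insert:
            out.append(rng.getrandbits(1))  # chaff value also drawn from the shared PRNG
        out.append(b)
    return out

def resync(received, seed, p_insert=0.3):
    """Replay the same PRNG stream to locate and strip the chaff bits."""
    rng = random.Random(seed)
    out, i = [], 0
    while i < len(received):
        while rng.random() < p_insert:
            rng.getrandbits(1)  # consume the chaff draw to keep the PRNG in sync
            i += 1              # skip the chaff position
        out.append(received[i])
        i += 1
    return out

message = [1, 0, 1, 1, 0, 0, 1]
recovered = resync(transmit(message, seed=42), seed=42)
```

Because both ends replay identical PRNG calls, the receiver recovers the message exactly; handling deletions, noise, and the actual secrecy analysis is where the paper's machinery comes in.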
Strong Converse and Second-Order Asymptotics of Channel Resolvability
We study the problem of channel resolvability for fixed i.i.d. input
distributions and discrete memoryless channels (DMCs), and derive the strong
converse theorem for DMCs that are not necessarily of full rank. We also
derive the optimal second-order rate under a condition. Furthermore, under the
condition that a DMC has the unique capacity achieving input distribution, we
derive the optimal second-order rate of channel resolvability for the worst
input distribution.
Comment: 7 pages, a shorter version will appear in ISIT 2014, this version
includes the proofs of technical lemmas in appendices
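Channel resolvability can be probed numerically. The sketch below (an illustration with parameters of my choosing, not the paper's construction) draws a random codebook for a binary symmetric channel and computes the unnormalized KL divergence between the induced output distribution and the target i.i.d. product distribution; the divergence shrinks as the codebook grows:

```python
import itertools
import math
import random

def bsc_out(x, y, p):
    """Probability W^n(y|x) of output y given input x over a memoryless BSC(p)."""
    prob = 1.0
    for xi, yi in zip(x, y):
        prob *= p if xi != yi else 1 - p
    return prob

def resolvability_divergence(n, m, p=0.1, p1=0.3, seed=0):
    """D(P_out || Q^n) when the input is a uniform mixture of m random codewords.

    Codewords are i.i.d. Bernoulli(p1)^n; the target Q^n is the i.i.d. output
    distribution induced by that input distribution through the BSC.
    """
    rng = random.Random(seed)
    codebook = [tuple(int(rng.random() < p1) for _ in range(n)) for _ in range(m)]
    q1 = p1 * (1 - p) + (1 - p1) * p  # single-letter probability of output 1
    div = 0.0
    for y in itertools.product((0, 1), repeat=n):
        p_out = sum(bsc_out(x, y, p) for x in codebook) / m
        q = math.prod(q1 if yi else 1 - q1 for yi in y)
        if p_out > 0:
            div += p_out * math.log(p_out / q)
    return div
```

With n = 4, a single codeword leaves a divergence well above zero, while a large codebook (say m = 1024) drives it close to zero, mirroring the resolvability phenomenon the paper analyzes at second order.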
A Stronger Soft-Covering Lemma and Applications
Wyner's soft-covering lemma is a valuable tool for achievability proofs of
information theoretic security, resolvability, channel synthesis, and source
coding. The result herein sharpens the claim of soft-covering by moving away
from an expected value analysis. Instead, a random codebook is shown to achieve
the soft-covering phenomenon with high probability. The probability of failure
is doubly-exponentially small in the block-length, enabling more powerful
applications through the union bound.
Comment: IEEE CNS 2015, 2nd Workshop on Physical-layer Methods for Wireless
Security, 4 pages
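Schematically (my paraphrase of the contrast, with the exact constants and conditions suppressed), the sharpening replaces an expected-value guarantee over the random codebook $\mathcal{C}$ with a concentration guarantee:

```latex
% Classical soft-covering: for rates above I(X;Y), on average over codebooks,
%   E_C [ D( P_{Y^n | C} || Q_Y^{(n)} ) ] --> 0   as n --> infinity.
% Stronger form: the decay holds for all but a doubly-exponentially small
% fraction of codebooks, for some constants gamma, beta > 0:
\mathbb{P}_{\mathcal{C}}\!\left[\,
  D\!\left(P_{Y^n\mid\mathcal{C}} \,\middle\|\, Q_Y^{\otimes n}\right) > e^{-\gamma n}
\right] \le e^{-e^{\beta n}}
```

The doubly-exponential failure probability is what survives a union bound over exponentially many events, which is exactly the leverage the abstract points to.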
MAC Resolvability: First and Second Order Results
Building upon previous work on the relation between secrecy and channel
resolvability, we revisit a secrecy proof for the multiple-access channel from
the perspective of resolvability. We then refine the approach in order to
obtain some novel results on the second-order achievable rates.
Comment: Slightly extended version of the paper accepted at the 4th Workshop
on Physical-Layer Methods for Wireless Security during IEEE CNS 2017. v2:
Fixed typos and extended literature section in accordance with reviewers'
recommendations
Resolvability on Continuous Alphabets
We characterize the resolvability region for a large class of point-to-point
channels with continuous alphabets. In our direct result, we not only prove the
existence of good resolvability codebooks, but also adapt an approach based on
the Chernoff-Hoeffding bound to the continuous case, showing that the
probability of drawing an unsuitable codebook is doubly exponentially small. For the converse
part, we show that our previous elementary result carries over easily to the
continuous case under a mild continuity assumption.
Comment: v2: Corrected inaccuracies in proof of direct part. Statement of
Theorem 3 slightly adapted; other results unchanged. v3: Extended version of
camera-ready version submitted to ISIT 201
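The Chernoff-Hoeffding concentration at the heart of the direct part can be illustrated empirically (a generic demo with parameters of my choosing, not the paper's continuous-alphabet argument): the probability that an empirical mean of bounded i.i.d. draws deviates from its expectation decays exponentially in the number of samples:

```python
import math
import random

def deviation_prob(n, p=0.5, eps=0.1, trials=2000, seed=1):
    """Monte Carlo estimate of P(|empirical mean - p| > eps) over n Bernoulli(p) draws."""
    rng = random.Random(seed)
    bad = 0
    for _ in range(trials):
        mean = sum(rng.random() < p for _ in range(n)) / n
        if abs(mean - p) > eps:
            bad += 1
    return bad / trials

def hoeffding_bound(n, eps):
    """Two-sided Hoeffding bound 2*exp(-2*n*eps^2) for [0, 1]-valued variables."""
    return 2.0 * math.exp(-2.0 * n * eps * eps)
```

Doubling n squares the bound's exponential factor; applying such a bound to each of exponentially many codebook events, then taking a union bound, is what yields the doubly-exponentially small failure probability mentioned in the abstract.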
Strong Converse for a Degraded Wiretap Channel via Active Hypothesis Testing
We establish an upper bound on the rate of codes for a wiretap channel with
public feedback for a fixed probability of error and secrecy parameter. As a
corollary, we obtain a strong converse for the capacity of a degraded wiretap
channel with public feedback. Our converse proof is based on a reduction of
active hypothesis testing for discriminating between two channels to coding
for a wiretap channel with feedback.
Comment: This paper was presented at Allerton 201
Informational Divergence Approximations to Product Distributions
The minimum rate needed to accurately approximate a product distribution
based on an unnormalized informational divergence is shown to be a mutual
information. This result subsumes results of Wyner on common information and
Han-Verd\'{u} on resolvability. The result also extends to cases where the
source distribution is unknown but the entropy is known.
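In symbols (a schematic restatement of the claimed characterization; the precise conditions and the unknown-source extension are in the paper): driving a channel $P_{Y|X}$ with a rate-$R$ codebook can make the unnormalized divergence to a target product distribution vanish exactly when the rate exceeds a mutual information:

```latex
% Unnormalized (not per-letter) divergence criterion for approximating
% the product target Q_Y^{(n)} at the channel output:
D\!\left(P_{Y^n} \,\middle\|\, \textstyle\prod_{i=1}^{n} Q_Y\right)
\longrightarrow 0
\quad\text{is achievable iff}\quad
R > \min_{P_X \,:\, P_X P_{Y|X} = Q_Y} I(X;Y).
```

Specializing the channel recovers the Wyner common-information and Han-Verdu resolvability settings that the abstract says are subsumed.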
On Channel Resolvability in Presence of Feedback
We study the problem of generating an approximately i.i.d. string at the
output of a discrete memoryless channel using a limited amount of randomness at
its input, in the presence of causal noiseless feedback. Feedback does not decrease
the channel resolution, the minimum entropy rate required to achieve an
accurate approximation of an i.i.d. output string. However, we show that, at
least over a binary symmetric channel, a significantly larger resolvability
exponent (the exponential decay rate of the divergence between the output
distribution and product measure), compared to the best known achievable
resolvability exponent in a system without feedback, is possible. We show
that, by employing a variable-length resolvability scheme that uses an average
number of coin flips per channel use, the average divergence between the
distribution of the output sequence and the product measure decays
exponentially fast in the average length of the output sequence, with an
exponent governed by the mutual information developed across the channel.
Comment: 8 pages, 4 figures; to be presented at the 54th Annual Allerton
Conference on Communication, Control, and Computing