2,526 research outputs found
On the Shannon Cipher System With a Wiretapper Guessing Subject to Distortion and Reliability Requirements
In this paper we discuss the processes in the Shannon cipher system with a
discrete memoryless source and a guessing wiretapper. The wiretapper observes a
cryptogram of an N-vector of ciphered messages in the public channel and tries
to guess successively the vector of messages within a given distortion level Δ
and with a probability of error less than exp(-NE) for a positive
reliability index E. The security of the system is measured by the expected
number of guesses the wiretapper needs for the approximate reconstruction of
the vector of source messages. The distortion and reliability criteria and the
possibility of upper-bounding the number of guesses extend the approach studied
by Merhav and Arikan. A single-letter characterization is given for the region
of achievable pairs of the rate of the maximum number of guesses
and the rate of the average number of guesses, in dependence on the key rate
R_K, the distortion level Δ and the reliability E.
Comment: 14 pages, 3 figures, Submitted to IEEE Transactions on Information
Theory
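The guessing model above can be made concrete with a toy example (ours, not the paper's): for a known message distribution, an optimal wiretapper tries candidates in order of decreasing probability, so the expected number of guesses is the sum of i * p_(i) over the sorted probabilities.

```python
# Toy illustration (not from the paper): expected number of guesses
# when candidates are tried in order of decreasing probability.

def expected_guesses(probs):
    """Expected guess count under the optimal (probability-sorted) order."""
    ordered = sorted(probs, reverse=True)
    return sum(i * p for i, p in enumerate(ordered, start=1))

# A uniform source over 4 messages needs (1 + 2 + 3 + 4) / 4 = 2.5
# guesses on average; a skewed source needs fewer.
print(expected_guesses([0.25] * 4))            # 2.5
print(expected_guesses([0.7, 0.1, 0.1, 0.1]))  # ≈ 1.6
```

The paper's rates describe how such guess counts grow exponentially with the block length N once the distortion and reliability constraints are imposed.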
General formulas for fixed-length quantum entanglement concentration
General formulas for entanglement concentration are derived using an
information-spectrum approach, both for i.i.d. sequences and for general
sequences of partially entangled pure states. That is, we derive general
relations between the performance of entanglement concentration and the
eigenvalues of the partially traced state. The achievable rates with constant
constraints and those with exponential constraints can be calculated from these
formulas.
Comment: This paper was revised because the previous version contained a
mistake.
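As a concrete illustration of the quantities involved (a sketch of our own, not the paper's formulas): for a bipartite pure state, the eigenvalues of the partially traced state are the squared Schmidt coefficients, and in the i.i.d. case their entropy gives the asymptotically achievable concentration rate per copy.

```python
import numpy as np

# Illustrative sketch (ours, not from the paper): eigenvalues of the
# partially traced state of a bipartite pure state, via the SVD of the
# reshaped state vector, and the entropy they determine.

def schmidt_eigenvalues(psi, dim_a, dim_b):
    """Eigenvalues of Tr_B |psi><psi| for a bipartite pure state psi."""
    mat = psi.reshape(dim_a, dim_b)   # Schmidt decomposition via SVD
    return np.linalg.svd(mat, compute_uv=False) ** 2

def entanglement_entropy(eigs):
    """Entropy (in bits) of the reduced-state eigenvalues."""
    eigs = eigs[eigs > 1e-12]
    return float(-np.sum(eigs * np.log2(eigs)))

# |psi> = sqrt(0.8)|00> + sqrt(0.2)|11>, a partially entangled qubit pair
psi = np.zeros(4)
psi[0], psi[3] = np.sqrt(0.8), np.sqrt(0.2)
eigs = schmidt_eigenvalues(psi, 2, 2)
print(np.sort(eigs)[::-1])                     # eigenvalues ≈ [0.8, 0.2]
print(round(entanglement_entropy(eigs), 4))    # ≈ 0.7219 ebits per copy
```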
Lossy Compression via Sparse Linear Regression: Performance under Minimum-distance Encoding
We study a new class of codes for lossy compression with the squared-error
distortion criterion, designed using the statistical framework of
high-dimensional linear regression. Codewords are linear combinations of
subsets of columns of a design matrix. Called a Sparse Superposition or Sparse
Regression codebook, this structure is motivated by an analogous construction
proposed recently by Barron and Joseph for communication over an AWGN channel.
For i.i.d. Gaussian sources and minimum-distance encoding, we show that such a
code can attain the Shannon rate-distortion function with the optimal error
exponent, for all distortions below a specified value. It is also shown that
sparse regression codes are robust in the following sense: a codebook designed
to compress an i.i.d. Gaussian source of variance σ² with
(squared-error) distortion D can compress any ergodic source of variance less
than σ² to within distortion D. Thus the sparse regression ensemble
retains many of the good covering properties of the i.i.d. random Gaussian
ensemble, while having a compact representation in terms of a matrix
whose size is a low-order polynomial in the block length.
Comment: This version corrects a typo in the statement of Theorem 2 of the
published paper.
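The codebook structure described above can be sketched in a few lines (the parameter choices and column scaling are ours, for illustration only): the design matrix has L sections of M columns each, and a codeword is the sum of one column from each section, so the codebook has M^L entries while the matrix itself stays polynomial in the block length.

```python
import numpy as np

# Illustrative sketch of a sparse-regression codebook (section sizes
# and scaling are our own choices, not the paper's): each codeword is
# a sum of L columns of A, one chosen per section of M columns.

rng = np.random.default_rng(0)
n, L, M = 64, 8, 16                                # block length, sections, section size
A = rng.standard_normal((n, L * M)) / np.sqrt(L)   # i.i.d. Gaussian design matrix

def codeword(indices):
    """Codeword for one chosen column index (0..M-1) per section."""
    assert len(indices) == L
    cols = [sec * M + j for sec, j in enumerate(indices)]
    return A[:, cols].sum(axis=1)

x = codeword([3, 0, 7, 15, 2, 9, 11, 5])
print(x.shape)   # (64,)
```

Minimum-distance encoding would search these M^L codewords for the one closest to the source sequence; the point of the construction is that the codebook is described by the n-by-LM matrix A rather than stored explicitly.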
The Reliability Function of Lossy Source-Channel Coding of Variable-Length Codes with Feedback
We consider transmission of discrete memoryless sources (DMSes) across
discrete memoryless channels (DMCs) using variable-length lossy source-channel
codes with feedback. The reliability function (optimum error exponent) is shown
to be equal to C1 (1 - R(D)/C), where R(D) is the rate-distortion
function of the source, C1 is the maximum relative entropy between output
distributions of the DMC, and C is the Shannon capacity of the channel. We
show that, in this setting and in this asymptotic regime, separate
source-channel coding is, in fact, optimal.
Comment: Accepted to IEEE Transactions on Information Theory in Apr. 201
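A quick numeric sanity check of the exponent's ingredients, assuming the separation form E = C1 * (1 - R(D)/C) stated above (the channel and source choices are ours): a binary symmetric channel BSC(q) has C = 1 - h(q) and C1 = (1 - 2q) log2((1 - q)/q), and a Bernoulli(1/2) source under Hamming distortion has R(D) = 1 - h(D).

```python
import numpy as np

# Illustrative check (our own example system, not from the paper):
# Bernoulli(1/2) source with Hamming distortion over a BSC(q),
# plugged into E = C1 * (1 - R(D)/C).

def h(p):
    """Binary entropy in bits."""
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

q, D = 0.02, 0.1
C = 1 - h(q)                              # Shannon capacity of BSC(q)
C1 = (1 - 2 * q) * np.log2((1 - q) / q)   # max divergence between output laws
RD = 1 - h(D)                             # rate-distortion function at D
E = C1 * (1 - RD / C)                     # reliability function
print(round(C, 4), round(float(C1), 4), round(float(E), 4))
```

The exponent is positive exactly when R(D) < C, i.e. when the source is transmissible within distortion D, consistent with the optimality of separation noted in the abstract.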