
    On the Shannon Cipher System With a Wiretapper Guessing Subject to Distortion and Reliability Requirements

    In this paper we discuss the Shannon cipher system with a discrete memoryless source and a guessing wiretapper. The wiretapper observes a cryptogram of an N-vector of ciphered messages in the public channel and tries to successively guess the vector of messages within a given distortion level Δ and with probability of error less than exp{−NE} for a positive reliability index E. The security of the system is measured by the expected number of guesses the wiretapper needs for approximate reconstruction of the vector of source messages. The distortion and reliability criteria, together with the possibility of upper-bounding the number of guesses, extend the approach studied by Merhav and Arikan. A single-letter characterization is given for the region of pairs (R_L, R) (the rate R_L of the maximum number of guesses L(N) and the rate R of the average number of guesses) in dependence on the key rate R_K, distortion level Δ and reliability E.
    Comment: 14 pages, 3 figures, Submitted to IEEE Transactions on Information Theory
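    As a minimal sketch of the guessing framework this work extends (Arikan's guessing exponent for lossless, unconstrained guessing, not the paper's distortion/reliability setting, and with illustrative parameter values of my choosing): an optimal guesser queries source sequences in decreasing order of probability, and for a memoryless source the expected number of guesses grows at most like 2^{N·H_{1/2}}, where H_{1/2} is the Rényi entropy of order 1/2.

```python
import itertools
import math

def expected_guesses(p, N):
    """Expected number of guesses for an optimal guesser that queries
    length-N Bernoulli(p) sequences in decreasing order of probability."""
    seqs = itertools.product([0, 1], repeat=N)
    probs = sorted((p**sum(s) * (1 - p)**(N - sum(s)) for s in seqs),
                   reverse=True)
    # E[G] = sum over sequences of (rank of the sequence) * (its probability)
    return sum(rank * q for rank, q in enumerate(probs, start=1))

def renyi_half(p):
    """Renyi entropy of order 1/2 (in bits) of a Bernoulli(p) source."""
    return 2 * math.log2(math.sqrt(p) + math.sqrt(1 - p))

p, N = 0.1, 12
eg = expected_guesses(p, N)
rate = math.log2(eg) / N      # empirical guessing rate in bits per symbol
print(rate, renyi_half(p))    # the rate is bounded by H_{1/2}(p)
```

    The upper bound E[G] ≤ 2^{N·H_{1/2}} follows from bounding each sequence's rank by a sum of likelihood ratios; the paper's setting additionally constrains distortion, error probability, and the maximum number of guesses.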

    General formulas for fixed-length quantum entanglement concentration

    General formulas for entanglement concentration are derived using an information-spectrum approach, both for i.i.d. sequences and for general sequences of partially entangled pure states. That is, we derive general relations between the performance of entanglement concentration and the eigenvalues of the partially traced state. The achievable rates under constant constraints and under exponential constraints can be calculated from these formulas.
    Comment: This paper has been revised because the previous version contained a mistake
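    The role of the eigenvalues of the partially traced state can be seen in the simplest case, sketched below with an illustrative state of my choosing: for a bipartite pure state, the eigenvalues of the reduced density matrix are the squared Schmidt coefficients, and their Shannon entropy (the entropy of entanglement) is the standard asymptotic i.i.d. concentration rate in ebits per copy.

```python
import numpy as np

lam = 0.8  # illustrative Schmidt weight for a partially entangled state
# Two-qubit pure state |psi> = sqrt(lam)|00> + sqrt(1-lam)|11>
psi = np.zeros(4)
psi[0], psi[3] = np.sqrt(lam), np.sqrt(1 - lam)

# Reshape amplitudes into a 2x2 matrix; the partial trace over system B
# is rho_A = M M^dagger.
M = psi.reshape(2, 2)
rho_A = M @ M.conj().T
eigs = np.linalg.eigvalsh(rho_A)   # squared Schmidt coefficients

# Entropy of entanglement: i.i.d. concentration rate in ebits per copy
entropy = -sum(e * np.log2(e) for e in eigs if e > 1e-12)
print(eigs, entropy)
```

    The information-spectrum formulas in the paper generalize exactly this eigenvalue dependence beyond the i.i.d. case.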

    Lossy Compression via Sparse Linear Regression: Performance under Minimum-distance Encoding

    We study a new class of codes for lossy compression with the squared-error distortion criterion, designed using the statistical framework of high-dimensional linear regression. Codewords are linear combinations of subsets of columns of a design matrix. Called a Sparse Superposition or Sparse Regression codebook, this structure is motivated by an analogous construction proposed recently by Barron and Joseph for communication over an AWGN channel. For i.i.d. Gaussian sources and minimum-distance encoding, we show that such a code can attain the Shannon rate-distortion function with the optimal error exponent, for all distortions below a specified value. It is also shown that sparse regression codes are robust in the following sense: a codebook designed to compress an i.i.d. Gaussian source of variance σ² with (squared-error) distortion D can compress any ergodic source of variance less than σ² to within distortion D. Thus the sparse regression ensemble retains many of the good covering properties of the i.i.d. random Gaussian ensemble, while having a compact representation in terms of a matrix whose size is a low-order polynomial in the block length.
    Comment: This version corrects a typo in the statement of Theorem 2 of the published paper
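    A toy sketch of the codebook structure just described (the dimensions, scaling, and variable names are illustrative choices of mine, far too small for the asymptotic guarantees to apply): the design matrix is split into L sections of M columns each, each codeword selects one column per section, and minimum-distance encoding searches for the codeword closest to the source sequence in squared error.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
L, M, n = 3, 4, 8                  # L sections, M columns each, block length n
A = rng.normal(size=(n, L * M))    # i.i.d. Gaussian design matrix
x = rng.normal(size=n)             # source sequence to compress

best_dist, best_idx = np.inf, None
# Minimum-distance encoding: brute force over all M^L sparse superpositions
for choice in itertools.product(range(M), repeat=L):
    cols = [A[:, sec * M + j] for sec, j in enumerate(choice)]
    codeword = sum(cols) / np.sqrt(L)   # normalization is illustrative
    d = np.mean((x - codeword) ** 2)    # squared-error distortion
    if d < best_dist:
        best_dist, best_idx = d, choice

print(best_idx, best_dist)   # chosen column per section, and its distortion
```

    The compact representation noted in the abstract is visible here: the codebook of M^L codewords is stored implicitly as the n × LM matrix A plus the per-section index list.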

    The Reliability Function of Lossy Source-Channel Coding of Variable-Length Codes with Feedback

    We consider transmission of discrete memoryless sources (DMSes) across discrete memoryless channels (DMCs) using variable-length lossy source-channel codes with feedback. The reliability function (optimum error exponent) is shown to be equal to max{0, B(1 − R(D)/C)}, where R(D) is the rate-distortion function of the source, B is the maximum relative entropy between output distributions of the DMC, and C is the Shannon capacity of the channel. We show that, in this setting and in this asymptotic regime, separate source-channel coding is, in fact, optimal.
    Comment: Accepted to IEEE Transactions on Information Theory in Apr. 201
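    To make the formula concrete, a hedged numeric sketch for one standard instantiation of my choosing (not the paper's example): a Bernoulli(1/2) source under Hamming distortion, so R(D) = 1 − h(D), sent over a BSC(ε), so C = 1 − h(ε), with B taken as the relative entropy between the two conditional output distributions, (1 − 2ε)·log2((1 − ε)/ε), all measured in bits.

```python
import math

def h(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0, 1) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def reliability(D, eps):
    """max{0, B(1 - R(D)/C)} for a Bernoulli(1/2) source with Hamming
    distortion over a BSC(eps); an illustrative instantiation of the formula."""
    RD = max(0.0, 1 - h(D))                          # rate-distortion function
    C = 1 - h(eps)                                   # Shannon capacity of BSC
    B = (1 - 2 * eps) * math.log2((1 - eps) / eps)   # max output relative entropy
    return max(0.0, B * (1 - RD / C))

print(reliability(0.3, 0.1))    # R(D) < C: strictly positive exponent
print(reliability(0.05, 0.1))   # R(D) > C: exponent is 0
```

    The max{0, ·} clipping reflects that when R(D) exceeds C no positive error exponent is achievable, consistent with the separation result stated in the abstract.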