
    Perfect Omniscience, Perfect Secrecy and Steiner Tree Packing

    We consider perfect secret key generation for a "pairwise independent network" model in which every pair of terminals share a random binary string, with the strings shared by distinct terminal pairs being mutually independent. The terminals are then allowed to communicate interactively over a public noiseless channel of unlimited capacity. All the terminals as well as an eavesdropper observe this communication. The objective is to generate a perfect secret key shared by a given set of terminals at the largest rate possible, and concealed from the eavesdropper. First, we show how the notion of perfect omniscience plays a central role in characterizing perfect secret key capacity. Second, a multigraph representation of the underlying secrecy model leads us to an efficient algorithm for perfect secret key generation based on maximal Steiner tree packing. This algorithm attains capacity when all the terminals seek to share a key, and, in general, attains at least half the capacity. Third, when a single "helper" terminal assists the remaining "user" terminals in generating a perfect secret key, we give necessary and sufficient conditions for the optimality of the algorithm; also, a "weak" helper is shown to be sufficient for optimality.
    Comment: accepted to the IEEE Transactions on Information Theory
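    In the multigraph picture above, each pair of terminals contributes parallel edges, one per shared random bit, and key bits are extracted by packing edge-disjoint Steiner trees that connect the terminals seeking the key. The following is an illustration only: a greedy heuristic sketch, not the paper's algorithm, with all function and variable names hypothetical. It repeatedly extracts a tree connecting the designated terminals and consumes the parallel edges it uses.

    from collections import defaultdict, deque

    def greedy_steiner_packing(edge_mult, terminals):
        """Greedily pack edge-disjoint Steiner trees connecting `terminals`.

        edge_mult: dict mapping frozenset({u, v}) -> number of parallel edges
                   (each parallel edge models one shared random bit).
        terminals: set of vertices that every extracted tree must connect.
        Returns a list of trees, each given as a list of (u, v) edges.
        Illustrative greedy heuristic only; it need not reach the optimum packing.
        """
        if len(terminals) < 2:
            return []
        mult = dict(edge_mult)
        trees = []
        while True:
            # Adjacency over edges that still have unused parallel copies.
            adj = defaultdict(list)
            for e, m in mult.items():
                if m > 0:
                    u, v = tuple(e)
                    adj[u].append(v)
                    adj[v].append(u)
            # BFS from one terminal to get a spanning tree of its component.
            root = next(iter(terminals))
            parent, queue = {root: None}, deque([root])
            while queue:
                u = queue.popleft()
                for v in adj[u]:
                    if v not in parent:
                        parent[v] = u
                        queue.append(v)
            if not terminals <= parent.keys():
                break  # some required terminal is no longer reachable
            # Prune the BFS tree to a Steiner tree: drop non-terminal leaves.
            children = defaultdict(set)
            for v, p in parent.items():
                if p is not None:
                    children[p].add(v)
            keep = set(parent)
            pruning = True
            while pruning:
                pruning = False
                for v in list(keep):
                    if v not in terminals and not (children[v] & keep):
                        keep.discard(v)
                        pruning = True
            tree = [(parent[v], v) for v in keep if parent[v] is not None]
            for u, v in tree:  # consume one parallel edge per tree edge
                mult[frozenset((u, v))] -= 1
            trees.append(tree)
        return trees

    # Tiny example: a triangle with two parallel edges between each pair of terminals.
    pin = {frozenset(e): 2 for e in [(1, 2), (2, 3), (1, 3)]}
    print(greedy_steiner_packing(pin, {1, 2, 3}))

    Per the abstract, tree packing of this kind attains the perfect secret key capacity when every terminal seeks the key, and at least half the capacity in general; the greedy sketch above makes no such guarantee.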

    How Many Queries Will Resolve Common Randomness?

    A set of m terminals, observing correlated signals, communicate interactively to generate common randomness for a given subset of them. Knowing only the communication, how many direct queries of the value of the common randomness will resolve it? A general upper bound, valid for arbitrary signal alphabets, is developed for the number of such queries by using a query strategy that applies to all common randomness and associated communication. When the underlying signals are independent and identically distributed repetitions of m correlated random variables, the number of queries can be exponential in signal length. For this case, the mentioned upper bound is tight and leads to a single-letter formula for the largest query exponent, which coincides with the secret key capacity of a corresponding multiterminal source model. In fact, the upper bound constitutes a strong converse for the optimum query exponent, and implies also a new strong converse for secret key capacity. A key tool, estimating the size of a large probability set in terms of Rényi entropy, is interpreted separately, too, as a lossless block coding result for general sources. As a particularization, it yields the classic result for a discrete memoryless source.
    Comment: Accepted for publication in IEEE Transactions on Information Theory
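    For reference, the Rényi entropy mentioned above is the standard order-α quantity; for a distribution P on a finite alphabet X it is defined (this is the textbook definition, not the paper's specific bound) by

    H_\alpha(P) \;=\; \frac{1}{1-\alpha} \, \log \sum_{x \in \mathcal{X}} P(x)^{\alpha}, \qquad \alpha \in (0,1) \cup (1,\infty),

    and it recovers the Shannon entropy in the limit:

    \lim_{\alpha \to 1} H_\alpha(P) \;=\; H(P) \;=\; -\sum_{x \in \mathcal{X}} P(x) \log P(x).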

    Universal Sampling Rate Distortion

    We examine the coordinated and universal rate-efficient sampling of a subset of correlated discrete memoryless sources followed by lossy compression of the sampled sources. The goal is to reconstruct a predesignated subset of sources within a specified level of distortion. The combined sampling mechanism and rate distortion code are universal in that they are devised to perform robustly without exact knowledge of the underlying joint probability distribution of the sources. In Bayesian as well as non-Bayesian settings, single-letter characterizations are provided for the universal sampling rate distortion function for fixed-set sampling, independent random sampling and memoryless random sampling. It is illustrated how these sampling mechanisms are successively better. Our achievability proofs bring forth new schemes for joint learning of the source distribution and lossy compression.
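    As a toy illustration of two of the sampling mechanisms named above (an assumed reading, with all distributions, subset sizes and names chosen here for illustration, and with no rate distortion coding attempted): fixed-set sampling observes one predesignated subset of the m sources at every time instant, while memoryless random sampling redraws the observed subset independently at each instant.

    import random

    def correlated_sources(m, n, flip_prob=0.1, seed=0):
        """Toy correlated sources: each of the m sources is a noisy copy of a hidden bit."""
        rng = random.Random(seed)
        hidden = [rng.randint(0, 1) for _ in range(n)]
        return [[b ^ (rng.random() < flip_prob) for b in hidden] for _ in range(m)]

    def fixed_set_sample(sources, subset):
        """Observe the same subset of source indices at every time instant."""
        n = len(sources[0])
        return [{i: sources[i][t] for i in subset} for t in range(n)]

    def memoryless_random_sample(sources, k, seed=1):
        """Redraw a size-k subset of source indices independently at each time instant."""
        rng = random.Random(seed)
        m, n = len(sources), len(sources[0])
        return [{i: sources[i][t] for i in rng.sample(range(m), k)} for t in range(n)]

    # m = 4 correlated sources, n = 5 time instants, sample k = 2 of them.
    X = correlated_sources(m=4, n=5)
    print(fixed_set_sample(X, subset=[0, 2]))
    print(memoryless_random_sample(X, k=2))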

    When is a Function Securely Computable?

    A subset of a set of terminals that observe correlated signals seek to compute a given function of the signals using public communication. It is required that the value of the function be kept secret from an eavesdropper with access to the communication. We show that the function is securely computable if and only if its entropy is less than the "aided secret key" capacity of an associated secrecy generation model, for which a single-letter characterization is provided.
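    In symbols (the notation here is chosen for illustration; the abstract does not fix it), with the terminals observing correlated signals X_1, ..., X_m, a target function g of the signals, and C_ASK denoting the "aided secret key" capacity of the associated secrecy generation model, the characterization reads:

    g \text{ is securely computable} \;\iff\; H\bigl(g(X_1, \dots, X_m)\bigr) < C_{\mathrm{ASK}},

    where H(\cdot) denotes Shannon entropy.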