
    Characterization of Information Channels for Asymptotic Mean Stationarity and Stochastic Stability of Non-stationary/Unstable Linear Systems

    Stabilization of non-stationary linear systems over noisy communication channels is considered. Stochastically stable sources, and unstable but noise-free or bounded-noise systems, have been extensively studied in the information theory and control theory literature since the 1970s, with renewed interest in the past decade. There have also been studies on non-causal and causal coding of unstable/non-stationary linear Gaussian sources. In this paper, tight necessary and sufficient conditions for stochastic stabilizability of unstable (non-stationary), possibly multi-dimensional, linear systems driven by Gaussian noise over discrete channels (possibly with memory and feedback) are presented. Stochastic stability notions include recurrence, asymptotic mean stationarity and sample path ergodicity, and the existence of finite second moments. Our constructive proof uses random-time state-dependent stochastic drift criteria for stabilization of Markov chains. For asymptotic mean stationarity (and thus sample path ergodicity), it is sufficient that the capacity of the channel is (strictly) greater than the sum of the logarithms of the unstable pole magnitudes, for memoryless channels and a class of channels with memory. This condition is also necessary under a mild technical condition. Sufficient conditions for the existence of finite average second moments for such systems driven by unbounded noise are provided. Comment: To appear in IEEE Transactions on Information Theory
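    For reference, the sufficiency condition quoted above can be written compactly as below; the symbol $\lambda_i$ for the unstable poles and the base-2 logarithm (matching a capacity measured in bits per channel use) are assumptions of this restatement, not notation taken from the paper.

```latex
% Sufficient (and, under a mild technical condition, necessary) condition for
% asymptotic mean stationarity and sample path ergodicity, as stated in the
% abstract.  C is the channel capacity; \lambda_i are the unstable poles of
% the linear system (notation assumed here).
C \;>\; \sum_{i:\,|\lambda_i| \ge 1} \log_2 |\lambda_i|
```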

    Universal Polar Codes for More Capable and Less Noisy Channels and Sources

    We prove two results on the universality of polar codes for source coding and channel communication. First, we show that for any polar code built for a source $P_{X,Z}$ there exists a slightly modified polar code, having the same rate, the same encoding and decoding complexity and the same error rate, that is universal for every source $P_{X,Y}$ when using successive cancellation decoding, at least when the channel $P_{Y|X}$ is more capable than $P_{Z|X}$ and $P_X$ is such that it maximizes $I(X;Y) - I(X;Z)$ for the given channels $P_{Y|X}$ and $P_{Z|X}$. This result extends to channel coding for discrete memoryless channels. Second, we prove that polar codes using successive cancellation decoding are universal for less noisy discrete memoryless channels. Comment: 10 pages, 3 figures
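    A formula-level restatement of the two hypotheses in the first result may be useful; the arg-max phrasing below is an assumed paraphrase of "$P_X$ is such that it maximizes $I(X;Y)-I(X;Z)$", not notation from the paper.

```latex
% P_{Y|X} is more capable than P_{Z|X}, and P_X is a maximizer of the
% mutual-information difference for the given pair of channels.
\begin{aligned}
  &I(X;Y) \;\ge\; I(X;Z) \quad \text{for every input distribution } Q_X
     \text{ (more capable ordering)}, \\
  &P_X \;\in\; \arg\max_{Q_X} \bigl[\, I(X;Y) - I(X;Z) \,\bigr].
\end{aligned}
```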

    An Achievable Rate Region for the Broadcast Channel with Feedback

    A single-letter achievable rate region is proposed for the two-receiver discrete memoryless broadcast channel with generalized feedback. The coding strategy involves block-Markov superposition coding, using Marton's coding scheme for the broadcast channel without feedback as the starting point. If the message rates in the Marton scheme are too high to be decoded at the end of a block, each receiver is left with a list of messages compatible with its output. Resolution information is sent in the following block to enable each receiver to resolve its list. The key observation is that the resolution information of the first receiver is correlated with that of the second. This correlated information is efficiently transmitted via joint source-channel coding, using ideas similar to the Han-Costa coding scheme. Using the result, we obtain an achievable rate region for the stochastically degraded AWGN broadcast channel with noisy feedback from only one receiver. It is shown that this region is strictly larger than the no-feedback capacity region. Comment: To appear in IEEE Transactions on Information Theory. Contains example of AWGN Broadcast Channel with noisy feedback
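    The list-plus-resolution mechanism described above can be illustrated with a toy simulation. This is only a generic sketch of "decode to a list in one block, resolve it with a short hash sent in the next block", not the paper's Marton/Han-Costa construction; the message-set size, list size, and bin count are arbitrary illustrative choices.

```python
import random

# Toy illustration (not the paper's scheme): a receiver that can only narrow
# the block-b message down to a list, and a transmitter that resolves the
# ambiguity in block b+1 by sending a short bin index ("resolution information").

def hash_bin(message: int, num_bins: int) -> int:
    """Deterministic bin index shared by transmitter and receiver."""
    return (message * 2654435761) % num_bins  # simple multiplicative hash

def simulate_one_block(num_messages=1 << 16, list_size=8, num_bins=256,
                       rng=random.Random(0)):
    true_msg = rng.randrange(num_messages)

    # Receiver's list after block b: the true message plus list_size - 1 other
    # messages compatible with its output (modelled as random distractors).
    candidates = {true_msg}
    while len(candidates) < list_size:
        candidates.add(rng.randrange(num_messages))

    # Block b+1: transmitter sends the bin index of the true message.
    resolution = hash_bin(true_msg, num_bins)

    # Receiver keeps only list members whose bin index matches.
    resolved = [m for m in candidates if hash_bin(m, num_bins) == resolution]
    return len(resolved) == 1 and resolved[0] == true_msg

if __name__ == "__main__":
    trials = 10_000
    rng = random.Random(1)
    ok = sum(simulate_one_block(rng=rng) for _ in range(trials))
    print(f"resolved uniquely in {ok}/{trials} trials")
```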

    Secrecy Through Synchronization Errors

    In this paper, we propose a transmission scheme that achieves information-theoretic security without making assumptions on the eavesdropper's channel. This is achieved by a transmitter that deliberately introduces synchronization errors (insertions and/or deletions) based on a shared source of randomness. The intended receiver, having access to the same shared source of randomness as the transmitter, can resynchronize the received sequence. On the other hand, the eavesdropper's channel remains a synchronization error channel. We prove a secrecy capacity theorem, provide a lower bound on the secrecy capacity, and propose numerical methods to evaluate it. Comment: 5 pages, 6 figures, submitted to ISIT 201
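    A minimal sketch of the core idea may help: the transmitter and the legitimate receiver share a PRNG seed, the transmitter inserts dummy symbols at pseudo-random positions, and the receiver replays the same pseudo-random decisions to strip them. This illustrates only the shared-randomness synchronization mechanism (insertions only, over a noiseless main channel); the function names, insertion probability, and seed are assumptions, and the paper's actual construction and secrecy analysis are not reproduced here.

```python
import random

# Toy sketch (not the paper's construction): deliberate insertions driven by a
# PRNG seed shared only by the transmitter and the intended receiver.  The
# eavesdropper, lacking the seed, effectively observes an insertion channel.

def transmit(bits, seed, insert_prob=0.2):
    """Insert a dummy bit before each message bit with probability insert_prob."""
    rng = random.Random(seed)
    out = []
    for b in bits:
        if rng.random() < insert_prob:
            out.append(rng.randrange(2))  # dummy bit at a pseudo-random position
        out.append(b)                     # the real message bit
    return out

def receive(received, seed, insert_prob=0.2):
    """Replay the transmitter's pseudo-random decisions and strip the dummies."""
    rng = random.Random(seed)
    out, i = [], 0
    while i < len(received):
        if rng.random() < insert_prob:
            rng.randrange(2)  # replay the dummy draw to keep the PRNG in sync
            i += 1            # skip the inserted dummy bit
        out.append(received[i])
        i += 1
    return out

if __name__ == "__main__":
    msg_rng = random.Random(7)
    msg = [msg_rng.randrange(2) for _ in range(32)]
    sent = transmit(msg, seed=1234)
    assert receive(sent, seed=1234) == msg
    print(f"{len(msg)} message bits expanded to {len(sent)} transmitted symbols")
```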

    Classical capacity of a qubit depolarizing channel with memory

    The classical product-state capacity of a noisy quantum channel with memory is investigated. A forgetful noise-memory channel is constructed by Markov switching between two depolarizing channels, which introduces non-Markovian noise correlations between successive channel uses. The computation of the capacity is reduced to an entropy computation for a function of a Markov process. A reformulation in terms of algebraic measures then enables its calculation. The effects of the hidden-Markovian memory on the capacity are explored. An increase in noise correlations is found to increase the capacity.
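    As a small illustration of the channel construction itself (not of the capacity calculation, which the abstract says reduces to an entropy computation for a function of a Markov process), the following sketch applies a two-state Markov-switched depolarizing channel to a fixed input qubit; the depolarizing parameters and the switching probability are illustrative assumptions.

```python
import numpy as np

# Toy simulation (not the paper's capacity calculation): a memory channel
# obtained by Markov switching between two qubit depolarizing channels with
# parameters p[0] and p[1].  Because successive uses share the hidden Markov
# state, the effective noise is correlated across channel uses.

I2 = np.eye(2)

def depolarize(rho, p):
    """Qubit depolarizing channel: rho -> (1 - p) * rho + p * I/2."""
    return (1 - p) * rho + p * I2 / 2

def simulate(n_uses, p=(0.1, 0.6), switch=0.05, rng=np.random.default_rng(0)):
    """Apply the Markov-switched depolarizing channel to |0><0| n_uses times.

    'switch' is the probability of flipping the hidden state between uses;
    a small value produces long bursts of the same noise level.
    """
    rho_in = np.array([[1.0, 0.0], [0.0, 0.0]])  # input state |0><0|
    state = int(rng.integers(2))                  # hidden Markov state
    outputs, states = [], []
    for _ in range(n_uses):
        outputs.append(depolarize(rho_in, p[state]))
        states.append(state)
        if rng.random() < switch:
            state ^= 1
    return outputs, states

if __name__ == "__main__":
    outs, states = simulate(10_000)
    frac_noisy = np.mean(states)                                  # time in the noisier state
    avg_purity = np.mean([np.trace(r @ r).real for r in outs])    # average output purity
    print(f"fraction of uses in noisier state: {frac_noisy:.3f}")
    print(f"average output purity: {avg_purity:.3f}")
```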