
    Correlation between Channel State and Information Source with Empirical Coordination Constraint

    We investigate the correlation between the channel state and the source symbol in a joint source-channel coding problem. We study simultaneously the lossless transmission of information and the empirical coordination of the channel inputs with the source symbols and channel states. Empirical coordination is achievable if the sequences of source symbols, channel states, channel inputs, and channel outputs are jointly typical for a target joint probability distribution. We characterize the joint distributions that are achievable under a lossless decoding constraint. The performance of the coordination is evaluated by an objective function; for example, we determine the minimal distortion between source symbols and channel inputs under lossless decoding. We show that correlation between the source and the channel state improves the feasibility of the transmission. Comment: Conference IEEE ITW 201
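
    The coordination criterion above can be made concrete with a small numerical check. The following is a minimal sketch (not code from the paper): it estimates the empirical joint distribution of the source, state, input, and output sequences and declares them coordinated when that distribution is within a tolerance of a target distribution in total variation; the function names and toy values are illustrative.

```python
# Sketch (not from the paper): a numerical check of empirical coordination.
# Sequences are "coordinated" if their empirical joint type is close, in
# total variation, to a target joint distribution.
from collections import Counter
from itertools import product

def empirical_joint(*sequences):
    """Empirical joint distribution of the tuples (u_t, s_t, x_t, y_t)."""
    n = len(sequences[0])
    return {tup: c / n for tup, c in Counter(zip(*sequences)).items()}

def total_variation(p, q):
    support = set(p) | set(q)
    return 0.5 * sum(abs(p.get(z, 0.0) - q.get(z, 0.0)) for z in support)

def is_coordinated(target, eps, *sequences):
    """True if the empirical type is within eps of the target distribution."""
    return total_variation(empirical_joint(*sequences), target) <= eps

# Toy usage with binary alphabets (values are illustrative only).
u = [0, 1, 1, 0]; s = [0, 0, 1, 1]; x = [0, 1, 0, 1]; y = [0, 1, 0, 1]
uniform_target = {t: 1 / 16 for t in product([0, 1], repeat=4)}
print(total_variation(empirical_joint(u, s, x, y), uniform_target))
print(is_coordinated(uniform_target, 0.8, u, s, x, y))
```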

    Randomized Quantization and Source Coding with Constrained Output Distribution

    This paper studies fixed-rate randomized vector quantization under the constraint that the quantizer's output has a given fixed probability distribution. A general representation of randomized quantizers that includes the common models in the literature is introduced via appropriate mixtures of joint probability measures on the product of the source and reproduction alphabets. Using this representation and results from optimal transport theory, the existence of an optimal (minimum-distortion) randomized quantizer having a given output distribution is shown under various conditions. For sources with densities and the mean-square distortion measure, it is shown that this optimum can be attained by randomizing quantizers having convex codecells. For stationary and memoryless source and output distributions, a rate-distortion theorem is proved, providing a single-letter expression for the optimum distortion in the limit of large block lengths. Comment: To appear in the IEEE Transactions on Information Theory
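
    In a finite-alphabet setting, the output-distribution constraint can be illustrated with a small optimal-transport computation. The sketch below is a toy under stated assumptions, not the paper's construction: it finds a minimum mean-square-distortion coupling between a discrete source distribution p and a prescribed output distribution q by linear programming, and reads off the conditional pi(y | x) as a randomized quantizer; the function name and toy alphabets are hypothetical.

```python
# Sketch (toy, finite alphabets): a minimum-distortion randomized quantizer
# whose output is forced to follow a prescribed distribution q, obtained as
# an optimal-transport plan between the source law p and q.
import numpy as np
from scipy.optimize import linprog

def constrained_quantizer(x_pts, y_pts, p, q):
    """Minimize E[(X - Y)^2] over couplings with marginals p (source) and q (output)."""
    m, k = len(x_pts), len(y_pts)
    cost = (x_pts[:, None] - y_pts[None, :]) ** 2           # squared distortion
    A_eq = np.zeros((m + k, m * k))
    for i in range(m):
        A_eq[i, i * k:(i + 1) * k] = 1.0                     # row sums = p
    for j in range(k):
        A_eq[m + j, j::k] = 1.0                              # column sums = q
    res = linprog(cost.ravel(), A_eq=A_eq, b_eq=np.concatenate([p, q]),
                  bounds=(0, None), method="highs")
    plan = res.x.reshape(m, k)
    return plan / plan.sum(axis=1, keepdims=True)            # pi(y | x)

# Toy example: 3-point source, 2-point reproduction alphabet.
x_pts, y_pts = np.array([0.0, 1.0, 2.0]), np.array([0.5, 1.5])
p, q = np.array([0.5, 0.3, 0.2]), np.array([0.6, 0.4])
print(constrained_quantizer(x_pts, y_pts, p, q))
```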

    Statistical Atmospheric Parameter Retrieval Largely Benefits from Spatial-Spectral Image Compression

    The Infrared Atmospheric Sounding Interferometer (IASI) flies on board the Metop satellite series, which is part of the EUMETSAT Polar System (EPS). Products obtained from IASI data represent a significant improvement in the accuracy and quality of the measurements used for meteorological models. Notably, IASI collects rich spectral information to derive temperature and moisture profiles, among other relevant trace gases, essential for atmospheric forecasts and for the understanding of weather. Here, we investigate the impact of near-lossless and lossy compression on IASI L1C data when statistical retrieval algorithms are later applied. We search for those compression ratios that yield a positive impact on the accuracy of the statistical retrievals. The compression techniques help reduce a certain amount of noise in the original data and, at the same time, incorporate spatial-spectral feature relations in an indirect way without increasing the computational complexity. We observed that compressing images at relatively low bit rates improves results in predicting temperature and dew point temperature, and we advocate that some amount of compression prior to model inversion is beneficial. This research can benefit the development of current and upcoming retrieval chains in infrared sounding and hyperspectral sensors.
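
    The compress-then-retrieve idea can be sketched with synthetic data standing in for IASI spectra. The code below is illustrative only: truncated-PCA reconstruction plays the role of the lossy coder, a least-squares fit plays the role of the statistical retrieval, and all dimensions, noise levels, and variable names are assumptions.

```python
# Sketch with synthetic data (not IASI): lossy spectral compression via a
# truncated PCA acts as a denoiser before a linear statistical retrieval.
import numpy as np

rng = np.random.default_rng(0)
n, channels, k = 500, 100, 5
truth = rng.normal(size=(n, 1))                       # latent state, e.g. temperature
mixing = rng.normal(size=(1, channels))
spectra = truth @ mixing + 0.5 * rng.normal(size=(n, channels))   # noisy synthetic radiances

# "Compression": keep only the top-k principal components and reconstruct.
mean = spectra.mean(axis=0)
_, _, vt = np.linalg.svd(spectra - mean, full_matrices=False)
compressed = (spectra - mean) @ vt[:k].T @ vt[:k] + mean

train, test = slice(0, 250), slice(250, 500)
design = lambda z: np.c_[np.ones(len(z)), z]

def retrieval_rmse(x, y):
    """Fit a linear retrieval on the first half, report RMSE on the second half."""
    coef, *_ = np.linalg.lstsq(design(x[train]), y[train], rcond=None)
    return float(np.sqrt(np.mean((design(x[test]) @ coef - y[test]) ** 2)))

print("retrieval RMSE, original spectra  :", retrieval_rmse(spectra, truth))
print("retrieval RMSE, compressed spectra:", retrieval_rmse(compressed, truth))
```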

    Joint Empirical Coordination of Source and Channel

    In a decentralized and self-configuring network, the communication devices are considered as autonomous decision-makers that sense their environment and implement optimal transmission schemes. It is essential that these autonomous devices cooperate and coordinate their actions to ensure the reliability of the transmissions and the stability of the network. We study a point-to-point scenario in which the encoder and the decoder implement decentralized policies that are coordinated. The coordination is measured in terms of the empirical frequency of the source and channel symbols. The encoder and the decoder perform a coding scheme such that the empirical distribution of the symbols is close to a target joint probability distribution. We characterize the set of achievable target probability distributions for a point-to-point source-channel model in which the encoder is non-causal and the decoder is strictly causal, i.e., it returns an action based on the observation of the past channel outputs. The objectives of the encoder and of the decoder are captured by a utility function, evaluated with respect to the set of achievable target probability distributions. In this article, we investigate the maximization of a utility function that is common to both the encoder and the decoder. We show that the compression and the transmission of information are particular cases of empirical coordination. Comment: accepted to IEEE Trans. on Information Theory
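
    A minimal sketch of the utility-driven selection described above (illustrative, not the paper's characterization of the achievable set): candidate target joint distributions are scored by the expected value of a common utility function and the best one is kept; the toy binary utility and distributions are assumptions.

```python
# Sketch (toy): score candidate target joint distributions Q(u, x, y) by a
# common utility Phi and keep the maximizer, as in the encoder/decoder
# objective described in the abstract above.
from itertools import product

def expected_utility(q, phi):
    """E_Q[Phi(u, x, y)] for a joint distribution given as {(u, x, y): prob}."""
    return sum(prob * phi(*symbols) for symbols, prob in q.items())

def best_target(candidates, phi):
    return max(candidates, key=lambda q: expected_utility(q, phi))

# Toy binary example: the utility rewards coordination of source symbol u
# with channel input x.
phi = lambda u, x, y: 1.0 if u == x else 0.0
q_indep = {(u, x, y): 1 / 8 for u, x, y in product([0, 1], repeat=3)}
q_coord = {(u, x, y): (1 / 4 if u == x else 0.0) for u, x, y in product([0, 1], repeat=3)}
print(expected_utility(q_indep, phi), expected_utility(q_coord, phi))
print(best_target([q_indep, q_coord], phi) is q_coord)
```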

    Joint universal lossy coding and identification of stationary mixing sources with general alphabets

    We consider the problem of joint universal variable-rate lossy coding and identification for parametric classes of stationary $\beta$-mixing sources with general (Polish) alphabets. Compression performance is measured in terms of Lagrangians, while identification performance is measured by the variational distance between the true source and the estimated source. Provided that the sources are mixing at a sufficiently fast rate and satisfy certain smoothness and Vapnik-Chervonenkis learnability conditions, it is shown that, for bounded metric distortions, there exist universal schemes for joint lossy compression and identification whose Lagrangian redundancies converge to zero as $\sqrt{V_n \log n / n}$ as the block length $n$ tends to infinity, where $V_n$ is the Vapnik-Chervonenkis dimension of a certain class of decision regions defined by the $n$-dimensional marginal distributions of the sources; furthermore, for each $n$, the decoder can identify the $n$-dimensional marginal of the active source up to a ball of radius $O(\sqrt{V_n \log n / n})$ in variational distance, eventually with probability one. The results are supplemented by several examples of parametric sources satisfying the regularity conditions. Comment: 16 pages, 1 figure; accepted to IEEE Transactions on Information Theory
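
    A small numerical sketch of the two quantities mentioned above, under assumed toy numbers: the per-symbol Lagrangian trades distortion against rate, and the $\sqrt{V_n \log n / n}$ term is the rate at which the redundancy bound vanishes with the block length.

```python
# Sketch (toy numbers only): the Lagrangian performance measure and the
# redundancy rate from the abstract above.
import math

def lagrangian(distortion, codelength_bits, n, lam):
    """Per-symbol Lagrangian: distortion + lambda * (codelength / n)."""
    return distortion + lam * codelength_bits / n

def redundancy_rate(vc_dim, n):
    """The O(sqrt(V_n log n / n)) rate at which the redundancy vanishes."""
    return math.sqrt(vc_dim * math.log(n) / n)

# The redundancy bound shrinks as the block length grows (vc_dim is made up).
for n in (10**3, 10**5, 10**7):
    print(n, redundancy_rate(vc_dim=20, n=n))

print(lagrangian(0.12, 2300, 1000, lam=0.5))   # 0.12 + 0.5 * 2.3 = 1.27
```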

    Rate-Distortion via Markov Chain Monte Carlo

    We propose an approach to lossy source coding, utilizing ideas from Gibbs sampling, simulated annealing, and Markov Chain Monte Carlo (MCMC). The idea is to sample a reconstruction sequence from a Boltzmann distribution associated with an energy function that incorporates the distortion between the source and the reconstruction, the compressibility of the reconstruction, and the point sought on the rate-distortion curve. To sample from this distribution, we use a "heat bath algorithm": starting from an initial candidate reconstruction (say, the original source sequence), at every iteration an index i is chosen and the i-th sequence component is replaced by drawing from the conditional probability distribution for that component given all the rest. At the end of this process, the encoder conveys the reconstruction to the decoder using universal lossless compression. The complexity of each iteration is independent of the sequence length and only linearly dependent on a certain context parameter (which grows sub-logarithmically with the sequence length). We show that the proposed algorithms achieve optimum rate-distortion performance in the limit of a large number of iterations and a large sequence length, when employed on any stationary ergodic source. Experimentation shows promising initial results. Employing our lossy compressors on noisy data, with an appropriately chosen distortion measure and level, followed by a simple de-randomization operation, results in a family of denoisers that compares favorably (both theoretically and in practice) with other MCMC-based schemes and with the Discrete Universal Denoiser (DUDE). Comment: 35 pages, 16 figures, submitted to IEEE Transactions on Information Theory
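
    The heat-bath step described above can be sketched directly. In the toy code below, a zeroth-order empirical entropy stands in for the paper's context-based compressibility term, the temperature is held fixed rather than annealed, and all parameter values are illustrative.

```python
# Simplified sketch of the heat-bath sampler described above. The energy is
# alpha * n * H0(x_hat) + beta * Hamming distortion(x, x_hat), a stand-in for
# the paper's context-based compressibility plus distortion.
import math, random
from collections import Counter

def energy(x_hat, x, alpha, beta):
    n = len(x_hat)
    counts = Counter(x_hat)
    h0 = -sum(c / n * math.log2(c / n) for c in counts.values())
    dist = sum(a != b for a, b in zip(x, x_hat))
    return alpha * n * h0 + beta * dist

def heat_bath(x, alphabet, alpha=1.0, beta=2.0, temperature=0.5, sweeps=50, seed=0):
    rng = random.Random(seed)
    x_hat = list(x)                                  # start from the source sequence
    for _ in range(sweeps):
        for i in range(len(x_hat)):
            # Boltzmann conditional of component i given all the others.
            energies = []
            for a in alphabet:
                x_hat[i] = a
                energies.append(energy(x_hat, x, alpha, beta))
            weights = [math.exp(-(e - min(energies)) / temperature) for e in energies]
            x_hat[i] = rng.choices(alphabet, weights=weights)[0]
    return x_hat

# Toy usage on a short binary sequence.
source = [0, 1, 1, 0, 1, 0, 0, 1, 1, 1, 0, 1]
print(heat_bath(source, alphabet=[0, 1]))
```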

    Joint Universal Lossy Coding and Identification of Stationary Mixing Sources With General Alphabets

    In this paper, we consider the problem of joint universal variable-rate lossy coding and identification for parametric classes of stationary $\beta$-mixing sources with general (Polish) alphabets. Compression performance is measured in terms of Lagrangians, while identification performance is measured by the variational distance between the true source and the estimated source. Provided that the sources are mixing at a sufficiently fast rate and satisfy certain smoothness and Vapnik-Chervonenkis (VC) learnability conditions, it is shown that, for bounded metric distortions, there exist universal schemes for joint lossy compression and identification whose Lagrangian redundancies converge to zero as $\sqrt{V_n \log n / n}$ as the block length $n$ tends to infinity, where $V_n$ is the VC dimension of a certain class of decision regions defined by the $n$-dimensional marginal distributions of the sources; furthermore, for each $n$, the decoder can identify the $n$-dimensional marginal of the active source up to a ball of radius $O(\sqrt{V_n \log n / n})$ in variational distance, eventually with probability one. The results are supplemented by several examples of parametric sources satisfying the regularity conditions.

    Statistical atmospheric parameter retrieval largely benefits from spatial-spectral image compression

    The Infrared Atmospheric Sounding Interferometer (IASI) flies on board the Metop satellite series, which is part of the EUMETSAT Polar System. Products obtained from IASI data represent a significant improvement in the accuracy and quality of the measurements used for meteorological models. Notably, IASI collects rich spectral information to derive temperature and moisture profiles, among other relevant trace gases, essential for atmospheric forecasts and for the understanding of weather. Here, we investigate the impact of near-lossless and lossy compression on IASI L1C data when statistical retrieval algorithms are later applied. We search for those compression ratios that yield a positive impact on the accuracy of the statistical retrievals. The compression techniques help reduce a certain amount of noise in the original data and, at the same time, incorporate spatial-spectral feature relations in an indirect way without increasing the computational complexity. We observed that compressing images at relatively low bit rates improves results in predicting temperature and dew point temperature, and we advocate that some amount of compression prior to model inversion is beneficial. This research can benefit the development of current and upcoming retrieval chains in infrared sounding and hyperspectral sensors.