10 research outputs found

    Nonasymptotic noisy lossy source coding

    This paper presents new general nonasymptotic achievability and converse bounds, and performs their dispersion analysis, for the lossy compression problem in which the compressor observes the source through a noisy channel. While this problem is asymptotically equivalent to a noiseless lossy source coding problem with a modified distortion function, nonasymptotically there is a noticeable gap in how fast their minimum achievable coding rates approach the common rate-distortion function, as evidenced both by the refined asymptotic (dispersion) analysis and by numerical results. The size of the gap between the dispersions of the noisy problem and the asymptotically equivalent noiseless problem depends on the stochastic variability of the channel through which the compressor observes the source.
    Comment: IEEE Transactions on Information Theory, 201
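
    As a rough reference for the dispersion terminology above, the Gaussian-approximation form standard in this literature (generic notation, not a formula quoted from the paper) is

        R(n, d, \epsilon) \approx R(d) + \sqrt{V(d)/n}\, Q^{-1}(\epsilon),

    where R(d) is the rate-distortion function, V(d) the dispersion, \epsilon the allowed excess-distortion probability, and Q^{-1} the inverse of the Gaussian complementary CDF. The abstract's point is that the noisy and noiseless problems share R(d) but differ in V(d), with the noisy problem carrying the larger dispersion.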

    Infinite Divisibility of Information

    We study an information analogue of infinitely divisible probability distributions, where the i.i.d. sum is replaced by the joint distribution of an i.i.d. sequence. A random variable $X$ is called informationally infinitely divisible if, for any $n \ge 1$, there exists an i.i.d. sequence of random variables $Z_1, \ldots, Z_n$ that contains the same information as $X$, i.e., there exists an injective function $f$ such that $X = f(Z_1, \ldots, Z_n)$. While no discrete random variable is informationally infinitely divisible, we show that any discrete random variable $X$ has a bounded multiplicative gap to infinite divisibility: if we remove the injectivity requirement on $f$, then there exist i.i.d. $Z_1, \ldots, Z_n$ and an $f$ satisfying $X = f(Z_1, \ldots, Z_n)$, and the entropy satisfies $H(X)/n \le H(Z_1) \le 1.59\,H(X)/n + 2.43$. We also study a new class of discrete probability distributions, called spectral infinitely divisible distributions, for which the multiplicative gap $1.59$ can be removed. Furthermore, we study the case where $X = (Y_1, \ldots, Y_m)$ is itself an i.i.d. sequence, $m \ge 2$, for which the multiplicative gap $1.59$ can be replaced by $1 + 5\sqrt{(\log m)/m}$. This means that as $m$ increases, $(Y_1, \ldots, Y_m)$ becomes closer to being spectral infinitely divisible in a uniform manner, which can be regarded as an information analogue of Kolmogorov's uniform theorem. Applications of our results include independent component analysis, distributed storage with a secrecy constraint, and distributed random number generation.
    Comment: 22 pages
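
    Written out as a display (the same bound stated in the abstract), the no-injectivity guarantee is

        \frac{H(X)}{n} \;\le\; H(Z_1) \;\le\; 1.59\,\frac{H(X)}{n} + 2.43.

    As an extreme illustration (not taken from the paper): if $X$ is uniform over $\{0,1\}^{kn}$ and $Z_i$ is the $i$-th block of $k$ bits, then $f$ (concatenation) is injective and $H(Z_1) = H(X)/n$ exactly, so the lower bound is attained.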

    Hardware-Limited Task-Based Quantization

    Quantization plays a critical role in digital signal processing systems. Quantizers are typically designed to obtain an accurate digital representation of the input signal, operating independently of the system task, and are commonly implemented using serial scalar analog-to-digital converters (ADCs). In this work, we study hardware-limited task-based quantization, where a system utilizing a serial scalar ADC is designed to provide a suitable representation that allows the recovery of a parameter vector underlying the input signal. We propose hardware-limited task-based quantization systems for a fixed and finite quantization resolution, and characterize their achievable distortion. We then apply the analysis to the practical setups of channel estimation and eigen-spectrum recovery from quantized measurements. Our results illustrate that properly designed hardware-limited systems can approach the optimal performance achievable with vector quantizers, and that by taking the underlying task into account, the quantization error can be made negligible with a relatively small number of bits.
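
    A minimal numerical sketch of the task-based idea (an illustrative toy in Python; the linear measurement model, the pseudo-inverse analog combiner, and the mid-rise scalar quantizer below are assumptions, not the system designed in the work above): rather than spending the bit budget quantizing all n measurements and estimating afterwards, combine the measurements down to the k task-relevant scalars in the analog domain and quantize only those.

        import numpy as np

        rng = np.random.default_rng(0)
        n, k, budget = 64, 4, 64                 # measurements, parameters, total bits per snapshot
        b_direct = budget // n                   # bits per measurement when quantizing y directly
        b_task = budget // k                     # bits per scalar after combining down to k scalars

        def uniform_quantize(x, bits, dyn_range=3.0):
            # Mid-rise uniform scalar quantizer clipped to [-dyn_range, dyn_range].
            step = 2 * dyn_range / (2 ** bits)
            xc = np.clip(x, -dyn_range, dyn_range - 1e-9)
            return (np.floor(xc / step) + 0.5) * step

        H = rng.standard_normal((n, k)) / np.sqrt(k)   # measurement model: y = H s + w
        mse_direct, mse_task = [], []
        for _ in range(2000):
            s = rng.standard_normal(k)                 # parameter vector to recover (the "task")
            y = H @ s + 0.1 * rng.standard_normal(n)

            # Task-ignorant: quantize every measurement, then least-squares estimate.
            s_direct = np.linalg.lstsq(H, uniform_quantize(y, b_direct), rcond=None)[0]

            # Task-based: analog combining down to k scalars (a pseudo-inverse combiner here,
            # chosen for simplicity; the work above designs this pre-quantization mapping),
            # then quantize only those k scalars.
            s_task = uniform_quantize(np.linalg.pinv(H) @ y, b_task)

            mse_direct.append(np.mean((s - s_direct) ** 2))
            mse_task.append(np.mean((s - s_task) ** 2))

        print(f"task-ignorant MSE ({b_direct} bits/scalar, {n} scalars): {np.mean(mse_direct):.4f}")
        print(f"task-based    MSE ({b_task} bits/scalar, {k} scalars): {np.mean(mse_task):.4f}")

    Both branches use the same total bit budget per snapshot; the task-based branch concentrates it on the few scalars the task actually depends on, which is the intuition behind making the quantization error small with a small number of bits.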

    Sampling of the Wiener Process for Remote Estimation over a Channel with Random Delay

    In this paper, we consider the problem of sampling a Wiener process, with samples forwarded to a remote estimator over a channel that is modeled as a queue. The estimator reconstructs an estimate of the real-time signal value from causally received samples. We study the optimal online sampling strategy that minimizes the mean square estimation error subject to a sampling rate constraint. We prove that the optimal sampling strategy is a threshold policy, and find the optimal threshold. This threshold is determined by how much the Wiener process varies during the random service time and by the maximum allowed sampling rate. Further, if the sampling times are independent of the observed Wiener process, the above sampling problem for minimizing the estimation error is equivalent to a sampling problem for minimizing the age of information, revealing an interesting connection between the age of information and remote estimation error. Our comparisons show that the estimation error achieved by the optimal sampling policy can be much smaller than those of age-optimal sampling, zero-wait sampling, and periodic sampling.
    Comment: Accepted by IEEE Transactions on Information Theory
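
    A small simulation sketch of the threshold idea (illustrative only: the exponential service time, the threshold and period values, and the hold-the-last-delivered-sample estimator are modeling assumptions, and the sampling rate constraint is not enforced): take a new sample only when the process has drifted far enough from the last sample taken, and compare against clock-driven periodic sampling over the same random-delay channel.

        import numpy as np

        rng = np.random.default_rng(1)
        dt, horizon = 0.01, 5000.0          # simulation step and horizon
        mean_delay = 1.0                    # mean of the exponential service time (assumed)
        threshold = 1.2                     # deviation threshold (arbitrary, not the optimal one)
        period = 2.0                        # sampling period for the baseline policy

        def simulate(policy):
            w = 0.0                         # current Wiener process value
            last_sample = 0.0               # value of the most recently taken sample
            estimate = 0.0                  # value of the most recently delivered sample
            busy_until = 0.0                # the channel serves the sample in flight until this time
            next_periodic = 0.0
            total_se = 0.0
            for i in range(int(horizon / dt)):
                t = i * dt
                w += np.sqrt(dt) * rng.standard_normal()
                if t >= busy_until:
                    estimate = last_sample  # the sample in flight has been delivered
                    take = (abs(w - last_sample) >= threshold) if policy == "threshold" \
                           else (t >= next_periodic)
                    if take:                # take a new sample and start its random service time
                        last_sample = w
                        busy_until = t + rng.exponential(mean_delay)
                        next_periodic = t + period
                total_se += (w - estimate) ** 2 * dt
            return total_se / horizon

        print("threshold sampling MSE:", simulate("threshold"))
        print("periodic  sampling MSE:", simulate("periodic"))

    The signal-aware policy spends its transmissions only when the deviation, and hence the instantaneous estimation error, is already large, which is the mechanism behind the gains over signal-agnostic policies reported in the abstract.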