62 research outputs found

    Information-Distilling Quantizers

    Let $X$ and $Y$ be dependent random variables. This paper considers the problem of designing a scalar quantizer for $Y$ to maximize the mutual information between the quantizer's output and $X$, and develops fundamental properties and bounds for this form of quantization, which is connected to the log-loss distortion criterion. The main focus is the regime of low $I(X;Y)$, where it is shown that, if $X$ is binary, a constant fraction of the mutual information can always be preserved using $\mathcal{O}(\log(1/I(X;Y)))$ quantization levels, and there exist distributions for which this many quantization levels are necessary. Furthermore, for larger finite alphabets $2 < |\mathcal{X}| < \infty$, it is established that an $\eta$-fraction of the mutual information can be preserved using roughly $(\log(|\mathcal{X}|/I(X;Y)))^{\eta\cdot(|\mathcal{X}|-1)}$ quantization levels.
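    The objective being maximized can be made concrete: for a finite joint pmf, the mutual information preserved by a deterministic quantizer $q$ of $Y$ is simply the mutual information of the induced pair $(X, q(Y))$. A minimal Python sketch (the toy distribution and quantizer are illustrative assumptions, not from the paper):

    ```python
    from math import log2
    from collections import defaultdict

    def mutual_information(joint):
        """I(X;Y) in bits for a joint pmf given as {(x, y): p}."""
        px, py = defaultdict(float), defaultdict(float)
        for (x, y), p in joint.items():
            px[x] += p
            py[y] += p
        return sum(p * log2(p / (px[x] * py[y]))
                   for (x, y), p in joint.items() if p > 0)

    def quantized_joint(joint, q):
        """Joint pmf of (X, q(Y)) induced by a deterministic quantizer q."""
        out = defaultdict(float)
        for (x, y), p in joint.items():
            out[(x, q(y))] += p
        return dict(out)

    # Hypothetical example: binary X, four-valued Y. Merging Y-symbols
    # with similar posteriors P(X=1 | Y=y) loses little mutual information.
    joint = {(0, 0): 0.20, (0, 1): 0.18, (0, 2): 0.07, (0, 3): 0.05,
             (1, 0): 0.05, (1, 1): 0.07, (1, 2): 0.18, (1, 3): 0.20}
    q = lambda y: 0 if y < 2 else 1          # two quantization levels
    full = mutual_information(joint)
    reduced = mutual_information(quantized_joint(joint, q))
    print(round(full, 4), round(reduced, 4))
    ```

    By the data-processing inequality the quantized value can never exceed $I(X;Y)$; here two levels already retain most of it.
    
    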

    A Recursive Quantizer Design Algorithm for Binary-Input Discrete Memoryless Channels

    The optimal quantization of the outputs of binary-input discrete memoryless channels is considered, whereby the optimal quantizer preserves at least a constant α-fraction of the original mutual information with the smallest output cardinality. Two recursive methods with top-down and bottom-up approaches are developed; these methods lead to a new necessary condition for the recursive quantizer design. An efficient algorithm with linear complexity, based on dynamic programming and the new necessary optimality condition, is proposed. This work has been funded in part by the European Research Council under grant 725411, and by the Spanish Ministry of Economy and Competitiveness under grant TEC2016-78434-C3-1-R.
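    The dynamic-programming idea behind such quantizer designs can be sketched as follows. For binary-input channels an optimal quantizer merges contiguous runs of output symbols once they are sorted by likelihood ratio, so a table over (cells used, symbols consumed) finds the best K-level quantizer. This is a plain O(M²K) dynamic program, not the linear-complexity algorithm of the paper, and the channel below is an illustrative assumption:

    ```python
    from math import log2

    def partial_mi(px, channel, lo, hi):
        """Contribution to I(X; Q) of one quantizer cell that merges
        output symbols lo..hi-1 (channel[x][y] = P(Y=y | X=x))."""
        pxq = [px[x] * sum(channel[x][lo:hi]) for x in range(len(px))]
        pq = sum(pxq)
        return sum(p * log2(p / (px[x] * pq))
                   for x, p in enumerate(pxq) if p > 0)

    def best_quantizer(px, channel, K):
        """Max mutual information preserved by a K-level quantizer,
        assuming the M output symbols are already sorted by likelihood
        ratio P(y|X=1)/P(y|X=0), so optimal cells are contiguous."""
        M = len(channel[0])
        NEG = float("-inf")
        # dp[k][m]: best preserved MI quantizing the first m symbols into k cells
        dp = [[NEG] * (M + 1) for _ in range(K + 1)]
        dp[0][0] = 0.0
        for k in range(1, K + 1):
            for m in range(k, M + 1):
                dp[k][m] = max(dp[k - 1][j] + partial_mi(px, channel, j, m)
                               for j in range(k - 1, m))
        return dp[K][M]

    # Hypothetical binary-input channel with 4 outputs, sorted by LLR.
    px = [0.5, 0.5]
    channel = [[0.4, 0.3, 0.2, 0.1],   # P(y | X=0)
               [0.1, 0.2, 0.3, 0.4]]   # P(y | X=1)
    print(best_quantizer(px, channel, 2))
    ```

    With K equal to the output alphabet size the quantizer is the identity and the full mutual information is recovered; shrinking K trades output cardinality against preserved information, which is exactly the α-fraction trade-off the abstract describes.
    
    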