15 research outputs found

    Novel Lower Bounds on the Entropy Rate of Binary Hidden Markov Processes

    Recently, Samorodnitsky proved a strengthened version of Mrs. Gerber's Lemma, in which the output entropy of a binary symmetric channel is bounded in terms of the average entropy of the input projected onto a random subset of coordinates. Here, this result is applied to derive novel lower bounds on the entropy rate of binary hidden Markov processes. For symmetric underlying Markov processes, our bound improves upon the best known bound in the very noisy regime. The nonsymmetric case is also considered, and explicit bounds are derived for Markov processes that satisfy the $(1,\infty)$-RLL constraint.
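
    The paper's lower bounds are analytic; as a rough numerical companion, the sketch below (Python with NumPy; the function names and parameters are illustrative, not from the paper) estimates the entropy rate of such a process by simulation, assuming a symmetric hidden chain with flip probability q observed through a BSC with crossover probability alpha. It averages the one-step predictive entropies produced by the standard belief-state forward recursion.

```python
import numpy as np

def h2(p):
    """Binary entropy in bits, safe at the endpoints."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def entropy_rate_mc(q, alpha, n=200_000, seed=0):
    """Monte Carlo estimate of the entropy rate of Y, where X is a
    symmetric binary Markov chain (flip probability q) and
    Y_t = X_t XOR Z_t with Z_t ~ Bernoulli(alpha), i.e. a BSC output.

    Averages the one-step predictive entropies h(P(Y_t=1 | Y^{t-1}))
    from the belief-state forward recursion; for this ergodic process
    the average converges to the entropy rate H(Y).
    """
    rng = np.random.default_rng(seed)
    x = int(rng.random() < 0.5)      # stationary initial hidden state
    pi = 0.5                         # belief P(X_t = 1 | Y^{t-1})
    total = 0.0
    for _ in range(n):
        y = x ^ int(rng.random() < alpha)        # BSC observation
        p_y1 = pi * (1 - alpha) + (1 - pi) * alpha
        total += h2(p_y1)
        # Bayes update given the observed y, then the Markov transition
        post = pi * (1 - alpha) / p_y1 if y else pi * alpha / (1 - p_y1)
        pi = post * (1 - q) + (1 - post) * q
        x ^= int(rng.random() < q)               # advance hidden chain
    return total / n

# e.g. entropy_rate_mc(0.1, 0.4) probes the very noisy regime alpha -> 1/2
```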

    How to Quantize $n$ Outputs of a Binary Symmetric Channel to $n-1$ Bits?

    Suppose that $Y^n$ is obtained by observing a uniform Bernoulli random vector $X^n$ through a binary symmetric channel with crossover probability $\alpha$. The "most informative Boolean function" conjecture postulates that the maximal mutual information between $Y^n$ and any Boolean function $\mathrm{b}(X^n)$ is attained by a dictator function. In this paper, we consider the "complementary" case in which the Boolean function is replaced by $f:\{0,1\}^n\to\{0,1\}^{n-1}$, namely, an $(n-1)$-bit quantizer, and show that $I(f(X^n);Y^n)\leq(n-1)\cdot(1-h(\alpha))$ for any such $f$. Thus, in this case, the optimal function is of the form $f(x^n)=(x_1,\ldots,x_{n-1})$. Comment: 5 pages, accepted ISIT 201
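
    The claimed optimum is easy to check numerically for small $n$: the sketch below (Python with NumPy; a minimal illustration, not the paper's proof technique, and all names are mine) computes $I(f(X^n);Y^n)$ by brute force for the coordinate-projection quantizer and compares it against $(n-1)\cdot(1-h(\alpha))$, which it should match with equality.

```python
import numpy as np
from itertools import product

def h2(p):
    """Binary entropy in bits."""
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def mi_projection(n, alpha):
    """Brute-force I(f(X^n); Y^n) for the (n-1)-bit quantizer
    f(x^n) = (x_1, ..., x_{n-1}), with X^n uniform on {0,1}^n and
    Y^n the BSC(alpha) output; should equal (n-1) * (1 - h2(alpha))."""
    cube = list(product((0, 1), repeat=n))
    joint = {}                       # joint law of (u, y), u = f(x)
    for x in cube:
        u = x[:-1]
        for y in cube:
            d = sum(a != b for a, b in zip(x, y))    # Hamming distance
            p = alpha**d * (1 - alpha)**(n - d) / 2**n
            joint[u, y] = joint.get((u, y), 0.0) + p
    p_u = 2.0 ** -(n - 1)            # f(X^n) is uniform on {0,1}^(n-1)
    p_y = 2.0 ** -n                  # Y^n is uniform (BSC preserves it)
    return sum(p * np.log2(p / (p_u * p_y)) for p in joint.values() if p > 0)

alpha = 0.1
print(mi_projection(3, alpha), 2 * (1 - h2(alpha)))  # both ~= 1.062
```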

    Boolean functions: noise stability, non-interactive correlation distillation, and mutual information

    Let $T_\epsilon$ be the noise operator acting on Boolean functions $f:\{0,1\}^n\to\{0,1\}$, where $\epsilon\in[0,1/2]$ is the noise parameter. Given $\alpha>1$ and fixed mean $\mathbb{E}f$, which Boolean function $f$ has the largest $\alpha$-th moment $\mathbb{E}(T_\epsilon f)^\alpha$? This question has close connections with the noise stability of Boolean functions, the problem of non-interactive correlation distillation, and the Courtade-Kumar conjecture on the most informative Boolean function. In this paper, we characterize the maximizers in some extremal settings, such as low noise ($\epsilon=\epsilon(n)$ close to 0), high noise ($\epsilon=\epsilon(n)$ close to 1/2), and large $\alpha=\alpha(n)$. Analogous results are also established in more general contexts, such as Boolean functions defined on the discrete torus $(\mathbb{Z}/p\mathbb{Z})^n$ and the problem of noise stability in a tree model. Comment: Corrections of some inaccuracies
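
    For intuition about the quantity being maximized, the sketch below (Python with NumPy; an illustration under my own naming, not code from the paper) evaluates $\mathbb{E}(T_\epsilon f)^\alpha$ by brute force on $\{0,1\}^n$ and compares two mean-$1/2$ candidates, a dictator and 3-bit majority, in a low-noise and a high-noise setting.

```python
import numpy as np
from itertools import product

def alpha_moment(f_vals, eps, n, alpha):
    """E[(T_eps f)^alpha] under the uniform measure on {0,1}^n, where
    (T_eps f)(x) = E[f(x XOR Z)] with Z_i ~ Bernoulli(eps) i.i.d. and
    f is given as a table of values over {0,1}^n in lex order."""
    cube = list(product((0, 1), repeat=n))
    idx = {x: i for i, x in enumerate(cube)}
    f = np.asarray(f_vals, dtype=float)
    Tf = np.zeros(len(cube))
    for i, x in enumerate(cube):
        for z in cube:
            w = sum(z)                            # weight of noise pattern
            p = eps**w * (1 - eps)**(n - w)
            Tf[i] += p * f[idx[tuple(a ^ b for a, b in zip(x, z))]]
    return float(np.mean(Tf ** alpha))

# Compare a dictator f(x) = x_1 with 3-bit majority (both have mean 1/2)
n = 3
cube = list(product((0, 1), repeat=n))
dictator = [x[0] for x in cube]
majority = [int(sum(x) >= 2) for x in cube]
for eps in (0.05, 0.45):              # low-noise and high-noise regimes
    print(eps, alpha_moment(dictator, eps, n, 3),
          alpha_moment(majority, eps, n, 3))
```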