6 research outputs found

    An entropy maximization problem related to optical communication

    Motivated by a problem in optical communication, we consider the general problem of maximizing the entropy of a stationary random process that is subject to an average transition cost constraint. Using a recent result of Justesen and Høholdt, we present an exact solution to the problem and suggest a class of finite-state encoders that give a good approximation to the exact solution.
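
    One classical route to this kind of problem (going back to Shannon's treatment of the discrete noiseless channel with costs) is to tilt the allowed transitions by the cost: form a matrix with entries 2^(-s·c_ij), build a Markov chain from its Perron eigenvalue and eigenvector, and tune the tilt s until the chain's average transition cost meets the budget; among stationary chains with that average cost, this chain has the largest entropy rate. The Python sketch below illustrates the construction on a toy two-state cost matrix; the cost values, budget, and function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def maxentropic_chain(cost, s):
    """Markov chain obtained by tilting the allowed transitions by 2**(-s*cost):
    for s >= 0 it maximizes entropy rate among stationary chains whose average
    transition cost equals the average cost this chain achieves."""
    finite = np.isfinite(cost)                      # np.inf marks forbidden transitions
    c = np.where(finite, cost, 0.0)
    B = np.where(finite, 2.0 ** (-s * c), 0.0)      # tilted adjacency matrix
    eigvals, eigvecs = np.linalg.eig(B)
    k = np.argmax(eigvals.real)                     # Perron (largest) eigenvalue
    lam, v = eigvals[k].real, np.abs(eigvecs[:, k].real)
    P = B * v[None, :] / (lam * v[:, None])         # stochastic transition matrix
    w, U = np.linalg.eig(P.T)                       # stationary distribution of P
    pi = np.abs(U[:, np.argmax(w.real)].real)
    pi /= pi.sum()
    avg_cost = float((pi[:, None] * P * c).sum())
    safeP = np.where(P > 0, P, 1.0)                 # zero-probability edges contribute 0
    entropy = float(-(pi[:, None] * P * np.log2(safeP)).sum())
    return entropy, avg_cost, P

def max_entropy_at_cost(cost, budget, s_max=50.0, iters=60):
    """Bisect the tilt s so the chain's average transition cost hits the budget."""
    lo, hi = 0.0, s_max
    for _ in range(iters):
        s = 0.5 * (lo + hi)
        _, c, _ = maxentropic_chain(cost, s)
        if c < budget:
            hi = s                                  # tilt too strong: chain already cheap enough
        else:
            lo = s
    return maxentropic_chain(cost, 0.5 * (lo + hi))

# Toy example (illustrative numbers): staying in a state costs 1, switching costs 2.
cost = np.array([[1.0, 2.0],
                 [2.0, 1.0]])
H, c, P = max_entropy_at_cost(cost, budget=1.3)
print(f"entropy rate ~ {H:.3f} bits/symbol at average cost ~ {c:.3f}")
```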

    On the VC-Dimension of Binary Codes

    We investigate the asymptotic rates of length-n binary codes with VC-dimension at most dn and minimum distance at least δn. Two upper bounds are obtained, one as a simple corollary of a result by Haussler and the other via a shortening approach combining the Sauer-Shelah lemma and the linear programming bound. Two lower bounds are given using Gilbert-Varshamov-type arguments over constant-weight and Markov-type sets.
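
    Two of the classical ingredients mentioned here can be evaluated directly: the Sauer-Shelah lemma alone caps the rate of a code of VC-dimension at most dn by the binary entropy h(d) (for d ≤ 1/2), and the plain Gilbert-Varshamov bound gives the rate floor 1 - h(δ) when only the minimum-distance constraint is imposed. The Python sketch below computes these two standard reference curves; it is not a reproduction of the paper's sharper bounds, and the parameter values are made up for illustration.

```python
import math

def h2(x):
    """Binary entropy function, in bits."""
    if x <= 0.0 or x >= 1.0:
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def sauer_shelah_rate_cap(d_frac):
    """Asymptotic rate cap from the Sauer-Shelah lemma alone: a length-n code with
    VC-dimension at most d_frac*n has at most sum_{i <= d_frac*n} C(n, i), which is
    roughly 2^(n*h2(d_frac)) for d_frac <= 1/2, hence rate at most h2(d_frac)."""
    return h2(min(d_frac, 0.5))

def gv_rate_floor(delta):
    """Classical Gilbert-Varshamov lower bound 1 - h2(delta) on the rate achievable
    with relative minimum distance delta, ignoring any VC-dimension constraint."""
    return max(0.0, 1.0 - h2(delta))

for d_frac, delta in [(0.1, 0.05), (0.2, 0.1), (0.3, 0.2)]:
    print(f"d={d_frac:.2f}, delta={delta:.2f}: "
          f"Sauer-Shelah cap {sauer_shelah_rate_cap(d_frac):.3f}, "
          f"GV floor {gv_rate_floor(delta):.3f}")
```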

    Computable Lower Bounds for Capacities of Input-Driven Finite-State Channels

    This paper studies the capacities of input-driven finite-state channels, i.e., channels whose current state is a time-invariant deterministic function of the previous state and the current input. We lower bound the capacity of such a channel using a dynamic programming formulation of a bound on the maximum reverse directed information rate. We show that the dynamic programming-based bounds can be simplified by solving the corresponding Bellman equation explicitly. In particular, we provide analytical lower bounds on the capacities of (d, k)-runlength-limited input-constrained binary symmetric and binary erasure channels. Furthermore, we provide a single-letter lower bound based on a class of input distributions with memory.
    Comment: 9 pages, 8 figures, submitted to International Symposium on Information Theory, 202
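
    A standard reference point for such bounds is the noiseless capacity of the input constraint itself: for the (d, k)-RLL constraint it equals log2 of the spectral radius of the constraint-graph adjacency matrix, and the capacity of the corresponding input-constrained noisy channel cannot exceed it. The Python sketch below computes this baseline; it is not the paper's dynamic-programming lower bound, and the function name is illustrative.

```python
import numpy as np

def rll_noiseless_capacity(d, k):
    """Noiseless capacity, in bits/symbol, of the (d, k)-runlength-limited constraint:
    log2 of the spectral radius of the constraint-graph adjacency matrix.
    State i = number of 0s emitted since the most recent 1 (0 <= i <= k)."""
    A = np.zeros((k + 1, k + 1))
    for i in range(k + 1):
        if i < k:           # another 0 is still allowed
            A[i, i + 1] = 1
        if i >= d:          # at least d zeros have accumulated, so a 1 is allowed
            A[i, 0] = 1
    spectral_radius = max(abs(np.linalg.eigvals(A)))
    return float(np.log2(spectral_radius))

# e.g. the (1, 3)-RLL constraint; its capacity is roughly 0.55 bits/symbol
print(f"(1,3)-RLL noiseless capacity ~ {rll_noiseless_capacity(1, 3):.4f}")
```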

    Spectrum shaping with Markov chains


    On row-by-row coding for 2-D constraints

    A constant-rate encoder-decoder pair is presented for a fairly large family of two-dimensional (2-D) constraints. Encoding and decoding are done in a row-by-row manner, and the encoding is sliding-block decodable. Essentially, the 2-D constraint is turned into a set of independent and relatively simple one-dimensional (1-D) constraints; this is done by dividing the array into fixed-width vertical strips. Each row in a strip is seen as a symbol, and a graph presentation of the respective 1-D constraint is constructed. The maxentropic stationary Markov chain on this graph is then considered: a perturbed version of the corresponding probability distribution on the edges of the graph is used to build an encoder that operates in parallel on the strips. This perturbation is found by means of a network flow with upper and lower bounds on the flow through the edges. A key part of the encoder is an enumerative coder for constant-weight binary words. A fast realization of this coder is shown, using floating-point arithmetic.
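
    For orientation, the textbook exact-integer form of an enumerative coder for constant-weight words (Cover's scheme) maps a word to its lexicographic rank and back through binomial coefficients; the fast floating-point realization described in the paper is not reproduced here. The Python sketch below shows the exact-integer version; function names are illustrative.

```python
from math import comb

def cw_rank(bits):
    """Lexicographic rank of a constant-weight binary word (exact-integer version)."""
    n, w, rank = len(bits), sum(bits), 0
    for i, b in enumerate(bits):
        if b:
            rank += comb(n - i - 1, w)   # skip every word that still has a 0 here
            w -= 1
    return rank

def cw_unrank(rank, n, w):
    """Inverse map: rank -> the length-n, weight-w binary word with that rank."""
    bits = []
    for i in range(n):
        zero_here = comb(n - i - 1, w)   # completions that place a 0 in position i
        if rank < zero_here:
            bits.append(0)
        else:
            bits.append(1)
            rank -= zero_here
            w -= 1
    return bits

# Round-trip check over all weight-3 words of length 8.
words = [cw_unrank(r, 8, 3) for r in range(comb(8, 3))]
assert all(cw_rank(word) == r for r, word in enumerate(words))
print(comb(8, 3), "words indexed without collision")
```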