
    Information capacity of optical fiber channels with zero average dispersion

    We study the statistics of optical data transmission in a noisy nonlinear fiber channel with weak dispersion management and zero average dispersion. Applying path integral methods, we find the exact probability density functions of the channel output, both for a nonlinear noisy channel and for a linear channel with additive and multiplicative noise. We also obtain an analytic lower bound on the Shannon capacity of the considered nonlinear fiber channel. Comment: 4 pages, submitted to Phys. Rev. Lett.
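
    A minimal sketch of the setting, under standard modeling assumptions for this line of work (the symbols gamma, D, L and P below are illustrative notation, not necessarily the paper's): the complex field E(z,t) propagates over a fiber of length L with Kerr nonlinearity, zero average dispersion and additive amplifier noise,

        \partial_z E = i\gamma |E|^2 E + n(z,t), \qquad
        \langle n(z,t)\, n^*(z',t') \rangle = D\,\delta(z-z')\,\delta(t-t'),

    and the quantity being bounded is the Shannon capacity of the induced per-sample channel x -> y under an average power constraint,

        C = \max_{p(x)\,:\;\mathbb{E}|x|^2 \le P} I(X;Y).

    Knowing the conditional density p(y|x) exactly (here from the path integral) gives a lower bound on C by evaluating I(X;Y) for any fixed input distribution.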

    Quantum reading capacity: General definition and bounds

    Quantum reading refers to the task of reading out classical information stored in a read-only memory device. In any such protocol, the transmitter and receiver are in the same physical location, and the goal is to use these devices (modeled by independent quantum channels), coupled with a quantum strategy, to read out as much information as possible from a memory device such as a CD or DVD. As a consequence of the physical setup of quantum reading, the most natural and general definition of quantum reading capacity should allow for an adaptive operation after each call to the channel, and this is how we define quantum reading capacity in this paper. We also establish several bounds on quantum reading capacity, and we introduce an environment-parametrized memory cell with associated environment states, delivering second-order and strong converse bounds for its quantum reading capacity. We calculate the quantum reading capacities of some exemplary memory cells, including a thermal memory cell, a qudit erasure memory cell, and a qudit depolarizing memory cell. Finally, we provide an explicit example to illustrate the advantage of using an adaptive strategy in the context of zero-error quantum reading capacity. Comment: v3: 17 pages, 2 figures, final version published in IEEE Transactions on Information Theory.
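
    As a deliberately simplified illustration of a non-adaptive reading strategy, the Python sketch below takes a hypothetical binary memory cell made of two qubit depolarizing channels, probes every call with the same unentangled state, and computes the Holevo information of the induced output ensemble, which is an achievable reading rate for that fixed probe by the Holevo-Schumacher-Westmoreland theorem. The channel parameters, priors and probe are illustrative choices, not values from the paper.

        import numpy as np

        def depolarizing(rho, q):
            # Qubit depolarizing channel: keep rho with probability 1-q,
            # replace it by the maximally mixed state with probability q.
            return (1 - q) * rho + q * np.eye(2) / 2

        def entropy(rho):
            # Von Neumann entropy in bits.
            evals = np.linalg.eigvalsh(rho)
            evals = evals[evals > 1e-12]
            return float(-np.sum(evals * np.log2(evals)))

        # Hypothetical binary memory cell: two depolarizing channels with
        # different strengths (illustrative values, not from the paper).
        q0, q1 = 0.1, 0.6
        priors = [0.5, 0.5]
        probe = np.array([[1.0, 0.0], [0.0, 0.0]])   # fixed probe |0><0|

        outputs = [depolarizing(probe, q0), depolarizing(probe, q1)]
        average = sum(p * rho for p, rho in zip(priors, outputs))

        # Holevo information of the ensemble {p_x, N_x(probe)}: an achievable
        # non-adaptive reading rate (bits per channel call) for this probe.
        chi = entropy(average) - sum(p * entropy(rho)
                                     for p, rho in zip(priors, outputs))
        print(f"non-adaptive rate with this probe: {chi:.4f} bits/call")

    The adaptive protocols defined in the paper are strictly more general than schemes of this fixed-probe form, and the paper's zero-error example shows that the extra generality can pay off.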

    Zero-error capacity of binary channels with memory

    We begin a systematic study of the problem of the zero-error capacity of noisy binary channels with memory and solve some of the non-trivial cases. Comment: 10 pages. This paper is a revised version of our previous paper with the same title, published on arXiv on February 3, 2014. We complete Theorem 2 of the previous version by showing here that our previous construction is asymptotically optimal. This proves that the isometric triangles yield different capacities. The new manuscript differs from the old one by the addition of one more page.
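
    For context, a standard way to state the quantity under study: writing M(n) for the largest number of length-n input words no two of which can ever produce the same output, the zero-error capacity is

        C_0 = \limsup_{n\to\infty} \frac{1}{n} \log_2 M(n),

    which for a memoryless channel reduces to Shannon's graph-theoretic form C_0 = \lim_{n\to\infty} \frac{1}{n} \log_2 \alpha(G^{\boxtimes n}), with G the confusability graph of the channel and \alpha the independence number. For the binary channels with memory studied here, confusability depends on entire input words, so no single-letter graph description is available.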

    Asymptotics of input-constrained binary symmetric channel capacity

    We study the classical problem of noisy constrained capacity in the case of the binary symmetric channel (BSC), namely, the capacity of a BSC whose inputs are sequences chosen from a constrained set. Motivated by a result of Ordentlich and Weissman [In Proceedings of IEEE Information Theory Workshop (2004) 117-122], we derive an asymptotic formula (when the noise parameter is small) for the entropy rate of a hidden Markov chain, observed when a Markov chain passes through a BSC. Using this result, we establish an asymptotic formula for the capacity of a BSC with input process supported on an irreducible finite-type constraint, as the noise parameter tends to zero. Comment: Published at http://dx.doi.org/10.1214/08-AAP570 in the Annals of Applied Probability (http://www.imstat.org/aap/) by the Institute of Mathematical Statistics (http://www.imstat.org).
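
    Since the central object in the argument is the entropy rate of the hidden Markov process obtained by passing a constrained Markov chain through a BSC, the Python sketch below estimates that rate by Monte Carlo with the standard forward recursion. The constraint (no two consecutive ones), transition probabilities and crossover probability are illustrative choices, not parameters taken from the paper.

        import numpy as np

        rng = np.random.default_rng(0)

        # Illustrative binary Markov chain supported on the "no two consecutive
        # ones" constraint, observed through a BSC with crossover probability eps.
        P = np.array([[0.5, 0.5],    # transitions out of state 0
                      [1.0, 0.0]])   # a 1 must be followed by a 0
        pi = np.array([2/3, 1/3])    # stationary distribution of P
        eps = 0.05
        n = 50_000

        # Simulate the Markov chain and its noisy observation.
        x = np.empty(n, dtype=int)
        x[0] = rng.choice(2, p=pi)
        for t in range(1, n):
            x[t] = rng.choice(2, p=P[x[t - 1]])
        y = x ^ (rng.random(n) < eps).astype(int)   # BSC output

        # Forward recursion: accumulate log2 P(y_1..y_n); the entropy rate of the
        # hidden Markov output is the limit of -(1/n) log2 P(y_1..y_n).
        alpha = pi.copy()            # predicted distribution of the hidden state
        log_prob = 0.0
        for t in range(n):
            lik = np.where(np.arange(2) == y[t], 1 - eps, eps)  # P(y_t | x_t)
            alpha = alpha * lik
            norm = alpha.sum()       # P(y_t | y_1..y_{t-1})
            log_prob += np.log2(norm)
            alpha = (alpha / norm) @ P   # predict the next hidden state
        print(f"estimated entropy rate: {-log_prob / n:.4f} bits/symbol")

    The asymptotic formulas in the paper describe how this entropy rate, and with it the noisy constrained capacity, behaves as the crossover probability tends to zero.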