
    Memory effects can make the transmission capability of a communication channel uncomputable

    Most communication channels are subject to noise. A central goal of information theory is to add redundancy to the transmitted message so that the information is conveyed reliably while the rate of transmission through the channel remains as large as possible. The maximum rate at which reliable transmission is possible is called the capacity. If the channel keeps no memory of its past, the capacity is given by a simple optimization problem and can be computed efficiently. The situation for channels with memory is less clear. Here we show that for channels with memory the capacity cannot be computed to within precision 1/5. Our result holds even if we consider one of the simplest families of such channels (information-stable finite-state machine channels), restrict the input and output of the channel to 4 bits and 1 bit respectively, and allow 6 bits of memory.
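    For the memoryless case mentioned above, the "simple optimization problem" is the maximization of mutual information over input distributions, which the standard Blahut-Arimoto algorithm solves to any desired precision; the abstract's point is that no analogous procedure can exist once memory is allowed. Below is a minimal sketch of that memoryless computation (the binary symmetric channel is an illustrative choice, not the paper's 4-bit-input construction):

```python
import numpy as np

def blahut_arimoto(P, tol=1e-9, max_iter=10_000):
    """Capacity (bits) of a memoryless channel with transition matrix
    P[x, y] = Pr(y | x), via the standard Blahut-Arimoto iteration.
    Assumes every output symbol is reachable from some input."""
    n_in = P.shape[0]
    r = np.full(n_in, 1.0 / n_in)             # input distribution, start uniform
    for _ in range(max_iter):
        q = r[:, None] * P                     # r(x) P(y|x)
        q /= q.sum(axis=0, keepdims=True)      # posterior q(x|y)
        log_q = np.where(P > 0, np.log(np.where(q > 0, q, 1.0)), 0.0)
        r_new = np.exp((P * log_q).sum(axis=1))  # r(x) ∝ exp Σ_y P(y|x) log q(x|y)
        r_new /= r_new.sum()
        if np.abs(r_new - r).max() < tol:
            r = r_new
            break
        r = r_new
    # mutual information at the optimizing input distribution
    q = r[:, None] * P
    q /= q.sum(axis=0, keepdims=True)
    mask = (P > 0) & (r[:, None] > 0)
    ratio = np.where(mask, q / np.where(r[:, None] > 0, r[:, None], 1.0), 1.0)
    return (r[:, None] * P * np.log2(ratio)).sum(), r

# Binary symmetric channel with crossover 0.1.
C, r_opt = blahut_arimoto(np.array([[0.9, 0.1], [0.1, 0.9]]))
print(f"C ≈ {C:.4f} bits at input distribution {r_opt}")
```

    For this channel the iteration converges to 1 - H(0.1) ≈ 0.531 bits, matching the closed form.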

    Computation with narrow CTCs

    We examine some variants of computation with closed timelike curves (CTCs), where various restrictions are imposed on the memory of the computer and on the information-carrying capacity and range of the CTC. We give full characterizations of the classes of languages recognized by polynomial-time probabilistic and quantum computers that can send a single classical bit to their own past. Such narrow CTCs are demonstrated to add the power of limited nondeterminism to deterministic computers, and to lead to exponential speedup in constant-space probabilistic and quantum computation. We show that, given a time machine with constant negative delay, one can implement CTC-based computations without needing to know the runtime beforehand.
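    The self-consistency requirement on a single classical bit sent to its own past can be made concrete: in a Deutsch-style treatment, the bit's distribution must be a fixed point of the stochastic map that the computation applies to it, and such a fixed point always exists because a stochastic matrix has eigenvalue 1. A minimal sketch (the particular flip probability is a made-up example):

```python
import numpy as np

def ctc_fixed_point(M):
    """Stationary distribution p with M @ p = p for a column-stochastic
    matrix M describing how the computation rewrites the CTC bit
    (Deutsch-style causal consistency for one classical bit)."""
    w, v = np.linalg.eig(M)
    i = np.argmin(np.abs(w - 1.0))   # eigenvalue 1 always exists
    p = np.real(v[:, i])
    return p / p.sum()               # normalize (also fixes overall sign)

# Example: the computation flips the CTC bit with probability 0.3.
M = np.array([[0.7, 0.3],
              [0.3, 0.7]])
print(ctc_fixed_point(M))            # -> [0.5, 0.5], the self-consistent history
```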

    Lurching Toward Chernobyl: Dysfunctions of Real-Time Computation

    Cognitive biological structures, social organizations, and computing machines operating in real time are subject to Rate Distortion Theorem constraints driven by the homology between information source uncertainty and free energy density. This exposes the unitary structure/environment system to a relentless entropic torrent, compounded by sudden large deviations, that causes increased distortion between intent and impact, particularly as demands escalate. The phase transitions characteristic of information phenomena suggest that, rather than decaying gracefully under increasing load, these structures will undergo punctuated degradation akin to spontaneous symmetry breaking in physical systems. Rate distortion problems, which also affect internal structural dynamics, can become synergistic with limitations equivalent to the inattentional blindness of natural cognitive processes. These mechanisms, and their interactions, are unlikely to scale well, so that, depending on architecture, enlarging the structure or its duties may lead to a crossover point at which added resources must be almost entirely devoted to ensuring system stability, a form of allometric scaling familiar from biological examples. This suggests a critical need to tune architecture to problem type and system demand. A real-time computational structure and its environment are a unitary phenomenon, and environments are usually idiosyncratic. Thus the resulting path dependence in the development of pathology could often require an individualized approach to remediation, more akin to an arduous psychiatric intervention than to the traditional engineering or medical quick fix. Failure to recognize the depth of these problems seems likely to produce a relentless chain of Chernobyl-like failures that are necessary, but often insufficient, for remediation under our system.
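    The Rate Distortion Theorem constraint invoked here has a simple closed form in the binary case, which illustrates the abstract's core tradeoff: as the rate available to a real-time structure shrinks, the minimum achievable distortion between intent and impact necessarily grows. A sketch for a Bernoulli(1/2) source under Hamming distortion, where R(D) = 1 - H(D) (the choice of source is illustrative):

```python
import numpy as np

def h2(x):
    """Binary entropy in bits."""
    x = np.clip(x, 1e-12, 1 - 1e-12)
    return -x * np.log2(x) - (1 - x) * np.log2(1 - x)

def rate_distortion_binary(p, D):
    """R(D) = H(p) - H(D) for a Bernoulli(p) source under Hamming
    distortion, valid for 0 <= D <= min(p, 1-p); zero beyond that."""
    return max(h2(p) - h2(D), 0.0) if D < min(p, 1 - p) else 0.0

# Lower tolerated distortion demands strictly more rate; conversely,
# a shrinking rate budget forces the achievable distortion upward.
for D in (0.01, 0.05, 0.10, 0.20):
    print(f"D = {D:.2f}  ->  R(D) = {rate_distortion_binary(0.5, D):.3f} bits")
```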

    Gene Expression and its Discontents: Developmental disorders as dysfunctions of epigenetic cognition

    Systems biology presently suffers the same mereological and sufficiency fallacies that haunt neural network models of high-order cognition. Shifting perspective from the massively parallel space of gene matrix interactions to the grammar/syntax of the time series of expressed phenotypes, using a cognitive paradigm, permits the import of techniques from statistical physics via the homology between information source uncertainty and free energy density. This produces a broad spectrum of possible statistical models of development and its pathologies, in which epigenetic regulation and the effects of the embedding environment are analogous to a tunable enzyme catalyst. A cognitive paradigm naturally incorporates memory, leading directly to models of epigenetic inheritance, as affected by environmental exposures, in the largest sense. Understanding gene expression, development, and their dysfunctions will require data analysis tools considerably more sophisticated than the present crop of simplistic models abducted from neural network studies or stochastic chemical reaction theory.

    A Proof of Entropy Minimization for Outputs in Deletion Channels via Hidden Word Statistics

    From the output produced by a memoryless deletion channel from a uniformly random input of known length $n$, one obtains a posterior distribution on the channel input. The difference between the Shannon entropy of this distribution and that of the uniform prior measures the amount of information about the channel input which is conveyed by the output of length $m$, and it is natural to ask for which outputs this is extremized. This question was posed in a previous work, where it was conjectured on the basis of experimental data that the entropy of the posterior is minimized by the constant strings $\texttt{000}\ldots$ and $\texttt{111}\ldots$ and maximized by the alternating strings $\texttt{0101}\ldots$ and $\texttt{1010}\ldots$. In the present work we confirm the minimization conjecture in the asymptotic limit using results from hidden word statistics. We show how the analytic-combinatorial methods of Flajolet, Szpankowski and Vallée for the hidden pattern matching problem can be applied to resolve the case of fixed output length and $n \rightarrow \infty$, by obtaining estimates for the entropy in terms of the moments of the posterior distribution and establishing its minimization via a measure of autocorrelation.
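    The object under study is easy to reproduce numerically at small blocklengths: under i.i.d. deletions, the likelihood of an input $x$ given an output $y$ is proportional to the number of embeddings of $y$ as a subsequence of $x$, so the posterior entropy can be computed by brute force. A sketch (parameters chosen arbitrarily; this is the kind of experimental computation behind the conjecture, not the paper's asymptotic proof):

```python
from itertools import product
from math import log2

def embeddings(x, y):
    """Number of ways y occurs as a subsequence of x; for i.i.d. deletions
    this is the channel likelihood up to a factor independent of x."""
    f = [1] + [0] * len(y)
    for xi in x:
        for j in range(len(y), 0, -1):   # descending j: each xi used once
            if y[j - 1] == xi:
                f[j] += f[j - 1]
    return f[len(y)]

def posterior_entropy(y, n):
    """Entropy (bits) of the posterior over uniform length-n inputs,
    given that the deletion channel output was y."""
    counts = [embeddings(x, y) for x in product("01", repeat=n)]
    total = sum(counts)
    return -sum(c / total * log2(c / total) for c in counts if c)

n = 12
for y in ("000", "010"):
    print(y, f"{posterior_entropy(y, n):.4f}")
```

    Per the conjecture, the constant output should yield the smaller posterior entropy of the two.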

    Skip-Sliding Window Codes

    Constrained coding is used widely in digital communication and storage systems. In this paper, we study a generalized sliding window constraint called the skip-sliding window. A skip-sliding window (SSW) code is defined in terms of the length $L$ of a sliding window, a skip length $J$, and a cost constraint $E$ in each sliding window. Each valid codeword of length $L + kJ$ is determined by $k+1$ windows of length $L$, where window $i$ starts at the $(iJ + 1)$th symbol for all non-negative integers $i$ such that $i \leq k$, and the cost constraint $E$ must be satisfied in each window. In this work, two methods are given to enumerate the size of SSW codes, and further refinements are made to reduce the enumeration complexity. Using the proposed enumeration methods, the noiseless capacity of binary SSW codes is determined, and it is observed, for example, that SSW codes can achieve greater capacity than other classes of codes. Moreover, some noisy capacity bounds are given. SSW coding constraints arise in various applications, including simultaneous energy and information transfer.
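    The enumeration behind the noiseless capacity can be illustrated by brute force at small parameters: count the codewords satisfying the window constraint and take log2(size)/blocklength. The sketch below assumes the per-window cost is the Hamming weight and the constraint is at least $E$ ones per window (a reading suggested by the energy-transfer application; the paper treats window costs more generally):

```python
from itertools import product
from math import log2

def ssw_count(L, J, E, k):
    """Number of binary skip-sliding-window codewords of length L + k*J:
    every window of length L starting at symbol i*J + 1 (0 <= i <= k)
    must contain at least E ones. Cost = Hamming weight is an assumption
    made for illustration."""
    n = L + k * J
    count = 0
    for w in product((0, 1), repeat=n):
        if all(sum(w[i * J:i * J + L]) >= E for i in range(k + 1)):
            count += 1
    return count

# Noiseless-rate estimate log2(|code|)/blocklength for toy parameters.
L, J, E = 4, 2, 2
for k in range(1, 5):
    n = L + k * J
    c = ssw_count(L, J, E, k)
    print(f"k={k}: {c} codewords of length {n}, rate ≈ {log2(c)/n:.3f}")
```

    As $k$ grows the rate estimate approaches the noiseless capacity; the paper's transfer-matrix methods reach the same limit without exponential enumeration.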

    Algorithmic complexity of quantum capacity

    Recently the theory of communication developed by Shannon has been extended to the quantum realm by exploiting the rules of quantum theory. The latter rests on complex vector spaces. However, complex (as well as real) numbers are idealizations: they are not available in practice, where we can only deal with rational numbers. This fact naturally leads to the question of whether the notions of capacity developed for quantum channels truly capture their ability to transmit information. Here we answer this question for the quantum capacity. To this end we resort to the notion of semi-computability in order to describe quantum states and quantum channel maps approximately, by rational numbers. We then introduce algorithmic entropies (such as algorithmic quantum coherent information) and derive their relevant properties. Finally, we define the algorithmic quantum capacity and prove that it equals the standard one.
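    The quantity at the heart of this program, the coherent information, is a concrete numerical object: for a channel $N$ given by Kraus operators, $I_c(\rho, N) = S(N(\rho)) - S((N \otimes \mathrm{id})(\varphi_\rho))$ with $\varphi_\rho$ a purification of $\rho$. A minimal sketch for the qubit amplitude damping channel, restricting to diagonal inputs for simplicity (the channel and parameters are illustrative choices, not the paper's construction):

```python
import numpy as np

def von_neumann(rho):
    """Von Neumann entropy in bits."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-(ev * np.log2(ev)).sum())

def coherent_information(kraus, p):
    """I_c for input rho = diag(p, 1-p): S(N(rho)) - S((N ⊗ id)(phi)),
    where phi is the canonical purification of rho."""
    # purification |phi> = sqrt(p)|00> + sqrt(1-p)|11>, ordering index = 2a + r
    phi = np.zeros(4)
    phi[0], phi[3] = np.sqrt(p), np.sqrt(1 - p)
    joint = np.zeros((4, 4), dtype=complex)
    for K in kraus:                      # apply N to the system half only
        v = np.kron(K, np.eye(2)) @ phi
        joint += np.outer(v, v.conj())
    # N(rho) = partial trace of the joint state over the reference qubit
    out = joint.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)
    return von_neumann(out) - von_neumann(joint)

# Amplitude damping with decay gamma = 0.2 (a hypothetical test channel).
g = 0.2
K0 = np.array([[1, 0], [0, np.sqrt(1 - g)]])
K1 = np.array([[0, np.sqrt(g)], [0, 0]])
best = max(coherent_information([K0, K1], p) for p in np.linspace(0.01, 0.99, 99))
print(f"max_p I_c ≈ {best:.4f} bits")
```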