Fountain Capacity
Fountain codes are currently employed for reliable and efficient transmission of information via erasure channels with unknown erasure rates. This correspondence introduces the notion of fountain capacity for arbitrary channels. In contrast to the conventional definition of rate, in the fountain setup the definition of rate penalizes the reception of symbols by the receiver rather than their transmission. Fountain capacity measures the maximum rate compatible with reliable reception regardless of the erasure pattern. We show that fountain capacity and Shannon capacity are equal for stationary memoryless channels. In contrast, Shannon capacity may exceed fountain capacity if the channel has memory or is not stationary.
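The fountain setup can be illustrated with a toy random linear fountain over GF(2): the encoder emits an unbounded stream of coded symbols, and rate is measured against the symbols the receiver actually collects rather than those sent. This is a minimal sketch for intuition, not the paper's construction; the block length k and the erasure probability are illustrative choices.

```python
import random

def encode_symbol(source_bits, rng):
    # Each coded symbol is the mod-2 sum (XOR) of a uniformly random
    # subset of the k source bits, described by a 0/1 mask.
    mask = [rng.randrange(2) for _ in source_bits]
    value = sum(m & b for m, b in zip(mask, source_bits)) % 2
    return mask, value

def solve_gf2(rows, k):
    """Gaussian elimination over GF(2) on (mask, value) equations.
    Returns the k source bits, or None while rank < k."""
    basis = {}  # pivot column -> reduced (mask, value)
    for mask, value in rows:
        mask = mask[:]
        for col in sorted(basis):          # eliminate known pivots, left to right
            if mask[col]:
                bmask, bval = basis[col]
                mask = [a ^ b for a, b in zip(mask, bmask)]
                value ^= bval
        lead = next((i for i, bit in enumerate(mask) if bit), None)
        if lead is not None:
            basis[lead] = (mask, value)
    if len(basis) < k:
        return None
    bits = [0] * k                         # back-substitute in reverse pivot order
    for col in sorted(basis, reverse=True):
        mask, value = basis[col]
        bits[col] = value ^ (sum(mask[j] & bits[j] for j in range(col + 1, k)) % 2)
    return bits

rng = random.Random(0)
k = 16
source = [rng.randrange(2) for _ in range(k)]
received, sent, bits = [], 0, None
while bits is None:
    sent += 1
    mask, value = encode_symbol(source, rng)
    if rng.random() < 0.3:                 # symbol erased by the channel
        continue
    received.append((mask, value))         # rate is charged per *received* symbol
    bits = solve_gf2(received, k)
assert bits == source
print(f"decoded after receiving {len(received)} symbols ({sent} sent)")
```

Note that the decoder succeeds after collecting roughly k linearly independent symbols, independent of which transmissions were erased; this is the "regardless of the erasure pattern" property the abstract refers to.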
Computational Mechanics of Input-Output Processes: Structured transformations and the ε-transducer
Computational mechanics quantifies structure in a stochastic process via its causal states, leading to the process's minimal, optimal predictor, the ε-machine. We extend computational mechanics to communication channels between two processes, obtaining an analogous optimal model, the ε-transducer, of the stochastic mapping between them. Here, we lay the foundation of a structural analysis of communication channels, treating joint processes and processes with input. The result is a principled structural analysis of the mechanisms that support information flow between processes. It is the first in a series on the structural information theory of memoryful channels, channel composition, and allied conditional information measures.
Comment: 30 pages, 19 figures; http://csc.ucdavis.edu/~cmg/compmech/pubs/et1.htm; updated to conform to the published version, plus additional corrections and updates
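Concretely, an ε-machine can be written down as a set of causal states with probability-labeled transitions. The example below uses the standard Golden Mean process (no two consecutive 1s), a textbook two-state ε-machine chosen for illustration; it is not drawn from this particular paper.

```python
import random

# Toy ε-machine for the Golden Mean process:
# state A emits 0 (stay in A) or 1 (go to B), each with prob 1/2;
# state B must emit 0 and return to A, so "11" never occurs.
MACHINE = {
    "A": [(0.5, "0", "A"), (0.5, "1", "B")],
    "B": [(1.0, "0", "A")],
}

def generate(machine, start, n, rng):
    """Sample n symbols by walking the machine's labeled transitions."""
    state, out = start, []
    for _ in range(n):
        r, acc = rng.random(), 0.0
        for prob, symbol, nxt in machine[state]:
            acc += prob
            if r <= acc:
                out.append(symbol)
                state = nxt
                break
    return "".join(out)

seq = generate(MACHINE, "A", 200, random.Random(1))
assert "11" not in seq   # the support constraint of the Golden Mean process
```

The causal states A and B are exactly the equivalence classes of pasts that predict the future identically, which is what makes this the process's minimal optimal predictor.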
Feedback communication over unknown channels
Suppose Q is a family of discrete memoryless channels. An unknown member of Q will be available, with perfect, causal output feedback for communication. Is there a coding scheme (possibly with variable transmission time) that can achieve the Burnashev error exponent uniformly over Q? For two families of channels we show that the answer is yes. Furthermore, for each of these two classes, in addition to achieving the maximum error exponent, it is possible to uniformly attain any given fraction of the channel capacity. Therefore, in terms of achievable rates and delay, there are situations in which knowledge of the channel becomes irrelevant. In the second part of the thesis, we show that for arbitrary sets of channels the Burnashev error exponent cannot in general be uniformly achieved. In particular, we give a sufficient condition for a pair of channels such that no coding strategy reaches Burnashev's exponent simultaneously on both channels. As a third part, we study a scenario where communication is carried out by first testing the channel by means of a training sequence, then coding according to the channel estimate. We provide an upper bound on the maximum achievable error exponent of such coding schemes. This bound is typically much lower than the maximum achievable error exponent over a channel with feedback. For example, in the case of binary symmetric channels this bound has a slope that vanishes at capacity. This result suggests that, in terms of error exponent, a good universal feedback scheme combines channel estimation with information delivery, rather than separating them. In the final chapter, we address the question of communicating quickly and reliably. We consider a simple situation of two-message communication over a known channel with feedback. We propose a simple decoding rule, and show that it minimizes a weighted combination of the probability of error and decoding delay for a certain range of crossover probabilities and combination weights.
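The two-message setting in the final chapter can be pictured with a generic sequential test (not necessarily the thesis's exact rule): the sender repeats the message bit over a BSC(p), and the receiver accumulates a log-likelihood ratio, stopping as soon as one hypothesis is sufficiently more likely. The threshold trades error probability against decoding delay; p and the threshold below are illustrative.

```python
import math
import random

def sprt_decode(message, p, threshold, rng):
    """Two-message communication over a BSC(p): the sender repeats the
    message bit; the receiver runs a sequential probability ratio test,
    stopping when |LLR| crosses the threshold.
    Returns (decoded_bit, delay in channel uses)."""
    llr = 0.0                          # log P(obs | m=1) - log P(obs | m=0)
    step = math.log((1 - p) / p)       # LLR increment per received symbol
    delay = 0
    while abs(llr) < threshold:
        delay += 1
        received = message ^ (rng.random() < p)   # BSC flips with prob p
        llr += step if received == 1 else -step
    return (1 if llr > 0 else 0), delay

rng = random.Random(2)
trials, errors, total_delay = 500, 0, 0
for _ in range(trials):
    decoded, delay = sprt_decode(1, p=0.1, threshold=8.0, rng=rng)
    errors += decoded != 1
    total_delay += delay
print(f"errors: {errors}/{trials}, mean delay: {total_delay / trials:.1f}")
```

Raising the threshold lowers the error probability roughly exponentially while growing the expected delay only linearly, which is the error-exponent/delay trade-off the abstract studies.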
Space Communications: Theory and Applications. Volume 3: Information Processing and Advanced Techniques. A Bibliography, 1958 - 1963
Annotated bibliography on information processing and advanced communication techniques - theory and applications of space communication
Reinforcement Learning
Brains rule the world, and brain-like computation is increasingly used in computers and electronic devices. Brain-like computation is about processing and interpreting data, or directly proposing and performing actions. Learning is a very important aspect. This book is on reinforcement learning, which involves performing actions to achieve a goal. The first 11 chapters of this book describe and extend the scope of reinforcement learning. The remaining 11 chapters show that there is already wide usage in numerous fields. Reinforcement learning can tackle control tasks that are too complex for traditional, hand-designed, non-learning controllers. As learning computers can deal with technical complexities, the task of human operators remains to specify goals at increasingly higher levels. This book shows that reinforcement learning is a very dynamic area in terms of theory and applications, and it should stimulate and encourage new research in this field.
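The core idea, performing actions to achieve a goal, is captured by tabular Q-learning on a toy problem. This sketch is a generic textbook example, not taken from the book; the chain environment and all hyperparameters are illustrative.

```python
import random

def q_learning(n_states=5, episodes=500, alpha=0.5, gamma=0.9, eps=0.1, seed=0):
    """Tabular Q-learning on a toy chain: from state s, action 0 moves
    left (floored at 0) and action 1 moves right; reaching the rightmost
    state yields reward 1 and ends the episode."""
    rng = random.Random(seed)
    Q = [[0.0, 0.0] for _ in range(n_states)]   # Q[state][action]
    for _ in range(episodes):
        s = 0
        while s < n_states - 1:
            # epsilon-greedy action choice, ties broken at random
            if rng.random() < eps or Q[s][0] == Q[s][1]:
                a = rng.randrange(2)
            else:
                a = 0 if Q[s][0] > Q[s][1] else 1
            s2 = max(0, s - 1) if a == 0 else s + 1
            r = 1.0 if s2 == n_states - 1 else 0.0
            best_next = 0.0 if s2 == n_states - 1 else max(Q[s2])
            # one-step temporal-difference update
            Q[s][a] += alpha * (r + gamma * best_next - Q[s][a])
            s = s2
    return Q

Q = q_learning()
# After training, the greedy policy moves right from every interior state.
assert all(q[1] > q[0] for q in Q[:-1])
```

No controller was hand-designed here: the agent discovers the goal-directed policy purely from reward feedback, which is the point the abstract makes about replacing hand-designed controllers.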