
Musing upon Information Theory

By the Editorial Committee: Helmut Bölcskei, Meir Feder, Joerg Kliewer, Andy Singer, and Te Sun Han

Abstract

Information theory can primarily be regarded as a discipline that links two different kinds of quantities: on one hand, operational quantities defined through operational concepts such as source, channel, capacity, encoder, decoder, codeword length, compression rate, transmission rate, and probability of error (or the convergence rate of the error probability); on the other, information-theoretic quantities such as entropy, divergence, and mutual information. The theoretical and mathematical core of information theory is now called the Shannon Theory, initiated by Shannon in 1948 [1], more than six decades ago. In this connection, for instance, Gray and Ornstein [2, p. 294] argue that “A principal goal of the Shannon Theory is to prove coding theorems relating such operational capacities to information-theoretic extremum problems; that is, to quantities involving …
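As a minimal illustration of the link the abstract describes (added here for clarity, not drawn from the editorial itself): Shannon's channel coding theorem equates the operational capacity C of a discrete memoryless channel W(y|x), defined as the supremum of transmission rates achievable with vanishing error probability, with an extremum of the information-theoretic quantity mutual information; likewise, the lossless source coding theorem equates the minimum achievable compression rate with the entropy.

% Channel coding theorem: operational capacity = mutual-information maximum,
% where P_Y(y) = \sum_{x'} P_X(x') W(y|x') is the output distribution.
\[
  C \;=\; \max_{P_X} I(X;Y)
    \;=\; \max_{P_X} \sum_{x,y} P_X(x)\, W(y \mid x)
          \log \frac{W(y \mid x)}{\sum_{x'} P_X(x')\, W(y \mid x')} .
\]
% Lossless source coding theorem: the minimum compression rate (an
% operational quantity) for a memoryless source X equals its entropy.
\[
  R_{\min} \;=\; H(X) \;=\; -\sum_{x} P_X(x) \log P_X(x) .
\]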

Year: 2013
OAI identifier: oai:CiteSeerX.psu:10.1.1.352.3319
Provided by: CiteSeerX

