
    A statistical mechanical interpretation of instantaneous codes

    In this paper we develop a statistical mechanical interpretation of the noiseless source coding scheme based on an absolutely optimal instantaneous code. Notions from statistical mechanics such as statistical mechanical entropy, temperature, and thermal equilibrium are translated into the context of noiseless source coding. In particular, it is found that a temperature of 1 corresponds to the average codeword length of an instantaneous code in this statistical mechanical interpretation of the noiseless source coding scheme. This correspondence is also verified by an investigation using the box-counting dimension. Using the notion of temperature and statistical mechanical arguments, several information-theoretic relations can be derived in a manner that appeals to intuition.
    Comment: 5 pages, Proceedings of the 2007 IEEE International Symposium on Information Theory, pp. 1906 - 1910, Nice, France, June 24 - 29, 2007
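    The correspondence between temperature and average codeword length can be made concrete for the special case the abstract names: an absolutely optimal instantaneous code, whose codeword lengths are exactly -log2 of the source probabilities. The following Python sketch is my own illustration, not code from the paper; the source distribution and codeword table are hypothetical.

```python
# A minimal sketch (not from the paper): for a dyadic source, an absolutely
# optimal instantaneous code has codeword lengths l_i = -log2 p_i, so the
# average codeword length equals the Shannon entropy -- the quantity the
# paper identifies with the temperature T = 1.
from math import log2

# Hypothetical dyadic source: probabilities are powers of 1/2.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

# One concrete prefix-free (instantaneous) code with l_i = -log2 p_i.
code = {"a": "0", "b": "10", "c": "110", "d": "111"}

kraft_sum = sum(2 ** -len(w) for w in code.values())       # Kraft equality: 1.0
avg_len = sum(p * len(code[s]) for s, p in probs.items())   # average codeword length
entropy = -sum(p * log2(p) for p in probs.values())         # Shannon entropy

print(kraft_sum, avg_len, entropy)   # 1.0 1.75 1.75
```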

    The Tsallis entropy and the Shannon entropy of a universal probability

    We study the properties of the Tsallis entropy and the Shannon entropy from the point of view of algorithmic randomness. In algorithmic information theory, there are two equivalent ways to define the program-size complexity K(s) of a given finite binary string s. In the standard way, K(s) is defined as the length of the shortest input string for the universal self-delimiting Turing machine to output s. In the other way, the so-called universal probability m is introduced first, and then K(s) is defined as -log_2 m(s) without reference to the concept of program size. In this paper, we investigate the properties of the Shannon entropy, the power sum, and the Tsallis entropy of a universal probability by means of the notion of program-size complexity. We determine the convergence or divergence of each of these three quantities, and evaluate the degree of randomness of each one that converges.
    Comment: 5 pages, to appear in the Proceedings of the 2008 IEEE International Symposium on Information Theory, Toronto, ON, Canada, July 6 - 11, 2008
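    The three quantities the abstract studies are straightforward to write down once a probability is fixed. Since the universal probability m is only lower-semicomputable and cannot be evaluated exactly, the sketch below (my own, not from the paper) uses a toy geometric distribution as a stand-in, merely to show the definitions of the Shannon entropy, the power sum, and the Tsallis entropy.

```python
# A minimal sketch; a toy geometric distribution stands in for the universal
# probability m, which cannot be computed exactly.
from math import log2

def shannon_entropy(p):
    return -sum(x * log2(x) for x in p if x > 0)

def power_sum(p, q):
    return sum(x ** q for x in p)

def tsallis_entropy(p, q):
    # S_q = (1 - sum_i p_i^q) / (q - 1); recovers the Shannon entropy (in nats) as q -> 1.
    return (1.0 - power_sum(p, q)) / (q - 1.0)

# Toy stand-in distribution: p_i = 2^{-(i+1)} over the first 30 strings.
p = [2.0 ** -(i + 1) for i in range(30)]

print(shannon_entropy(p))      # ~2.0 bits for this truncated geometric distribution
print(power_sum(p, 0.5))       # ~2.41 here; the paper analyzes this quantity for the true m
print(tsallis_entropy(p, 2.0)) # ~0.67
```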

    Properties of optimal prefix-free machines as instantaneous codes

    The optimal prefix-free machine U is a universal decoding algorithm used to define the notion of program-size complexity H(s) for a finite binary string s. Since the set of all halting inputs for U is chosen to form a prefix-free set, the optimal prefix-free machine U can be regarded as an instantaneous code for a noiseless source coding scheme. In this paper, we investigate the properties of optimal prefix-free machines as instantaneous codes. In particular, we investigate the set U^{-1}(s) of codewords associated with a symbol s: the number of codewords in U^{-1}(s) and the distribution of codewords in U^{-1}(s) for each symbol s, using the toolkit of algorithmic information theory.
    Comment: 5 pages, no figures, final manuscript to appear in the Proceedings of the 2010 IEEE Information Theory Workshop, Dublin, Ireland, August 30 - September 3, 2010
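    A genuinely optimal prefix-free machine cannot be tabulated, but the objects the abstract studies, the codeword sets U^{-1}(s) and the Kraft sum over the prefix-free domain, can be illustrated with a toy finite decoder. The sketch below is a hypothetical stand-in of my own, not the machine U of the paper.

```python
# A minimal sketch under strong simplifying assumptions: a tiny hand-made
# prefix-free decoder stands in for U, only to illustrate the codeword sets
# U^{-1}(s) and the Kraft sum.
from collections import defaultdict

# Hypothetical prefix-free machine given as a finite table: input program -> output symbol.
U = {
    "00": "a",
    "01": "a",
    "100": "b",
    "101": "c",
    "110": "b",
}

# U^{-1}(s): all codewords (halting programs) that decode to the symbol s.
preimage = defaultdict(list)
for program, symbol in U.items():
    preimage[symbol].append(program)

for symbol, codewords in preimage.items():
    print(symbol, len(codewords), sorted(codewords))

# Kraft sum over the prefix-free domain; for Chaitin's U this sum is the
# halting probability Omega, and here it is simply <= 1.
kraft = sum(2 ** -len(p) for p in U)
print(kraft)   # 0.875
```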

    An extension of Chaitin's halting probability \Omega to a measurement operator in an infinite dimensional quantum system

    This paper proposes an extension of Chaitin's halting probability \Omega to a measurement operator in an infinite dimensional quantum system. Chaitin's \Omega is defined as the probability that the universal self-delimiting Turing machine U halts, and plays a central role in the development of algorithmic information theory. In the theory, there are two equivalent ways to define the program-size complexity H(s) of a given finite binary string s. In the standard way, H(s) is defined as the length of the shortest input string for U to output s. In the other way, the so-called universal probability m is introduced first, and then H(s) is defined as -log_2 m(s) without reference to the concept of program size. Mathematically, the statistics of outcomes in a quantum measurement are described, in the most general setting, by a positive operator-valued measure (POVM). Based on the theory of computability structures on a Banach space developed by Pour-El and Richards, we extend the universal probability to an analogue of a POVM in an infinite dimensional quantum system, called a universal semi-POVM. We also give another characterization of Chaitin's \Omega numbers by universal probabilities. Based on this characterization, we then propose to define an extension of \Omega as a sum of the POVM elements of a universal semi-POVM, and we discuss the validity of this definition. Finally, we introduce an operator version \hat{H}(s) of H(s) on an infinite dimensional Hilbert space using a universal semi-POVM, and study its properties.
    Comment: 24 pages, LaTeX2e, no figures, accepted for publication in Mathematical Logic Quarterly: the title was slightly changed and a section on an operator-valued algorithmic information theory was added
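    For orientation, the definitions referenced in the abstract can be written out as follows; the subnormalization condition on a semi-POVM is my reading (by analogy with a universal semimeasure), not a statement quoted from the paper.

```latex
% Definitions referenced in the abstract; the semi-POVM normalization is an
% assumption of this summary, not a quotation from the paper.
\begin{align*}
  \Omega &= \sum_{p \in \operatorname{dom} U} 2^{-|p|}
      && \text{(halting probability of the universal machine } U\text{)} \\
  H(s) &= \min\{\, |p| : U(p) = s \,\} = -\log_2 m(s) + O(1)
      && \text{(two equivalent definitions of program-size complexity)} \\
  \{E_s\}_s, \quad E_s &\ge 0, \quad \sum_s E_s \le I
      && \text{(a semi-POVM on an infinite dimensional Hilbert space)} \\
  \hat{\Omega} &= \sum_s E_s
      && \text{(proposed operator extension of } \Omega\text{: a sum of semi-POVM elements)}
\end{align*}
```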

    Fluctuation in e-mail sizes weakens power-law correlations in e-mail flow

    Power-law correlations have been observed in packet flow over the Internet. One possible origin of these correlations is the demand for Internet services. We observe the demand for e-mail services in an organization and analyze correlations in the flow and in the sequence of send requests using Detrended Fluctuation Analysis (DFA). The correlation in the flow is found to be weaker than that in the send requests. Four types of artificial flow are constructed to investigate the effects of fluctuations in e-mail sizes. As a result, we find that the correlation in the flow originates from that in the sequence of send requests. The strength of the power-law correlation decreases as a function of the ratio of the standard deviation of e-mail sizes to their average.
    Comment: 8 pages, 6 figures, accepted by EPJB
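    DFA itself is a standard procedure: integrate the mean-removed series, remove a local polynomial trend in windows of size n, and read the scaling exponent off the slope of log F(n) versus log n. The sketch below is a generic DFA-1 implementation of my own applied to a synthetic series, not the authors' analysis code or data.

```python
# A minimal DFA-1 sketch (linear detrending) applied to a synthetic stand-in
# for an e-mail flow series; not the authors' code.
import numpy as np

def dfa(series, window_sizes):
    """Return the fluctuation function F(n) for each window size n."""
    x = np.asarray(series, dtype=float)
    profile = np.cumsum(x - x.mean())          # integrated, mean-removed series
    fluctuations = []
    for n in window_sizes:
        n_windows = len(profile) // n
        rms = []
        for k in range(n_windows):
            segment = profile[k * n:(k + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, segment, 1), t)   # local linear trend
            rms.append(np.sqrt(np.mean((segment - trend) ** 2)))
        fluctuations.append(np.mean(rms))
    return np.array(fluctuations)

# Synthetic stand-in for a flow series (uncorrelated noise has exponent ~0.5).
rng = np.random.default_rng(0)
flow = rng.poisson(5.0, size=20000)

sizes = np.unique(np.logspace(1, 3, 12).astype(int))
F = dfa(flow, sizes)

# The scaling exponent alpha is the slope of log F(n) vs. log n;
# alpha ~ 0.5 means no long-range correlation, larger alpha means power-law correlation.
alpha = np.polyfit(np.log(sizes), np.log(F), 1)[0]
print(alpha)
```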