The Thermodynamics of Network Coding, and an Algorithmic Refinement of the Principle of Maximum Entropy
The principle of maximum entropy (Maxent) is often used to obtain prior
probability distributions, i.e. as a method for deriving a Gibbs measure
under some restriction, giving the probability that a system will be in a
certain state relative to the other elements of the distribution. Because
classical entropy-based Maxent conflates all distinct degrees of randomness
and pseudo-randomness, here we take into consideration the generative
mechanism of the systems in the ensemble, separating objects that may comply
with the principle under some restriction and whose entropy is maximal, yet
can be generated recursively, from those that are actually algorithmically
random, thereby offering a refinement of classical Maxent. We take advantage
of a causal algorithmic calculus to derive a thermodynamic-like result based
on how difficult it is to reprogram a piece of computer code. Using the
distinction between computable and algorithmic randomness, we quantify the
cost in information loss associated with reprogramming. To illustrate this,
we apply the algorithmic refinement of Maxent to graphs and introduce a
Maximal Algorithmic Randomness Preferential Attachment (MARPA) algorithm, a
generalisation of previous approaches. We discuss the practical implications
of evaluating network randomness. Our analysis provides insight into how the
reprogrammability asymmetry appears to originate from a non-monotonic
relationship to algorithmic probability, and it motivates further analysis
of the origin and consequences of these asymmetries, of reprogrammability,
and of computation.

Comment: 30 pages
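The preferential-attachment idea can be sketched with a compression-based stand-in for algorithmic randomness. This is only an illustrative proxy: the paper's MARPA relies on algorithmic-probability estimates, not zlib, and all function names below are hypothetical.

```python
# Hedged sketch: grow a graph by adding, at each step, the edge that
# maximizes a compression-based proxy for the algorithmic randomness of
# the adjacency matrix. zlib stands in for the uncomputable Kolmogorov
# complexity; MARPA proper uses algorithmic-probability estimates.
import itertools
import zlib

def adjacency_bits(edges, n):
    """Flatten the upper triangle of the adjacency matrix into bytes."""
    bits = ["1" if (i, j) in edges or (j, i) in edges else "0"
            for i, j in itertools.combinations(range(n), 2)]
    return "".join(bits).encode()

def complexity(edges, n):
    """Compressed length as a rough upper-bound proxy for K(G)."""
    return len(zlib.compress(adjacency_bits(edges, n), 9))

def marpa_step(edges, n):
    """Add the single candidate edge that maximizes the complexity proxy."""
    candidates = [e for e in itertools.combinations(range(n), 2)
                  if e not in edges]
    return max(candidates, key=lambda e: complexity(edges | {e}, n))

edges = {(0, 1), (1, 2)}
n = 6
new_edge = marpa_step(edges, n)
edges.add(new_edge)
```

On graphs this small a general-purpose compressor is a very coarse estimator; the sketch only conveys the "attach where randomness grows most" selection rule.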
Algorithmic Thermodynamics
Algorithmic entropy can be seen as a special case of entropy as studied in
statistical mechanics. This viewpoint allows us to apply many techniques
developed for use in thermodynamics to the subject of algorithmic information
theory. In particular, suppose we fix a universal prefix-free Turing machine
and let X be the set of programs that halt for this machine. Then we can regard
X as a set of 'microstates', and treat any function on X as an 'observable'.
For any collection of observables, we can study the Gibbs ensemble that
maximizes entropy subject to constraints on expected values of these
observables. We illustrate this by taking the log runtime, length, and output
of a program as observables analogous to the energy E, volume V and number of
molecules N in a container of gas. The conjugate variables of these observables
allow us to define quantities which we call the 'algorithmic temperature' T,
'algorithmic pressure' P and 'algorithmic potential' mu, since they are
analogous to the temperature, pressure and chemical potential. We derive an
analogue of the fundamental thermodynamic relation dE = T dS - P dV + mu dN,
and use it to study thermodynamic cycles analogous to those for heat engines.
We also investigate the values of T, P and mu for which the partition function
converges. At some points on the boundary of this domain of convergence, the
partition function becomes uncomputable. Indeed, at these points the partition
function itself has nontrivial algorithmic entropy.

Comment: 20 pages, one encapsulated postscript figure
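The Gibbs-ensemble construction over programs can be illustrated on a toy machine. This is not the paper's universal prefix-free machine: the `toy_run` semantics, the parameter values, and all names below are invented for illustration only.

```python
# Toy illustration of the ensemble in the abstract: treat every bit string
# up to a cutoff length as a halting "program" with made-up semantics, and
# sum the Gibbs weights exp(-beta*E - gamma*V - delta*N), where
#   E = log(runtime), V = program length, N = numeric output,
# and beta, gamma, delta are the conjugate variables (1/T, P/T, mu/T up to
# sign conventions).
import itertools
import math

def toy_run(program):
    """Fake semantics: runtime = 1 + number of 1s, output = value in binary."""
    runtime = 1 + sum(program)
    output = int("".join(map(str, program)), 2)
    return runtime, output

def partition_function(beta, gamma, delta, max_len=12):
    Z = 0.0
    for length in range(1, max_len + 1):
        for program in itertools.product((0, 1), repeat=length):
            runtime, output = toy_run(program)
            E, V, N = math.log(runtime), length, output
            Z += math.exp(-beta * E - gamma * V - delta * N)
    return Z

# With gamma large enough, long programs are suppressed and the sum
# stabilizes as max_len grows; with gamma = 0 the all-zeros program at
# every length contributes weight 1, so the sum over lengths diverges.
Z = partition_function(beta=1.0, gamma=1.0, delta=0.01)
```

For a genuine universal machine the sum ranges over all halting programs, so Z is only semicomputable, which is where the convergence and uncomputability questions in the abstract arise.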