The persistent mutual information (PMI) is a complexity measure for
stochastic processes. It is related to well-known complexity measures like
the excess entropy and the statistical complexity. Essentially, it is a
variation of the excess entropy, modified so that it can be interpreted as a
specific measure of a system's internal memory. The PMI was first introduced in
2010 by Ball, Diakonova and MacKay as a measure of (strong) emergence. In this
paper we define the PMI mathematically and investigate its relation to the
excess entropy and the statistical complexity. In particular, we prove that the
excess entropy is an upper bound on the PMI. Furthermore, we establish several
properties of the PMI and calculate it
explicitly for several example processes. We also discuss to what extent it is
a measure of emergence and compare it with alternative approaches used to
formalize emergence.

Comment: 45 pages, excerpt of Diploma-Thesis