The maximum entropy principle (MEP) is a method for obtaining the most likely
distribution functions of observables from statistical systems by maximizing
entropy under constraints. The MEP has found hundreds of applications in
ergodic and Markovian systems in statistical mechanics, information theory, and
statistics.
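For orientation, the classical version of this constrained optimization is the standard textbook sketch below (not specific to this paper; $E_i$ and $U$ are generic placeholders for an observable and its prescribed mean):
\[
  \max_{p}\;\Big\{-\sum_{i=1}^{W} p_i \ln p_i\Big\}
  \quad\text{s.t.}\quad \sum_i p_i = 1,\;\; \sum_i p_i E_i = U
  \qquad\Longrightarrow\qquad
  p_i = \frac{e^{-\beta E_i}}{\sum_j e^{-\beta E_j}},
\]
where $\beta$ is the Lagrange multiplier enforcing the mean-value constraint.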
For several decades there has been an ongoing controversy over whether the
notion of the maximum entropy principle can be extended in a meaningful way to
non-extensive, non-ergodic, and complex statistical systems and processes. In
this paper we start by reviewing how Boltzmann-Gibbs-Shannon entropy is related
to multiplicities of independent random processes.
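As a reminder of this standard relation (for $N$ independent trials over $W$ states with histogram $k=(k_1,\dots,k_W)$), the multiplicity is the multinomial coefficient, and Stirling's approximation gives
\[
  M(k) = \frac{N!}{k_1!\,k_2!\cdots k_W!},
  \qquad
  \frac{1}{N}\log M(k) \;\longrightarrow\; -\sum_{i=1}^{W} p_i \log p_i
  \quad (N\to\infty,\; p_i = k_i/N),
\]
so that maximizing Boltzmann-Gibbs-Shannon entropy amounts to selecting the histogram of largest multiplicity.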
We then show how the relaxation of independence naturally leads to the most
general entropies that are compatible with the first three Shannon-Khinchin
axioms, the (c,d)-entropies.
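Up to multiplicative and additive constants, these entropies take the known form
\[
  S_{c,d}[p] \;\propto\; \sum_{i=1}^{W} \Gamma\!\big(1+d,\; 1-c\ln p_i\big),
\]
where $\Gamma(a,b)$ is the incomplete gamma function and the exponents $(c,d)$ characterize the asymptotic scaling; $(c,d)=(1,1)$ recovers the Boltzmann-Gibbs-Shannon case and $(c,d)=(q,0)$ the Tsallis case.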
We demonstrate that the MEP is a perfectly consistent concept for non-ergodic
and complex statistical systems if their relative entropy can be factored into
a generalized multiplicity and a constraint term. The problem of finding such a
factorization reduces to finding an appropriate representation of relative
entropy in a linear basis.
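Schematically, and in illustrative notation not taken from the paper, this requirement states that the probability of observing a histogram $k$ splits as
\[
  P(k) \;=\; M(k)\, G(k),
  \qquad
  \frac{1}{N}\log P(k) \;=\; \underbrace{\frac{1}{N}\log M(k)}_{\text{generalized multiplicity}}
  \;+\; \underbrace{\frac{1}{N}\log G(k)}_{\text{constraint term}},
\]
so that maximizing the generalized-multiplicity term under the constraints encoded in $G$ again selects the most likely histogram.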
In a particular example we show that path-dependent random processes with
memory naturally require specific generalized entropies. The example is the
first exact derivation of a generalized entropy from the microscopic
properties of a path-dependent random
process.