Many complex systems are characterized by non-Boltzmann distribution
functions of their statistical variables. If one wants, justified or not, to
hold on to the maximum entropy principle for such complex (non-Boltzmann)
statistical systems, we show what the corresponding entropy must look like,
given the form of the observed distribution functions. From two natural
assumptions, (i) that the maximum entropy principle should hold and (ii) that
the entropy should describe the correct thermodynamics of a system producing
non-Boltzmann distributions, the existence of a class of fully consistent
entropies can be deduced. Classical Boltzmann-Gibbs entropy is recovered as
the special case in which the observed distribution is the exponential; Tsallis
entropy is the special case for q-exponential observations.
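
A minimal worked sketch of the variational reasoning behind such a construction (the notation and the trace-form ansatz are assumptions for illustration, not taken verbatim from the paper): maximize a trace-form entropy S[p] = \sum_i s(p_i) under normalization and an energy constraint. The stationarity condition reads

\[
s'(p_i) = \alpha + \beta E_i .
\]

If the observed distribution has the form p_i = f(-\alpha - \beta E_i) for a monotone function f, consistency with the maximum entropy principle forces s'(p) = -f^{-1}(p), i.e.

\[
s(p) = -\int^{p} f^{-1}(p')\, \mathrm{d}p' .
\]

For f = \exp this reproduces, up to constants, the classical Boltzmann-Gibbs entropy; for the q-exponential f = e_q, with e_q(x) = [1 + (1-q)x]^{1/(1-q)}, one obtains a Tsallis-type entropy, the precise value of the index depending on the constraint scheme used:

\[
S_{\mathrm{BG}}[p] = -\sum_i p_i \ln p_i , \qquad
S_q[p] = \frac{1 - \sum_i p_i^{\,q}}{q-1} ,
\]

with S_q \to S_{\mathrm{BG}} in the limit q \to 1.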