Three laws of information theory have been proposed. Labeling, which introduces
nonsymmetry, and formatting, which introduces symmetry, are defined. The function L
(L = ln w, where w is the number of microstates; equivalently, L is the sum of
entropy and information, L = S + I) of the universe is a constant (the first law
of information theory). The
entropy S of the universe tends toward a maximum (the second law of
information theory). For a perfectly symmetric static structure, the information
is zero and the static entropy is at its maximum (the third law of information
theory). Based on the Gibbs inequality and the second law of the revised
information theory, we have proved the similarity principle (a continuous higher
similarity-higher entropy relation after the rejection of the Gibbs paradox)
and proved the Curie-Rosen symmetry principle (a higher symmetry-higher
stability relation) as a special case of the similarity principle. Some
examples from chemical physics have been given. Spontaneous processes of all
kinds, including molecular interaction, phase separation and phase transition,
symmetry breaking, the densest molecular packing, and crystallization, are all
driven by information minimization or, equivalently, symmetry
maximization. The evolution of the universe in general and the evolution of life in
particular can be quantitatively considered as a series of symmetry breaking
processes. The two empirical rules, the similarity rule and the complementarity
rule, have been given a theoretical foundation. All kinds of periodicity in space and
time are symmetries and contribute to stability. Symmetry is beautiful
because it renders stability. However, symmetry is in principle ugly because it
is associated with information loss.

Comment: 29 pages, 14 figures
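In symbols, the three laws stated in the abstract can be sketched as follows
(a restatement using only the definitions given above: w is the number of
microstates, S the entropy, I the information):

```latex
% First law: L of the universe is a constant.
L = \ln w = S + I = \text{const.}
% Second law: the entropy of the universe tends toward a maximum.
\frac{\mathrm{d}S}{\mathrm{d}t} \ge 0
% Third law: for a perfectly symmetric static structure,
% the information vanishes and the static entropy is maximal.
I = 0, \qquad S = S_{\max} = \ln w
```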