This paper extends some geometric properties of a one-parameter family of
relative entropies. These arise as redundancies when cumulants of compressed
lengths are considered instead of expected compressed lengths. These parametric
relative entropies are a generalization of the Kullback-Leibler divergence.
They satisfy the Pythagorean property and behave like squared distances. This
property, which was known for finite alphabet spaces, is now extended to
general measure spaces. Existence of projections onto convex and certain closed
sets is also established. Our results may have applications in the R\'enyi
entropy maximization rule of statistical physics.

Comment: 7 pages, Prop. 5 modified; in Proceedings of the 2011 IEEE International Symposium on Information Theory.
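
For orientation, here is a minimal sketch of the Pythagorean property named above, with notation assumed for illustration rather than taken from the paper. Writing $D(P\|Q)=\int p\log(p/q)\,d\mu$ for the Kullback-Leibler divergence and $\mathcal{I}_\alpha(P,Q)$ for a member of a one-parameter family that reduces to $D(P\|Q)$ as $\alpha \to 1$, the property has the following shape for a convex set $E$ of probability measures, subject to the hypotheses made precise in the paper:

\[
  \mathcal{I}_\alpha(P, Q) \;\ge\; \mathcal{I}_\alpha(P, P^{*}) + \mathcal{I}_\alpha(P^{*}, Q)
  \qquad \text{for every } P \in E,
\]
where $P^{*}$ attains $\min_{R \in E} \mathcal{I}_\alpha(R, Q)$ and is called the projection of $Q$ onto $E$. For $\alpha = 1$ this is Csisz\'ar's classical Pythagorean inequality for the Kullback-Leibler divergence; an inequality of this form is what is meant by the family behaving like squared distances.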