Information geometry and sufficient statistics
Information geometry provides a geometric approach to families of statistical
models. The key geometric structures are the Fisher quadratic form and the
Amari-Chentsov tensor. In statistics, the notion of a sufficient statistic
expresses the criterion for passing from one model to another without loss of
information. This leads to the question of how these geometric structures behave
under sufficient statistics. While this is well understood for finite
sample sizes, the infinite case raises technical problems
concerning the appropriate topologies. Here, we introduce notions of
parametrized measure models and tensor fields on them that exhibit the right
behavior under statistical transformations. Within this framework, we can then
handle the topological issues and show that, within the classes of symmetric
2-tensor fields and 3-tensor fields on statistical models, the Fisher metric and
the Amari-Chentsov tensor are each characterized uniquely (up to a constant) by
their invariance under sufficient statistics, thereby achieving a full
generalization of the original result of Chentsov to infinite sample sizes.
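For orientation, the two structures named above have the following standard
coordinate expressions in the classical finite-dimensional setting (a sketch in
conventional notation; the parametrized densities p(x;θ) and dominating measure
μ are assumed here, not spelled out in the abstract):

```latex
% Fisher metric (symmetric 2-tensor) of a parametrized model p(\cdot\,;\theta):
g_{ij}(\theta) = \int \partial_i \log p(x;\theta)\,
                      \partial_j \log p(x;\theta)\; p(x;\theta)\, d\mu(x)

% Amari-Chentsov tensor (symmetric 3-tensor):
T_{ijk}(\theta) = \int \partial_i \log p(x;\theta)\,
                       \partial_j \log p(x;\theta)\,
                       \partial_k \log p(x;\theta)\; p(x;\theta)\, d\mu(x)
```

Chentsov's theorem states that, on finite sample spaces, these are (up to
scalar multiples) the only tensor fields of their respective types invariant
under sufficient statistics; the paper extends this characterization to the
infinite case.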
More generally, we decompose Markov morphisms between statistical models in
terms of statistics. In particular, a monotonicity result for the Fisher
information naturally follows.
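The monotonicity result can be sketched in standard form (notation assumed for
illustration, not taken from the abstract): for a statistic κ mapping one
sample space to another, the Fisher metric can only decrease under the induced
pushforward of the model,

```latex
% Monotonicity of Fisher information under a statistic \kappa:
g^{\kappa_* p}(V, V) \;\le\; g^{p}(V, V)
\qquad \text{for all tangent vectors } V,
```

with equality for all V precisely when κ is sufficient, recovering the
information-theoretic reading of sufficiency as lossless data reduction.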