
    Minimization Problems Based on Relative $\alpha$-Entropy II: Reverse Projection

    In part I of this two-part work, certain minimization problems based on a parametric family of relative entropies (denoted $\mathscr{I}_{\alpha}$) were studied. Such minimizers were called forward $\mathscr{I}_{\alpha}$-projections. Here, a complementary class of minimization problems leading to the so-called reverse $\mathscr{I}_{\alpha}$-projections is studied. Reverse $\mathscr{I}_{\alpha}$-projections, particularly on log-convex or power-law families, are of interest in robust estimation problems ($\alpha > 1$) and in constrained compression settings ($\alpha < 1$). Orthogonality of the power-law family with an associated linear family is first established and then exploited to turn a reverse $\mathscr{I}_{\alpha}$-projection into a forward $\mathscr{I}_{\alpha}$-projection. The transformed problem is a simpler quasiconvex minimization subject to linear constraints.
    Comment: 20 pages; 3 figures; minor change in the title; revised manuscript. Accepted for publication in IEEE Transactions on Information Theory.
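    As a schematic summary of the two problem classes (under the usual convention, assumed here, that a forward projection minimizes over the first argument on a convex set $\mathbb{E}$, while a reverse projection minimizes over the second argument on a model family $\mathcal{M}$ such as a power-law or log-convex family):

    $$
    \text{forward: } \min_{P \in \mathbb{E}} \mathscr{I}_{\alpha}(P, Q), \qquad \text{reverse: } \min_{Q \in \mathcal{M}} \mathscr{I}_{\alpha}(P, Q).
    $$

    The orthogonality result described above is what allows the reverse problem on a power-law family to be recast as a forward projection problem subject to linear constraints.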

    Minimization Problems Based on Relative $\alpha$-Entropy I: Forward Projection

    Minimization problems with respect to a one-parameter family of generalized relative entropies are studied. These relative entropies, which we term relative $\alpha$-entropies (denoted $\mathscr{I}_{\alpha}$), arise as redundancies under mismatched compression when cumulants of compressed lengths are considered instead of expected compressed lengths. These parametric relative entropies are a generalization of the usual relative entropy (Kullback-Leibler divergence). Just like relative entropy, these relative $\alpha$-entropies behave like squared Euclidean distance and satisfy the Pythagorean property. Minimizers of these relative $\alpha$-entropies on closed and convex sets are shown to exist. Such minimizations generalize the maximum Rényi or Tsallis entropy principle. The minimizing probability distribution (termed forward $\mathscr{I}_{\alpha}$-projection) for a linear family is shown to obey a power-law. Other results in connection with statistical inference, namely subspace transitivity and iterated projections, are also established. In a companion paper, a related minimization problem of interest in robust statistics that leads to a reverse $\mathscr{I}_{\alpha}$-projection is studied.
    Comment: 24 pages; 4 figures; minor change in title; revised version. Accepted for publication in IEEE Transactions on Information Theory.
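    For reference, one standard form of the relative $\alpha$-entropy on a finite alphabet (stated here as a sketch for $\alpha > 0$, $\alpha \neq 1$; the paper gives the precise definition and setting) is

    $$
    \mathscr{I}_{\alpha}(P, Q) \;=\; \frac{\alpha}{1-\alpha} \log \sum_{x} p(x)\, q(x)^{\alpha-1} \;-\; \frac{1}{1-\alpha} \log \sum_{x} p(x)^{\alpha} \;+\; \log \sum_{x} q(x)^{\alpha},
    $$

    which recovers the Kullback-Leibler divergence $D(P \| Q)$ in the limit $\alpha \to 1$, consistent with the generalization described above.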