5 research outputs found


    General view of the ceramic mural that decorates one of the walls of the lobby of the Facultat de Química de la UB. The mural depicts various symbols related to chemistry

    Information Processing Equalities and the Information-Risk Bridge

    We introduce two new classes of measures of information for statistical experiments which generalise and subsume $\phi$-divergences, integral probability metrics, $\mathfrak{N}$-distances (MMD), and $(f,\Gamma)$-divergences between two or more distributions. This enables us to derive a simple geometrical relationship between measures of information and the Bayes risk of a statistical decision problem, thus extending the variational $\phi$-divergence representation to multiple distributions in an entirely symmetric manner. The new families of divergence are closed under the action of Markov operators, which yields an information processing equality that is a refinement and generalisation of the classical data processing inequality. This equality gives insight into the significance of the choice of the hypothesis class in classical risk minimisation.
    Comment: 48 pages; corrected some typos and added a few additional explanations
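As an illustration of one of the quantities the abstract says is subsumed, the maximum mean discrepancy (MMD) between two samples can be estimated with a kernel. A minimal sketch, not taken from the paper: the Gaussian kernel, the bandwidth, and the function names here are illustrative choices.

```python
import math
import random

def rbf(a, b, sigma=1.0):
    # Gaussian (RBF) kernel k(a, b) = exp(-(a - b)^2 / (2 sigma^2))
    return math.exp(-((a - b) ** 2) / (2 * sigma ** 2))

def mmd2(xs, ys, sigma=1.0):
    # Biased estimator of squared MMD between two 1-D samples:
    # mean k(x, x') + mean k(y, y') - 2 * mean k(x, y)
    kxx = sum(rbf(a, b, sigma) for a in xs for b in xs) / (len(xs) ** 2)
    kyy = sum(rbf(a, b, sigma) for a in ys for b in ys) / (len(ys) ** 2)
    kxy = sum(rbf(a, b, sigma) for a in xs for b in ys) / (len(xs) * len(ys))
    return kxx + kyy - 2 * kxy

random.seed(0)
xs = [random.gauss(0, 1) for _ in range(100)]
ys = [random.gauss(3, 1) for _ in range(100)]  # shifted distribution
zs = [random.gauss(0, 1) for _ in range(100)]  # same distribution as xs

print(mmd2(xs, ys))  # clearly positive: the distributions differ
print(mmd2(xs, zs))  # near zero: the distributions agree
```

Because MMD is an integral probability metric, it vanishes exactly when the two distributions coincide, which the two printed values illustrate empirically.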

    Loss functions, complexities, and the Legendre transformation

    The paper introduces a way of reconstructing a loss function from predictive complexity. We show that a loss function and the expectations of the corresponding predictive complexity w.r.t. the Bernoulli distribution are related through the Legendre transformation. It is shown that if two loss functions specify the same complexity, then they are equivalent in a strong sense. The expectations are also related to the so-called generalised entropy.
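The generalised entropy mentioned in the abstract is the infimum of expected loss over predictions. For log loss this infimum is the Shannon entropy of the Bernoulli parameter, which can be checked numerically. A minimal sketch, assuming log loss; the function names and the grid search are illustrative, not from the paper:

```python
import math

def log_loss(outcome, gamma):
    # Log loss of predicting probability gamma for outcome in {0, 1}
    return -math.log(gamma) if outcome == 1 else -math.log(1 - gamma)

def generalised_entropy(p, grid=10000):
    # H(p) = inf over gamma of E_{w ~ Bernoulli(p)}[loss(w, gamma)],
    # approximated here by a grid search over gamma in (0, 1)
    best = float("inf")
    for i in range(1, grid):
        g = i / grid
        risk = p * log_loss(1, g) + (1 - p) * log_loss(0, g)
        best = min(best, risk)
    return best

def shannon_entropy(p):
    # Shannon entropy of a Bernoulli(p) distribution, in nats
    return -p * math.log(p) - (1 - p) * math.log(1 - p)

p = 0.3
print(generalised_entropy(p))  # matches the Shannon entropy below
print(shannon_entropy(p))
```

Log loss is a proper loss, so the minimising prediction is gamma = p itself; the grid search therefore recovers the Shannon entropy up to numerical precision.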