The Bregman chord divergence
Distances are fundamental primitives whose choice significantly impacts the
performance of algorithms in machine learning and signal processing. However,
selecting the most appropriate distance for a given task is a challenging endeavor.
Instead of testing, one by one, the entries of an ever-expanding dictionary of
{\em ad hoc} distances, one rather prefers to consider parametric classes of
distances that are exhaustively characterized by axioms derived from first
principles. Bregman divergences are such a class. However, fine-tuning a Bregman
divergence is delicate, since it requires smoothly adjusting a functional
generator. In this work, we propose an extension of Bregman divergences called
the Bregman chord divergences. This new class of distances does not require
gradient calculations, uses two scalar parameters that can be easily tailored
in applications, and asymptotically generalizes Bregman divergences.
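
For concreteness, here is a minimal sketch of the construction this description suggests; the exact parameterization of the chord is our assumption, not quoted from the paper. Writing $(pq)_\gamma := (1-\gamma)p + \gamma q$ for the point at position $\gamma$ on the segment joining $p$ and $q$, the ordinary Bregman divergence measures the vertical gap at $p$ between the graph of the convex generator $F$ and its tangent hyperplane at $q$:
$$B_F(p:q) \;=\; F(p) - F(q) - \langle p - q,\, \nabla F(q)\rangle .$$
Replacing that tangent by the chord of the graph passing through the two interpolated points $\big((pq)_\alpha, F((pq)_\alpha)\big)$ and $\big((pq)_\beta, F((pq)_\beta)\big)$, with scalars $0 < \alpha < \beta \le 1$, and again measuring the vertical gap at $p$ gives a gradient-free quantity:
$$B_F^{\alpha,\beta}(p:q) \;=\; F(p) - F\big((pq)_\alpha\big) + \frac{\alpha}{\beta-\alpha}\Big(F\big((pq)_\beta\big) - F\big((pq)_\alpha\big)\Big).$$
Under this reading, only three evaluations of $F$ appear (no gradient), $\alpha$ and $\beta$ are the two easily tailored scalar parameters, and as $\alpha, \beta \to 1$ the chord tends to the tangent hyperplane at $q$, recovering $B_F(p:q)$ in the limit. Non-negativity follows from convexity: outside the interval $[\alpha,\beta]$, the graph of $F$ lies above the extended chord, so the gap at $p$ (position $\gamma = 0$) is nonnegative.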