Proximity Operators of Discrete Information Divergences
Information divergences allow one to assess how close two distributions are
to each other. Among the large panel of available measures, special
attention has been paid to convex φ-divergences, such as the
Kullback-Leibler, Jeffreys-Kullback, Hellinger, Chi-Square, Rényi, and
Iα divergences. While φ-divergences have been extensively
studied in convex analysis, their use in optimization problems often remains
challenging. In this regard, one of the main shortcomings of existing methods
is that the minimization of φ-divergences is usually performed with
respect to one of their arguments, possibly within alternating optimization
techniques. In this paper, we overcome this limitation by deriving new
closed-form expressions for the proximity operator of such two-variable
functions. This makes it possible to employ standard proximal methods for
efficiently solving a wide range of convex optimization problems involving
φ-divergences. In addition, we show that these proximity operators are
useful to compute the epigraphical projection of several functions of practical
interest. The proposed proximal tools are numerically validated in the context
of optimal query execution within database management systems, where the
problem of selectivity estimation plays a central role. Experiments are carried
out on small- to large-scale scenarios.
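
For reference, the central object of the abstract admits a compact worked
definition. The block below states the standard Moreau proximity operator
applied to a two-variable function D, together with the generalized
Kullback-Leibler divergence as a representative convex φ-divergence; this
is the usual convention, not notation quoted from the paper itself.

    \[
      \operatorname{prox}_{\gamma D}(x,y)
        = \operatorname*{arg\,min}_{(u,v)}\;
          D(u,v) + \frac{1}{2\gamma}\bigl(\|u-x\|^2 + \|v-y\|^2\bigr),
        \qquad \gamma > 0,
    \]
    \[
      D_{\mathrm{KL}}(u,v)
        = \sum_{i=1}^{n}\Bigl(u_i \ln\frac{u_i}{v_i} - u_i + v_i\Bigr),
        \qquad u, v \in [0,+\infty[^{n}.
    \]

Minimizing jointly over the pair (u, v), rather than over one argument with
the other held fixed, is precisely what the closed-form expressions
announced in the abstract make possible.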
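
The epigraphical projection mentioned in the abstract can likewise be
spelled out. For a proper lower-semicontinuous convex function φ, a
standard definition (again not quoted from the paper) is:

    \[
      \operatorname{epi}\varphi = \{(u,\zeta) : \varphi(u) \le \zeta\},
      \qquad
      P_{\operatorname{epi}\varphi}(x,\xi)
        = \operatorname*{arg\,min}_{(u,\zeta)\in\operatorname{epi}\varphi}
          \|u-x\|^2 + (\zeta-\xi)^2.
    \]

Such projections let constraints of the form φ(u) ≤ ζ be enforced directly
inside proximal splitting algorithms, which is why being able to compute
them from the new proximity operators is of practical interest.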
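
As a concrete numerical illustration, the sketch below computes the
proximity operator of the generalized Kullback-Leibler divergence with
respect to its first argument only, the second being held fixed; this
classical single-variable case has a well-known closed form via the
Lambert W function. The function name prox_kl_first_arg and the test
values are illustrative assumptions, and the paper's joint two-variable
formulas are not reproduced here.

    import numpy as np
    from scipy.special import lambertw

    def prox_kl_first_arg(x, y, gamma=1.0):
        # Prox of u -> gamma * (u*log(u/y) - u + y), the generalized KL
        # divergence in its FIRST argument, with y > 0 held fixed.
        # Stationarity of  gamma*(u*log(u/y) - u + y) + (u - x)**2 / 2
        # reads  gamma*log(u/y) + u - x = 0,  whose solution is
        #     u = gamma * W((y/gamma) * exp(x/gamma)),
        # with W the principal branch of the Lambert W function.
        # (exp(x/gamma) may overflow for large x/gamma; a robust
        # version would work in log-space.)
        x = np.asarray(x, dtype=float)
        y = np.asarray(y, dtype=float)
        z = (y / gamma) * np.exp(x / gamma)
        return gamma * lambertw(z).real  # principal branch is real here

    if __name__ == "__main__":
        # Sanity check against brute-force one-dimensional minimization.
        from scipy.optimize import minimize_scalar
        x, y, gamma = 0.3, 2.0, 0.5
        f = lambda u: gamma * (u * np.log(u / y) - u + y) \
            + 0.5 * (u - x) ** 2
        ref = minimize_scalar(f, bounds=(1e-9, 10.0), method="bounded").x
        print(prox_kl_first_arg(x, y, gamma), ref)  # values should agree

The joint prox of the abstract generalizes this idea: instead of one scalar
stationarity equation in u, a coupled system in (u, v) is solved in closed
form, which is what lets standard proximal methods treat both arguments of
the divergence as unknowns.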