Entropy on Spin Factors
Recently it has been demonstrated that the Shannon entropy and the von Neumann entropy are the only entropy functions that generate a local Bregman divergence as long as the state space has rank 3 or higher. In this paper we study the properties of Bregman divergences for convex bodies of rank 2. The two most important convex bodies of rank 2 can be identified with the bit and the qubit. We demonstrate that if a convex body of rank 2 has a Bregman divergence that satisfies sufficiency, then the convex body is spectral, and if the Bregman divergence is monotone, then the convex body has the shape of a ball. A ball can be represented as the state space of a spin factor, which is the simplest type of Jordan algebra. We also study the existence of recovery maps for Bregman divergences on spin factors. In general, convex bodies of rank 2 appear as faces of state spaces of higher rank. Therefore our results give strong restrictions on which convex bodies could be the state space of a physical system with a well-behaved entropy function.
Comment: 30 pages, 6 figures
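For context, the standard way an entropy function generates a Bregman divergence, and the two classical examples behind the rank-3-or-higher result quoted above, can be written as follows (standard definitions; the notation is ours, and the paper's sign and locality conventions may differ):

    % Bregman divergence generated by a strictly convex, differentiable function f:
    D_f(x, y) = f(x) - f(y) - \langle \nabla f(y),\, x - y \rangle
    % Negative Shannon entropy f(p) = \sum_i p_i \log p_i on the probability simplex
    % gives the Kullback-Leibler divergence:
    D_f(p, q) = \sum_i p_i \log (p_i / q_i)
    % Negative von Neumann entropy f(\rho) = \operatorname{Tr}(\rho \log \rho) on density
    % matrices gives the Umegaki quantum relative entropy:
    D_f(\rho, \sigma) = \operatorname{Tr}(\rho \log \rho - \rho \log \sigma)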
Topological Data Analysis with Bregman Divergences
Given a finite set in a metric space, topological data analysis generalizes
hierarchical clustering using a 1-parameter family of homology groups to
quantify connectivity in all dimensions. The connectivity is compactly
described by the persistence diagram. One limitation of the current framework
is the reliance on metric distances, whereas in many practical applications
objects are compared by non-metric dissimilarity measures. Examples are the
Kullback-Leibler divergence, which is commonly used for comparing text and
images, and the Itakura-Saito divergence, popular for speech and sound. These
are two members of the broad family of dissimilarities called Bregman
divergences.
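As a concrete illustration of the last sentence, the following minimal Python sketch (our own, not code from the paper) evaluates the Kullback-Leibler and Itakura-Saito divergences from their standard convex generators via the general Bregman formula:

    import numpy as np

    def bregman(F, gradF, x, y):
        """Bregman divergence D_F(x, y) = F(x) - F(y) - <grad F(y), x - y>."""
        return F(x) - F(y) - np.dot(gradF(y), x - y)

    # Generator F(x) = sum(x log x - x) yields the (generalized) Kullback-Leibler divergence.
    F_kl = lambda x: np.sum(x * np.log(x) - x)
    g_kl = lambda x: np.log(x)

    # Generator F(x) = -sum(log x) yields the Itakura-Saito divergence.
    F_is = lambda x: -np.sum(np.log(x))
    g_is = lambda x: -1.0 / x

    x = np.array([0.2, 0.3, 0.5])
    y = np.array([0.4, 0.4, 0.2])
    print(bregman(F_kl, g_kl, x, y))   # equals sum(x*log(x/y) - x + y)
    print(bregman(F_is, g_is, x, y))   # equals sum(x/y - log(x/y) - 1)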
We show that the framework of topological data analysis can be extended to
general Bregman divergences, widening the scope of possible applications. In
particular, we prove that appropriately generalized Čech and Delaunay (alpha)
complexes capture the correct homotopy type, namely that of the corresponding
union of Bregman balls. Consequently, their filtrations give the correct
persistence diagram, namely the one generated by the uniformly growing Bregman
balls. Moreover, we show that, unlike in the metric setting, the filtration of Vietoris-Rips complexes may fail to approximate the persistence diagram. We propose algorithms to compute these generalized Čech, Vietoris-Rips, and Delaunay complexes and experimentally test their efficiency. Lastly, we explain their surprisingly good performance by making a connection with discrete Morse theory.
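To make the generalized Čech filtration value concrete, here is a small numerical sketch (ours, not the paper's algorithm) that estimates the smallest radius at which Bregman balls around the vertices of a simplex share a common point; placing the data point in the first argument of the divergence is an assumed convention, and the paper's definitions and algorithms may differ:

    import numpy as np
    from scipy.optimize import minimize

    def kl(p, q):
        """Generalized Kullback-Leibler divergence, a Bregman divergence."""
        return np.sum(p * np.log(p / q) - p + q)

    def cech_radius(points):
        """Smallest r such that the Bregman balls {x : D(p, x) <= r} around all
        points p have a common point (one possible convention; an assumption)."""
        def worst_case(u):                       # candidate common point x = exp(u) > 0
            x = np.exp(u)
            return max(kl(p, x) for p in points)
        u0 = np.log(np.mean(points, axis=0))     # start from the arithmetic mean
        res = minimize(worst_case, u0, method="Nelder-Mead")
        return res.fun

    pts = np.array([[0.2, 0.8], [0.5, 0.5], [0.7, 0.3]])
    print(cech_radius(pts))   # value at which this triangle would enter the complex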
A directed isoperimetric inequality with application to Bregman near neighbor lower bounds
Bregman divergences are a class of divergences parametrized by a convex function $\phi$ and include well-known distance functions like the squared Euclidean distance $\ell_2^2$ and the Kullback-Leibler divergence. There has been extensive research on algorithms for problems like clustering and near neighbor search with respect to Bregman divergences; in all cases, the algorithms depend not just on the data size $n$ and dimensionality $d$, but also on a structure constant $\mu$ that depends solely on $\phi$ and can grow without bound, independently of $n$ and $d$.
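The structure constant is not defined in this abstract; as one illustration of how a constant that depends only on the generator can blow up, the short Python sketch below (our own) computes a simple asymmetry ratio for the Kullback-Leibler divergence and shows it growing without bound near the boundary of the domain. The constant $\mu$ used in the paper may be defined differently:

    import numpy as np

    def kl(p, q):
        """Generalized Kullback-Leibler divergence (a Bregman divergence)."""
        return np.sum(p * np.log(p / q) - p + q)

    # Asymmetry ratio max(D(x,y)/D(y,x), D(y,x)/D(x,y)): one simple way to
    # quantify how far a Bregman divergence is from a symmetric distance.
    # For KL it blows up as a point approaches the boundary of the simplex.
    y = np.array([0.5, 0.5])
    for eps in [1e-1, 1e-2, 1e-4, 1e-8]:
        x = np.array([eps, 1.0 - eps])
        a, b = kl(x, y), kl(y, x)
        print(f"eps={eps:g}  asymmetry ratio={max(a / b, b / a):.2f}")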
In this paper, we provide the first evidence that this dependence on $\mu$ might be intrinsic. We focus on the problem of approximate near neighbor search
for Bregman divergences. We show that under the cell probe model, any non-adaptive data structure (like locality-sensitive hashing) for $c$-approximate near-neighbor search that admits $r$ probes must use space $\Omega\bigl(n^{1+\mu/(c r)}\bigr)$. In contrast, for LSH under $\ell_1$ the best bound is $\Omega\bigl(n^{1+1/(c r)}\bigr)$.
Our new tool is a directed variant of the standard boolean noise operator. We
show that a generalization of the Bonami-Beckner hypercontractivity inequality
exists "in expectation" or upon restriction to certain subsets of the Hamming
cube, and that this is sufficient to prove the desired isoperimetric inequality
that we use in our data structure lower bound.
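For reference, the standard Boolean noise operator and the Bonami-Beckner inequality that this paragraph generalizes can be stated as follows (standard formulation; the directed variant introduced in the paper is not reproduced here):

    % Noise operator on f : {-1,1}^n -> R: y ~ N_\rho(x) keeps each coordinate of x
    % with probability (1+\rho)/2 and flips it with probability (1-\rho)/2.
    (T_\rho f)(x) = \mathbb{E}_{y \sim N_\rho(x)}\,[\, f(y) \,]
    % Bonami-Beckner hypercontractivity: for 1 \le p \le q and 0 \le \rho \le \sqrt{(p-1)/(q-1)},
    \| T_\rho f \|_q \le \| f \|_p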
We also present a structural result reducing the Hamming cube to a Bregman cube. This structure allows us to obtain lower bounds for problems under Bregman divergences from their $\ell_1$ analog. In particular, we get a (weaker) lower bound for approximate near neighbor search for an $r$-query non-adaptive data structure, and new cell probe lower bounds for a number of other near neighbor questions in Bregman space.
Comment: 27 pages
Adaptive Mixture Methods Based on Bregman Divergences
We investigate adaptive mixture methods that linearly combine outputs of
constituent filters running in parallel to model a desired signal. We use
"Bregman divergences" and obtain certain multiplicative updates to train the
linear combination weights under an affine constraint or without any
constraints. We use unnormalized relative entropy and relative entropy to
define two different Bregman divergences that produce an unnormalized
exponentiated gradient update and a normalized exponentiated gradient update on
the mixture weights, respectively. We then carry out the mean and the
mean-square transient analysis of these adaptive algorithms when they are used
to combine outputs of constituent filters. We illustrate the accuracy of
our results and demonstrate the effectiveness of these updates for sparse
mixture systems.
Comment: Submitted to Digital Signal Processing, Elsevier
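The exponentiated gradient updates mentioned above have a well-known generic form; the Python sketch below (our own, with an invented step size, signal, and filters) applies a normalized exponentiated gradient update to two combination weights driven by the instantaneous squared error. The paper's exact updates, including the affine-constrained and unnormalized variants, may differ in details:

    import numpy as np

    rng = np.random.default_rng(0)
    T, m = 2000, 2                      # samples, number of constituent filters
    eta = 0.05                          # learning rate (illustrative choice)

    # Outputs of two hypothetical constituent filters and the desired signal.
    d = rng.standard_normal(T)
    U = np.column_stack([0.9 * d + 0.1 * rng.standard_normal(T),
                         0.4 * d + 0.6 * rng.standard_normal(T)])

    w = np.full(m, 1.0 / m)             # mixture weights, kept on the simplex
    for t in range(T):
        y = w @ U[t]                    # combined output
        e = d[t] - y                    # instantaneous error
        # Normalized exponentiated gradient (EG) update: multiplicative step in the
        # negative gradient direction of the squared error, then renormalization.
        w = w * np.exp(2 * eta * e * U[t])
        w = w / w.sum()

    print(w)                            # weight mass should favor the better filter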