We propose a new approach for learning
a sparse graphical model approximation to
a specified multivariate probability distribution
(such as the empirical distribution
of sample data). The selection of sparse
graph structure arises naturally in our approach
through the solution of a convex optimization
problem, which differentiates our
method from standard combinatorial approaches.
We seek the maximum entropy relaxation
(MER) within an exponential family,
which maximizes entropy subject to constraints
that marginal distributions on small
subsets of variables are close to the prescribed
marginals in relative entropy. To solve MER,
we present a modified primal-dual interior
point method that exploits sparsity of the
Fisher information matrix in models defined
on chordal graphs. This leads to a tractable,
scalable approach provided the level of relaxation
in MER is sufficient to obtain a thin
graph. The merits of our approach are investigated
by recovering the structure of some
simple graphical models from sample data.
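The entropy-versus-fidelity tradeoff at the heart of MER can be illustrated on a much simpler analogue: maximize the entropy of a single discrete distribution p subject to D(p || q) <= delta for a prescribed distribution q. (The actual MER problem instead constrains marginals on small subsets of variables; this sketch, with illustrative function names not taken from the paper, only shows how the relaxation level delta trades entropy against closeness to the prescribed distribution.) Lagrangian stationarity gives a solution of the form p_i ∝ q_i^(λ/(1+λ)), a geometric path from uniform (λ = 0) to q (λ → ∞), so the active constraint can be found by bisection on λ:

```python
import math

def kl(p, q):
    """Relative entropy D(p || q) for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def entropy(p):
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def max_entropy_relaxation(q, delta, tol=1e-10):
    """Toy single-distribution analogue of MER:
    maximize H(p) subject to D(p || q) <= delta.
    Stationarity gives p_i proportional to q_i**(lam/(1+lam)),
    so we bisect on the multiplier lam until the constraint is active."""
    def project(lam):
        # Geometric interpolation between uniform (lam = 0) and q (lam -> inf).
        a = lam / (1.0 + lam)
        w = [qi ** a for qi in q]
        z = sum(w)
        return [wi / z for wi in w]

    # If the uniform (max-entropy) distribution is already feasible,
    # the relaxation is inactive and uniform is optimal.
    n = len(q)
    uniform = [1.0 / n] * n
    if kl(uniform, q) <= delta:
        return uniform

    # D(project(lam) || q) decreases monotonically in lam, so bisect.
    lo, hi = 0.0, 1.0
    while kl(project(hi), q) > delta:   # grow hi until feasible
        hi *= 2.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if kl(project(mid), q) > delta:
            lo = mid
        else:
            hi = mid
    return project(hi)

# Prescribed (e.g. empirical) distribution and a relaxation level delta.
q = [0.7, 0.1, 0.1, 0.1]
p = max_entropy_relaxation(q, delta=0.05)
```

Larger delta moves the solution toward the uniform (maximum-entropy) distribution; delta = 0 recovers q exactly. In the full MER problem the same tradeoff governs how many conditional-independence relations, and hence how sparse a graph, the relaxed solution exhibits.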