Bayesian Learning of Sum-Product Networks
Sum-product networks (SPNs) are flexible density estimators and have received
significant attention due to their attractive inference properties. While
parameter learning in SPNs is well developed, structure learning leaves
something to be desired: even though there is a plethora of SPN structure
learners, most of them are somewhat ad hoc and based on intuition rather than a
clear learning principle. In this paper, we introduce a well-principled
Bayesian framework for SPN structure learning. First, we decompose the problem
into i) laying out a computational graph, and ii) learning the so-called scope
function over the graph. The first is rather unproblematic and akin to neural
network architecture validation. The second represents the effective structure
of the SPN and needs to respect the usual structural constraints in SPNs, i.e.
completeness and decomposability. While representing and learning the scope
function is somewhat involved in general, in this paper, we propose a natural
parametrisation for an important and widely used special case of SPNs. These
structural parameters are incorporated into a Bayesian model, such that
simultaneous structure and parameter learning is cast into monolithic Bayesian
posterior inference. In various experiments, our Bayesian SPNs often improve
test likelihoods over greedy SPN learners. Further, since the Bayesian
framework protects against overfitting, we can evaluate hyper-parameters
directly on the Bayesian model score, waiving the need for a separate
validation set, which is especially beneficial in low-data regimes. Bayesian
SPNs can be applied to heterogeneous domains and can easily be extended to
nonparametric formulations. Moreover, our Bayesian approach is the first that
consistently and robustly learns SPN structures under missing data.
Comment: NeurIPS 2019; see conference page for supplementary material.
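To make the two structural constraints concrete: a sum node is complete when all its children share the same scope, and a product node is decomposable when its children have pairwise disjoint scopes. The following minimal Python sketch is our own illustration, not the authors' code; all class and function names are assumptions. It builds an SPN as a DAG and validates both conditions via the scope function.

```python
# Minimal sketch, not the authors' code: an SPN as a DAG of sum and product
# nodes over univariate leaves. All class and function names are our own
# illustrative assumptions.
from dataclasses import dataclass, field
from typing import List, Set

@dataclass
class Node:
    children: List["Node"] = field(default_factory=list)

@dataclass
class Leaf(Node):
    var: int = 0  # index of the random variable this leaf models

@dataclass
class Sum(Node):
    weights: List[float] = field(default_factory=list)

@dataclass
class Product(Node):
    pass

def scope(node: Node) -> Set[int]:
    """Scope = the set of variables a node's distribution ranges over."""
    if isinstance(node, Leaf):
        return {node.var}
    return set().union(*(scope(c) for c in node.children))

def is_valid(node: Node) -> bool:
    """Completeness: all children of a sum share one scope.
    Decomposability: children of a product have disjoint scopes."""
    if isinstance(node, Leaf):
        return True
    child_scopes = [scope(c) for c in node.children]
    if isinstance(node, Sum) and any(s != child_scopes[0] for s in child_scopes):
        return False
    if isinstance(node, Product) and \
            len(set().union(*child_scopes)) != sum(map(len, child_scopes)):
        return False
    return all(is_valid(c) for c in node.children)

# A complete sum over two decomposable products on variables {0, 1}.
spn = Sum(children=[Product(children=[Leaf(var=0), Leaf(var=1)]),
                    Product(children=[Leaf(var=0), Leaf(var=1)])],
          weights=[0.3, 0.7])
print(is_valid(spn))  # True
```

In this reading, learning the scope function amounts to choosing the `var` assignments of the leaves (and hence all internal scopes) subject to `is_valid` holding throughout the graph.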
Constitutional Aspect of the Debate over Jay's Treaty
This study provides an in-depth account of the constitutional questions debated during the controversy over Jay's Treaty in 1795-1796. The primary objective is to show the significance of this aspect of the struggle over the treaty. The author argues that previous discussions of this formative event in the development of the American party system have tended to overlook the vital role of the split over constitutional interpretation. The conviction, shared by Federalists and Republicans alike, that the other side were "anarchists" or "monarchists" was greatly reinforced during this debate, when each thought that the other sought the destruction of the Constitution. From this, the author further conjectures that the polarization produced by the debate's bitter atmosphere tended to stagnate the operation of a mature two-party system. Finally, the study aims to show that the Federalists considered themselves to be as much "republicans" as the Jeffersonian Republicans.
Learning Deep Mixtures of Gaussian Process Experts Using Sum-Product Networks
While Gaussian processes (GPs) are the method of choice for regression tasks,
they also come with practical difficulties, as inference cost scales cubically
in time and quadratically in memory. In this paper, we introduce a natural and
expressive way to tackle these problems, by incorporating GPs in sum-product
networks (SPNs), a recently proposed tractable probabilistic model allowing
exact and efficient inference. In particular, by using GPs as leaves of an SPN
we obtain a novel flexible prior over functions, which implicitly represents an
exponentially large mixture of local GPs. Exact and efficient posterior
inference in this model can be done in a natural interplay of the inference
mechanisms in GPs and SPNs. Thereby, as in a mixture-of-experts approach, each
GP is responsible only for a subset of data points, which effectively reduces
inference cost in a divide-and-conquer fashion. We show
that integrating GPs into the SPN framework leads to a promising probabilistic
regression model which (1) is computationally and memory efficient, (2) allows
exact and efficient posterior inference, (3) is flexible enough to mix
different kernel functions, and (4) naturally accounts for non-stationarities
in time series. In a variety of experiments, we show that the SPN-GP model can
learn input-dependent parameters and hyper-parameters and is on par with or
outperforms traditional GPs as well as state-of-the-art approximations on
real-world data.
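To illustrate the divide-and-conquer cost argument, here is a toy Python sketch. It is an assumption-laden simplification, not the SPN-GP model itself: it hard-partitions the inputs and fits one exact GP per block, whereas the paper's model mixes local GPs through SPN sum nodes, and every name below is our own. Fitting per block replaces a single O(n^3) Cholesky factorisation with k factorisations of O((n/k)^3) each.

```python
# Toy sketch of the divide-and-conquer argument, not the SPN-GP model itself:
# we hard-partition the inputs and fit one exact GP per block, whereas the
# paper's model mixes local GPs through SPN sum nodes. All names are our own.
import numpy as np

def rbf(x1, x2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel for 1-D inputs."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def fit_local_gps(x, y, n_blocks, noise=1e-2):
    """One Cholesky per block: k * O((n/k)^3) instead of one O(n^3) solve."""
    order = np.argsort(x)
    models = []
    for idx in np.array_split(order, n_blocks):
        xb, yb = x[idx], y[idx]
        K = rbf(xb, xb) + noise * np.eye(len(xb))
        L = np.linalg.cholesky(K)
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, yb))  # K^{-1} y
        models.append((xb, alpha))
    return models

def predict_mean(models, x_star):
    """Route each test point to the block with the nearest input centre."""
    centres = np.array([xb.mean() for xb, _ in models])
    preds = np.empty_like(x_star)
    for i, xs in enumerate(x_star):
        xb, alpha = models[int(np.argmin(np.abs(centres - xs)))]
        preds[i] = (rbf(np.array([xs]), xb) @ alpha).item()
    return preds

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 10.0, 400)
y = np.sin(x) + 0.1 * rng.standard_normal(400)
models = fit_local_gps(x, y, n_blocks=8)
print(predict_mean(models, np.linspace(0.0, 10.0, 5)))
```

With 400 points and 8 blocks, each block factorises a 50x50 kernel matrix rather than one 400x400 matrix, which is the cost reduction the abstract refers to; the SPN additionally sums over many such partitions rather than committing to one.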