Comparing models of symbolic music using probabilistic grammars and probabilistic programming
We conduct a systematic comparison of several probabilistic
models of symbolic music, including zeroth- and first-order
Markov models over pitches and intervals, a hidden Markov
model over pitches, and a probabilistic context-free grammar
with two parameterisations, all implemented uniformly
using a probabilistic programming language (PRISM). This
allows us to take advantage of variational Bayesian methods
for learning parameters and assessing the goodness of fit of
the models in a principled way. When applied to a corpus
of Bach chorales and the Essen folk song collection, we
show that, depending on various parameters, the probabilistic
grammars sometimes, but not always, out-perform the
simple Markov models. Looking for evidence of over-fitting
of complex models to small datasets, we find that
even the smallest dataset is sufficient to support the richest
parameterisation of the probabilistic grammars. However,
examining how the models perform on smaller subsets of
pieces, we find that the simpler Markov models do indeed
out-perform the best grammar-based model at the small end
of the scale.
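
The first-order Markov model over pitches named above can be sketched as follows. This is an illustrative Python sketch, not the paper's PRISM implementation, and the additive smoothing shown here is an assumption standing in for the paper's variational Bayesian parameter learning; the function names are hypothetical.

```python
from collections import defaultdict
import math

def train_first_order(sequences, smoothing=1.0):
    """Estimate pitch-to-pitch transition probabilities from a corpus
    of melodies (each a list of MIDI pitch numbers), using simple
    additive smoothing over the observed pitch alphabet."""
    counts = defaultdict(lambda: defaultdict(float))
    alphabet = set()
    for seq in sequences:
        alphabet.update(seq)
        for a, b in zip(seq, seq[1:]):  # consecutive pitch pairs
            counts[a][b] += 1.0
    alphabet = sorted(alphabet)
    probs = {}
    for a in alphabet:
        total = sum(counts[a].values()) + smoothing * len(alphabet)
        probs[a] = {b: (counts[a][b] + smoothing) / total
                    for b in alphabet}
    return probs

def log_likelihood(seq, probs):
    """Log-probability of a melody under the transition model,
    used as a goodness-of-fit score for a held-out piece."""
    return sum(math.log(probs[a][b]) for a, b in zip(seq, seq[1:]))
```

A zeroth-order model would drop the conditioning pitch and use a single distribution over the alphabet; an interval-based variant would apply the same machinery to pitch differences rather than absolute pitches.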