The Dirichlet distribution is a statistical model that is deeply entrenched in the theory and practice of learning and reasoning in Bayesian networks. While it is mathematically convenient in certain situations, it poses computational challenges in others, such as in Bayesian parameter learning under incomplete data. In this paper, we consider a discretized variant of the continuous Dirichlet distribution. We first show how one can model such a distribution compactly as a Bayesian network, which can then be used as a submodel in other Bayesian networks. We analyze some of its theoretical properties, relating the discrete variant to the original continuous density. We further show how to represent this model and perform exact inference in it efficiently. We finally discuss some implications that a discrete Dirichlet model may have for enabling the design of more sophisticated models, and for enabling new ways to reason about them.
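To make the notion of a discretized Dirichlet concrete, the following is a minimal sketch (not the paper's construction) of one natural discretization: place a uniform grid over the probability simplex, evaluate the unnormalized Dirichlet density at each interior grid point, and renormalize. The function name `discrete_dirichlet` and the grid-based scheme are illustrative assumptions, not notation from the paper.

```python
import itertools
import math

def discrete_dirichlet(alpha, grid_size):
    """Discretize a Dirichlet(alpha) density onto a grid over the simplex.

    Illustrative assumption: support points are vectors whose components
    are positive multiples of 1/grid_size summing to 1; each point's
    probability comes from the Dirichlet density, renormalized over the grid.
    """
    k = len(alpha)
    support = []
    weights = []
    # Enumerate integer compositions of grid_size into k positive parts,
    # i.e. grid points strictly inside the simplex.
    for comp in itertools.product(range(1, grid_size), repeat=k - 1):
        rest = grid_size - sum(comp)
        if rest >= 1:
            point = tuple(c / grid_size for c in comp) + (rest / grid_size,)
            support.append(point)
            # Unnormalized Dirichlet density: prod_i theta_i^(alpha_i - 1)
            weights.append(math.prod(t ** (a - 1) for t, a in zip(point, alpha)))
    z = sum(weights)
    return support, [w / z for w in weights]

# Example: a symmetric Dirichlet(2, 2) on a coarse 3-point grid over [0, 1].
support, probs = discrete_dirichlet([2.0, 2.0], grid_size=4)
```

Note that the grid grows combinatorially with the number of dimensions; the compact Bayesian-network representation discussed in the paper is precisely what avoids enumerating this support explicitly.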