
Information-Theoretic Distribution Test with Application to Normality

Abstract

We derive general distribution tests based on the method of maximum entropy density estimation. The proposed tests are obtained by maximizing the differential entropy subject to moment constraints. By exploiting the equivalence between the maximum entropy and maximum likelihood estimates of the general exponential family, we can use the conventional Likelihood Ratio, Wald and Lagrange Multiplier testing principles within the maximum entropy framework. In particular, we use the Lagrange Multiplier method to derive tests for normality and establish their asymptotic properties. Monte Carlo evidence suggests that the proposed tests have desirable small sample properties and often outperform commonly used tests such as the Jarque-Bera test and the Kolmogorov-Smirnov-Lilliefors test for normality. We show that the proposed tests extend in a straightforward manner to tests based on regression residuals and to non-iid data. We apply the proposed tests to the residuals from a stochastic production frontier model and reject the normality hypothesis.
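To illustrate the flavor of a moment-based Lagrange Multiplier normality test of the kind the abstract describes, the sketch below (in Python) computes the LM statistic built from the third- and fourth-moment constraints, which in this special case coincides with the Jarque-Bera statistic. This is not the paper's actual test, which rests on the maximum entropy density with possibly different moment constraints; the function name and the example data are purely illustrative assumptions.

```python
import numpy as np
from scipy import stats

def moment_lm_normality_test(x):
    """LM-type normality test from skewness/kurtosis moment constraints.

    Under H0 (normality) the statistic is asymptotically chi-squared
    with 2 degrees of freedom; this special case is the Jarque-Bera test.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    z = (x - x.mean()) / x.std(ddof=0)   # standardize with the MLE scale
    skew = np.mean(z**3)                  # sample skewness (normal value: 0)
    kurt = np.mean(z**4)                  # sample kurtosis (normal value: 3)
    lm = n * (skew**2 / 6.0 + (kurt - 3.0)**2 / 24.0)
    p_value = stats.chi2.sf(lm, df=2)
    return lm, p_value

# Usage: reject normality at the 5% level if the p-value is below 0.05.
rng = np.random.default_rng(0)
sample = rng.exponential(size=200)        # clearly non-normal data
lm, p = moment_lm_normality_test(sample)
print(f"LM statistic = {lm:.2f}, p-value = {p:.4f}")
```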
