
A modified Kolmogorov-Smirnov test for normality

Abstract

In this paper we propose an improvement to the Kolmogorov-Smirnov test for normality. In the standard implementation of the Kolmogorov-Smirnov test, a sample is compared with a normal distribution whose parameters are the sample mean and the sample variance. We propose instead to select the mean and variance of the normal distribution that provide the closest fit to the data. This amounts to shifting and stretching the reference normal distribution so that it fits the data as well as possible. If this shifting and stretching does not lead to an acceptable fit, the data are probably not normal. We also introduce a fast, easily implementable algorithm for the proposed test. A study of the power of the proposed test indicates that it discriminates well between the normal distribution and distributions that differ from it in shape, such as the uniform, bimodal, beta, exponential, and log-normal distributions, but has relatively lower power against the Student's t-distribution, which is similar in shape to the normal distribution. In model settings, the former distinction is typically more important to make than the latter. We demonstrate the practical significance of the proposed test with several simulated examples.

Keywords: Closest fit; Kolmogorov-Smirnov; Normal distribution
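The core idea of the closest-fit modification can be sketched in a few lines: rather than plugging in the sample mean and standard deviation, one searches over the mean and standard deviation for the normal distribution that minimizes the Kolmogorov-Smirnov distance to the sample. The snippet below is a minimal illustration of that idea, not the authors' fast algorithm; the use of SciPy, the Nelder-Mead optimizer, the log-sigma parameterization, and the function name are all assumptions, and the critical values of the minimized statistic would still have to be calibrated (e.g. by simulation) rather than read off the standard KS tables.

```python
# Minimal sketch of the "closest fit" idea (assumptions: SciPy, Nelder-Mead,
# log-sigma parameterization; this is NOT the authors' fast algorithm).
import numpy as np
from scipy import stats, optimize

def closest_fit_ks_statistic(x):
    """Return the minimal KS distance between the sample x and any N(mu, sigma^2),
    together with the minimizing mu and sigma."""
    x = np.asarray(x)

    def ks_distance(params):
        mu, log_sigma = params
        # KS distance between the empirical CDF of x and N(mu, exp(log_sigma)^2)
        return stats.kstest(x, "norm", args=(mu, np.exp(log_sigma))).statistic

    # Start from the usual moment estimates (the standard KS-for-normality choice).
    start = np.array([x.mean(), np.log(x.std(ddof=1))])
    result = optimize.minimize(ks_distance, start, method="Nelder-Mead")
    return result.fun, result.x[0], np.exp(result.x[1])

# Example: for a uniform sample the closest-fit KS distance stays comparatively
# large, which is what allows the test to reject shape-different alternatives.
rng = np.random.default_rng(0)
d_min, mu_hat, sigma_hat = closest_fit_ks_statistic(rng.uniform(size=200))
print(d_min, mu_hat, sigma_hat)
```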
