Slepian and Sudakov-Fernique type inequalities, which compare expectations of
maxima of Gaussian random vectors under certain restrictions on the covariance
matrices, play an important role in probability theory, especially in empirical
process and extreme value theories. Here we give explicit comparisons of
expectations of smooth functions and distribution functions of maxima of
Gaussian random vectors without any restriction on the covariance matrices. We
also establish an anti-concentration inequality for the maximum of a Gaussian
random vector, which yields a useful upper bound on the Lévy concentration
function of the Gaussian maximum. The bound is dimension-free and applies to
vectors with arbitrary covariance matrices. This anti-concentration inequality
plays a crucial role in establishing bounds on the Kolmogorov distance between
maxima of Gaussian random vectors. These results have immediate applications in
mathematical statistics. As an application, we establish a
conditional multiplier central limit theorem for maxima of sums of independent
random vectors where the dimension of the vectors is possibly much larger than
the sample size.

Comment: 22 pages; discussions and references updated
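As a rough numerical illustration of the anti-concentration phenomenon discussed above, the sketch below estimates the Lévy concentration function sup_x P(|max_j X_j − x| ≤ ε) of a Gaussian maximum by Monte Carlo. The dimension p, simulation size n, identity covariance, and ε are illustrative choices, not values from the paper; the code only demonstrates that the concentration function is small, on the order of ε times a slowly growing factor in the dimension.

```python
import numpy as np

rng = np.random.default_rng(0)

def levy_concentration(max_samples, eps, grid):
    # Empirical Levy concentration function of the maximum M:
    # sup_x P(|M - x| <= eps), with the supremum approximated over a grid.
    return max(np.mean(np.abs(max_samples - x) <= eps) for x in grid)

p, n = 1000, 20000                 # dimension and Monte Carlo sample size (illustrative)
X = rng.standard_normal((n, p))    # rows are i.i.d. N(0, I_p) vectors (a simple covariance choice)
M = X.max(axis=1)                  # maxima of the Gaussian vectors

eps = 0.05
grid = np.linspace(M.min(), M.max(), 200)
conc = levy_concentration(M, eps, grid)
# Heuristically, the bound suggests conc = O(eps * sqrt(log p));
# here sqrt(log 1000) is about 2.6, so conc should be a small multiple of eps.
print(conc)
```

In this identity-covariance example the maximum concentrates near sqrt(2 log p), yet the probability of landing in any fixed window of width 2ε stays small, which is the content of the dimension-free anti-concentration bound.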