An important theme in recent work in asymptotic geometric analysis is that
many classical implications between different types of geometric or functional
inequalities can be reversed in the presence of convexity assumptions. In this
note, we explore the extent to which different notions of distance between
probability measures are comparable for log-concave distributions. Our results
imply that weak convergence of isotropic log-concave distributions is
equivalent to convergence in total variation, and is further equivalent to
convergence in relative entropy when the limit measure is Gaussian.

Comment: v3: Minor tweak in exposition. To appear in the GAFA seminar notes.