A fundamental limit to the sensitivity of optical interferometers is imposed
by Brownian thermal fluctuations of the mirrors' surfaces. This thermal noise
can be reduced by using larger beams which "average out" the random
fluctuations of the surfaces. It has been proposed previously that wider,
higher-order Laguerre-Gaussian modes can be used to exploit this effect. In
this article, we show that susceptibility to spatial imperfections of the
mirrors' surfaces limits the effectiveness of this approach in interferometers
used for gravitational-wave detection. Possible methods of reducing this
susceptibility are also discussed.

Comment: 10 pages, 11 figures