We study statistical properties of the number of large earthquakes over the
past century. We analyze the cumulative distribution of the number of
earthquakes with magnitude larger than a threshold M occurring in a time
interval T, and quantify the statistical significance of the results by
simulating a large number of synthetic random catalogs. We find that, in general, the earthquake
record cannot be distinguished from a process that is random in time. This
conclusion holds whether or not aftershocks are removed, except at magnitude
thresholds below M = 7.3: at long time intervals (T = 2-5 years),
statistically significant clustering is present in the catalog for the lower
magnitude thresholds (M = 7-7.2). However, this clustering is due to a large
number of earthquakes on record in the early part of the 20th century, when
magnitudes are less certain.
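
To make the significance test concrete, the following is a minimal sketch of the kind of Monte Carlo procedure described above, assuming a uniform-in-time (stationary random) null model. The catalog duration, the event times, and the choice of the maximum event count in a sliding window of length T as the test statistic are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def max_count_in_window(times, T):
    """Maximum number of events falling in any window of length T."""
    times = np.sort(np.asarray(times, dtype=float))
    # For each event i, the count in [t_i, t_i + T] is
    # searchsorted(t_i + T) - i; take the maximum over all windows.
    right = np.searchsorted(times, times + T, side="right")
    return int(np.max(right - np.arange(len(times))))

def clustering_p_value(observed_times, T, duration, n_catalogs=10000, seed=0):
    """Fraction of synthetic random catalogs (same number of events placed
    uniformly in time) whose maximum window count reaches the observed one."""
    rng = np.random.default_rng(seed)
    n_events = len(observed_times)
    observed = max_count_in_window(observed_times, T)
    hits = 0
    for _ in range(n_catalogs):
        synthetic = rng.uniform(0.0, duration, size=n_events)
        if max_count_in_window(synthetic, T) >= observed:
            hits += 1
    return hits / n_catalogs

# Hypothetical example: 90 events with M >= 7 over a 100-year catalog,
# tested for clustering at a T = 3-year window.
if __name__ == "__main__":
    rng = np.random.default_rng(42)
    example_times = rng.uniform(0.0, 100.0, size=90)  # placeholder catalog
    print(clustering_p_value(example_times, T=3.0, duration=100.0))
```

Conditioning each synthetic catalog on the observed number of events keeps the comparison focused purely on the timing of events; other test statistics, such as the full distribution of window counts, fit the same framework.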