Each LHC experiment will produce datasets with sizes of order one petabyte
per year. All of these data must be stored, processed, transferred, simulated,
and analyzed, requiring a computing system on a larger scale than has ever
been mounted for any particle physics experiment, and possibly for any enterprise in
the world. I discuss how CMS has chosen to address these challenges, focusing
on recent tests of the system that demonstrate the experiment's readiness for
producing physics results with the first LHC data.

Comment: To be published in the proceedings of DPF-2009, Detroit, MI, July
2009, eConf C09072