Atomic arrays can exhibit collective light emission when the transition
wavelength exceeds their lattice spacing. Subradiant states exploit this
phenomenon to drastically reduce their overall decay rate, allowing for
long-lived states in dissipative open systems. We build on previous work to
investigate whether disorder can further decrease the decay rate of a
singly excited atomic array. More specifically, we consider spatial disorder of
varying strengths in a 1D half waveguide and in 1D, 2D, and 3D atomic arrays in
free space, and analyze its effect on the most subradiant modes. While we
confirm that the dilute half waveguide exhibits an analog of Anderson
localization, the dense half waveguide and free-space systems can be understood
through the creation of close-packed, few-body subradiant states similar to
those found in the Dicke limit. In general, we find that disorder provides
little advantage, on average, in generating darker subradiant states in free
space and often accelerates decay. However, one could potentially tune the
interatomic spacing within the array to engineer specific subradiant states.
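
As a rough illustration of the kind of calculation the abstract describes, the
sketch below estimates the most subradiant single-excitation decay rate of a
disordered atomic chain on a half (semi-infinite) waveguide. It assumes the
standard effective non-Hermitian Hamiltonian of waveguide QED, with a
mirror-image term for the waveguide termination and decay rates given by
-2 Im(lambda_n) of its eigenvalues; the function name `min_decay_rate`, the
Gaussian disorder model, and all parameter values are illustrative assumptions,
not the paper's actual model.

```python
import numpy as np

def min_decay_rate(positions, k0=2 * np.pi, gamma=1.0, mirror=True):
    """Most subradiant single-excitation decay rate of atoms on a waveguide.

    Assumes the standard effective Hamiltonian H_jk = -i(gamma/2) exp(i k0 |x_j - x_k|);
    for a half waveguide, a mirror-image term is subtracted (mirror at x = 0).
    """
    x = np.asarray(positions, dtype=float)
    d = np.abs(x[:, None] - x[None, :])
    H = -0.5j * gamma * np.exp(1j * k0 * d)
    if mirror:
        # Contribution from each atom's image behind the mirror
        H += 0.5j * gamma * np.exp(1j * k0 * (x[:, None] + x[None, :]))
    # Decay rates are -2 Im(lambda_n); return the smallest (darkest mode)
    return (-2.0 * np.linalg.eigvals(H).imag).min()

rng = np.random.default_rng(0)
N, a = 20, 0.25  # atom number; lattice spacing in units of the wavelength
for s in (0.0, 0.05, 0.1):  # disorder strength as a fraction of the spacing
    x0 = a * np.arange(1, N + 1)  # ordered chain, offset from the mirror
    samples = [min_decay_rate(x0 + s * a * rng.standard_normal(N))
               for _ in range(200)]
    print(f"disorder {s:.2f}: mean minimal rate = {np.mean(samples):.3e}")
```

Averaging the minimal rate over many disorder realizations, as done here, is
one simple way to probe whether spatial disorder darkens or brightens the most
subradiant mode on average.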