The thermal X-ray spectra of several isolated neutron stars display
deviations from a pure blackbody. The correct physical interpretation of these
spectral features has profound implications for our understanding of the
atmospheric composition, the magnetic field strength and topology, and the equation of
state of dense matter. With specific details varying from source to source,
common explanations for the features range from atomic transitions in
magnetized atmospheres or a condensed surface to cyclotron lines generated in a
hot ionized layer near the surface. Here we quantitatively evaluate the X-ray
spectral distortions induced by inhomogeneous temperature distributions of the
neutron star surface. To this end, we explore several surface temperature
distributions, simulate their corresponding general relativistic X-ray
spectra (assuming isotropic blackbody emission), and fit the latter with a
single-blackbody model. We find that, in some cases, the presence of a spurious
'spectral line' is required at a high significance level in order to obtain
statistically acceptable fits, with central energy and equivalent width similar
to the values typically observed. We also perform a fit to a specific object,
RX J0806.4-4123, finding several surface temperature distributions able to
model the observed spectrum. This effect is unlikely to be at work in all
sources with detected lines, but in some cases it may indeed be responsible for
the appearance of such features. Our results reinforce the idea that surface
temperature anisotropy is an important factor that should be considered and
explored, also in combination with more sophisticated emission models such as
atmospheres.

Comment: 11 pages, 7 figures; accepted for publication in MNRAS
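The distortion mechanism described above can be illustrated with a toy calculation: summing blackbody spectra from a two-temperature surface and fitting the composite with a single blackbody leaves systematic, line-like residuals. This is only a minimal sketch, not the paper's actual pipeline — it omits general relativistic ray tracing and uses arbitrary normalizations and illustrative temperatures, not the values fitted in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def blackbody(E, kT, norm):
    """Planck photon spectrum in arbitrary units: N(E) ~ E^2 / (exp(E/kT) - 1)."""
    return norm * E**2 / np.expm1(E / kT)

# Energy grid in keV, covering a typical soft X-ray band.
E = np.linspace(0.1, 2.0, 200)

# Toy inhomogeneous surface: a hot region (kT = 0.15 keV) plus a cooler
# region (kT = 0.08 keV) with a larger emitting area (illustrative values).
composite = blackbody(E, 0.15, 1.0) + blackbody(E, 0.08, 5.0)

# Fit the composite spectrum with a SINGLE blackbody, as done in the paper.
popt, _ = curve_fit(blackbody, E, composite, p0=[0.1, 1.0])
single_bb = blackbody(E, *popt)

# Fractional residuals: the single blackbody cannot reproduce the composite,
# leaving broad systematic deviations that can mimic a spectral feature.
resid = (composite - single_bb) / composite
print("best-fit kT = %.3f keV" % popt[0])
print("max |fractional residual| = %.3f" % np.abs(resid).max())
```

The best-fit temperature lands between the two input temperatures, and the residuals are far from statistical noise: a real analysis would then require an additional 'line' component to make the fit acceptable, which is the spurious-feature effect the paper quantifies.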