We investigate the form and evolution of the X-ray luminosity-temperature
(LT) relation of a sample of 114 galaxy clusters observed with Chandra at
0.1<z<1.3. The clusters were divided into subsamples based on their X-ray
morphology or whether they host strong cool cores. We find that when the core
regions are excluded, the most relaxed clusters (or those with the strongest
cool cores) follow an LT relation with a slope that agrees well with simple
self-similar expectations. This is supported by an analysis of the gas density
profiles of the systems, which shows self-similar behaviour of the gas profiles
of the relaxed clusters outside the core regions. By comparing our data with
clusters in the REXCESS sample, which extends to lower masses, we find evidence
that the self-similar behaviour of even the most relaxed clusters breaks down at
around 3.5 keV. By contrast, the LT slopes of the subsamples of unrelaxed
systems (or those without strong cool cores) are significantly steeper than the
self-similar model, with lower mass systems appearing less luminous and higher
mass systems appearing more luminous than the self-similar relation. We argue
that these results are consistent with a model of non-gravitational energy
input in clusters that combines central heating with entropy enhancements from
merger shocks. Such enhancements could extend the impact of central energy
input to larger radii in unrelaxed clusters, as suggested by our data. We also
examine the evolution of the LT relation, and find that while the data appear
inconsistent with simple self-similar evolution, the differences can be
plausibly explained by selection bias, and thus we find no reason to rule out
self-similar evolution. We show that the fraction of cool core clusters in our
(non-representative) sample decreases at z>0.5 and discuss the effect of this
on measurements of the evolution in the LT relation.
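For reference, the simple self-similar expectation mentioned above can be sketched as follows (a standard derivation assuming bremsstrahlung-dominated bolometric emission and gas density scaling with the critical density; the paper's exact conventions may differ):

  L_X \propto E(z)\, T^{2}, \qquad E(z) \equiv \frac{H(z)}{H_0} = \left[\Omega_{\rm m}(1+z)^3 + \Omega_\Lambda\right]^{1/2},

so that at fixed redshift the self-similar LT slope is 2, and self-similar evolution corresponds to the normalisation of the relation scaling with E(z). Steeper observed slopes at low masses are then naturally read as the signature of non-gravitational energy input.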