A quantitative theory of spectral lags for γ-ray bursts (GRBs) is
given. The underlying hypothesis is that GRB subpulses are photons that are
scattered into our line of sight by local concentrations of baryons that are
accelerated by radiation pressure. For primary spectra that are power laws with
exponential cutoffs, the width of the pulse and its fast-rise, slow-decay
asymmetry are found to increase with decreasing photon energy, and the width
near the exponential cutoff scales approximately as $E_{\rm ph}^{-\eta}$, with
$\eta \sim 0.4$, as observed. The spectral lag time is naturally inversely
proportional to luminosity, all else being equal, also as observed.
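
As an illustrative restatement of the quoted scalings only (the symbols $\Delta t$ for pulse width, $\tau_{\rm lag}$ for spectral lag, and $L$ for luminosity are introduced here for clarity; the abstract does not specify normalizations):
\[
\Delta t \propto E_{\rm ph}^{-\eta}, \quad \eta \sim 0.4, \qquad \tau_{\rm lag} \propto L^{-1}.
\]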