Most Gamma-Ray Bursts (GRBs) detected by the Fermi Gamma-ray Space Telescope
exhibit a delay of up to about 10 seconds between the trigger time of the hard
X-ray signal as measured by the Fermi GBM and the onset of the MeV-GeV
counterpart detected by the LAT. This delay may hint at important physics,
whether it is caused by the intrinsic variability of the inner engine or by
quantum dispersion effects in the propagation of light from
the sources to the observer. It is critical to assess properly how
these time delays affect the overall properties of the light curves. We
cross-correlated the GBM and LAT light curves of the 5 brightest GRBs in the First Fermi LAT Catalog by means of
the continuous correlation function (CCF) and of the Discrete Correlation
Function (DCF). A maximum in the DCF indicates a time lag
between the two curves, whose value and uncertainty we estimated through a
Gaussian fit of the DCF profile and through light-curve simulations via a Monte
Carlo approach. The cross-correlation of the observed LAT and GBM light curves
yields time lags that are mostly similar to those reported in the literature,
but they are formally consistent with zero. The cross-correlation of the
simulated light curves yields smaller errors on the time lags, as well as more
than one time lag for GRBs 090902B and 090926A; for all 5 GRBs, the time lags
are significantly different from zero and, when only the secondary maxima are
considered for those two GRBs, consistent with those reported in the literature.
The DCF method reveals time lags between the LAT and GBM
light curves and highlights their complexity. While this suggests that the
delays should be ascribed to intrinsic physical mechanisms, more sensitivity
and larger statistics are needed to assess whether time lags are universally
present in the early GRB emission and which dynamical time scales they trace.

Comment: 9 pages, 3 figures, accepted for publication in Astronomy & Astrophysics
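The abstract does not spell out the discrete correlation function itself; the following is a minimal sketch of the standard binned DCF in the spirit of Edelson & Krolik (1988), assuming two (possibly unevenly sampled) light curves given as NumPy arrays. The function name, normalization details, and lag binning here are illustrative choices, not the paper's actual implementation:

```python
import numpy as np

def dcf(t1, f1, t2, f2, lag_edges):
    """Binned discrete correlation function of two light curves.

    Positive lag means series 2 lags series 1. Illustrative sketch in the
    spirit of Edelson & Krolik (1988); not the paper's implementation.
    """
    f1, f2 = np.asarray(f1, float), np.asarray(f2, float)
    # Unbinned DCF: normalized product of mean-subtracted fluxes for every
    # pair of points (i from series 1, j from series 2).
    udcf = ((f1[:, None] - f1.mean()) * (f2[None, :] - f2.mean())
            / (f1.std() * f2.std()))
    dt = np.asarray(t2, float)[None, :] - np.asarray(t1, float)[:, None]
    centers = 0.5 * (lag_edges[:-1] + lag_edges[1:])
    vals = np.full(centers.size, np.nan)
    for i in range(centers.size):
        in_bin = (dt >= lag_edges[i]) & (dt < lag_edges[i + 1])
        if in_bin.any():
            # Average the unbinned DCF over all pairs falling in this lag bin.
            vals[i] = udcf[in_bin].mean()
    return centers, vals
```

The time lag is then read off as the location of the DCF maximum; the Gaussian fit of the peak and the Monte Carlo error estimate mentioned in the abstract would be applied on top of such a profile.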