In this paper, we further investigate the relationship, reported by Oates et al., between the optical/UV afterglow luminosity (measured at rest-frame 200 s) and the average afterglow decay rate (measured from rest-frame 200 s onwards) of long-duration gamma-ray bursts (GRBs). We extend the analysis by examining the X-ray light curves, finding a consistent correlation. We therefore explore how the parameters of these correlations relate to the prompt emission phase and, using a Monte Carlo simulation, test whether these correlations are consistent with the predictions of the standard afterglow model. We find significant correlations between log LO,200 s and log LX,200 s, and between αO,>200 s and αX,>200 s, consistent with the simulations. The model also predicts relationships between log Eiso and log L200 s; however, while we find such relationships in the observed sample, the slope of the linear regression is shallower than that simulated and inconsistent at ≳3σ. The simulations also do not agree with the correlations observed between log L200 s and α>200 s, or
between log Eiso and α>200 s. Overall, these observed correlations are consistent with a common underlying physical mechanism producing GRBs and their afterglows, regardless of their detailed temporal behaviour. However, a basic afterglow model has difficulty explaining all of the observed correlations, which leads us to briefly discuss alternative, more complex models.
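A minimal sketch of the kind of Monte Carlo comparison described above, assuming a slow-cooling, constant-density forward-shock closure relation; every parameter distribution, luminosity scaling, and scatter term below is an illustrative assumption, not a value taken from the paper:

```python
# Illustrative sketch (not the authors' code): draw synthetic bursts from a
# basic forward-shock afterglow model and measure the resulting correlations,
# to be compared against those measured from the observed sample.

import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_bursts = 2000

# Assumed log-normal spread in isotropic-equivalent energy (log10 Eiso in erg).
log_Eiso = rng.normal(loc=53.0, scale=0.8, size=n_bursts)

# Assumed spread in the electron energy index p.
p = rng.normal(loc=2.3, scale=0.2, size=n_bursts)

# Standard slow-cooling, constant-density closure relation (nu_m < nu < nu_c):
# temporal decay index alpha = 3(p - 1)/4, with flux F ~ t^(-alpha).
alpha = 3.0 * (p - 1.0) / 4.0

# Purely illustrative luminosity scaling: L_200s tracks Eiso with Gaussian scatter.
log_L200 = log_Eiso - 7.0 + rng.normal(scale=0.3, size=n_bursts)

# Rank correlation between luminosity and decay rate, and the Eiso-L200s slope.
rho, pval = stats.spearmanr(log_L200, alpha)
slope, intercept, rvalue, _, stderr = stats.linregress(log_Eiso, log_L200)

print(f"Simulated Spearman rho(log L200s, alpha): {rho:.2f} (p = {pval:.2g})")
print(f"Simulated log Eiso vs log L200s slope: {slope:.2f} +/- {stderr:.2f}")
```

The simulated slope and rank correlation would then be compared with the observed values to quantify the level of (dis)agreement, e.g. in units of the combined uncertainty.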