It has recently been shown that in some DNA microarrays the time needed to
reach thermal equilibrium may greatly exceed the typical experimental time,
which is about 15 h in standard protocols (Hooyberghs et al., Phys. Rev. E 81,
012901 (2010)). In this paper we discuss how this breakdown of thermodynamic
equilibrium could be detected in microarray experiments without resorting to
real-time hybridization measurements, which are difficult to implement under standard
experimental conditions. The method is based on the analysis of the
distribution of fluorescence intensities I from different spots for probes
carrying base mismatches. In thermal equilibrium and at sufficiently low
concentrations, log I is expected to be linearly related to the hybridization
free energy ΔG with a slope equal to 1/RTexp, where Texp is
the experimental temperature and R is the gas constant. A breakdown of
equilibrium results in deviations from this law. A model for hybridization
kinetics that explains the observed experimental behavior, the so-called
3-state model, is discussed. It predicts that deviations from equilibrium yield a
proportionality of log I to ΔG/RTeff. Here, Teff is an
effective temperature, higher than the experimental one. This behavior is
indeed observed in some experiments on Agilent arrays. We analyze experimental
data from two other microarray platforms and discuss, on the basis of these
results, whether equilibrium is attained in these cases. Interestingly, the same
3-state model predicts a (dynamical) saturation of the signal at values below
the expected equilibrium value.
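
As an illustration only, the following Python sketch shows the kind of analysis the abstract describes: fitting the slope of log I versus ΔG across mismatched probes to extract an effective temperature Teff, whose deviation from Texp would signal a breakdown of equilibrium. This is not the authors' code; all numerical values, the assumed Texp, and the variable names are hypothetical placeholders.

# Minimal sketch (hypothetical data) of fitting log I vs. Delta G to extract
# an effective temperature. At equilibrium, log I = -Delta G/(R*T_exp) + const,
# so the fitted T_eff should match T_exp; T_eff well above T_exp would signal
# non-equilibrium (3-state-like) behavior.
import numpy as np

R = 1.987e-3      # gas constant, kcal/(mol K)
T_exp = 338.0     # assumed experimental hybridization temperature, K (placeholder)

# Hypothetical per-spot data for mismatched probes: hybridization free
# energies (kcal/mol) and background-subtracted fluorescence intensities.
delta_G = np.array([-12.0, -13.5, -15.0, -16.2, -17.8, -19.1])
intensity = np.array([30.0, 75.0, 190.0, 420.0, 1100.0, 2600.0])

# Linear fit of log I = -Delta G/(R*T_eff) + const; slope = -1/(R*T_eff).
slope, const = np.polyfit(delta_G, np.log(intensity), 1)
T_eff = -1.0 / (R * slope)

print(f"fitted slope       : {slope:.3f} mol/kcal")
print(f"effective T_eff    : {T_eff:.0f} K")
print(f"experimental T_exp : {T_exp:.0f} K")
# In this fabricated example T_eff comes out far above T_exp, i.e. the shallow
# dependence of log I on Delta G that the abstract associates with a lack of
# thermal equilibrium; T_eff close to T_exp would instead indicate equilibrium.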