We investigate the connection between the time evolution of averages of
stochastic quantities and the Fisher information and its induced statistical
length. As a consequence of the Cramér-Rao bound, we find that the squared rate of
change of the average of any observable is bounded from above by its variance
times the temporal Fisher information. From this bound, we
obtain a speed limit on the evolution of stochastic observables: Changing the
average of an observable requires a minimum amount of time given by the change
in the average squared, divided by the fluctuations of the observable times the
thermodynamic cost of the transformation. In particular, for relaxation
dynamics, which do not depend on time explicitly, we show that the Fisher
information is a monotonically decreasing function of time and that this
minimal required time is determined by the initial preparation of the system.
We further show that the monotonicity of the Fisher information can be used to
detect hidden variables in the system and demonstrate our findings for simple
examples of continuous and discrete random processes.
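
For concreteness, the two inequalities summarized above can be sketched as follows, in notation assumed here rather than taken from the paper (p(x,t) for the time-dependent distribution, A for an observable, I_F for the temporal Fisher information, and C for the thermodynamic cost; the paper's own symbols and prefactors may differ):

% Temporal Fisher information of the distribution p(x,t).
\[
  I_F(t) \;=\; \int \mathrm{d}x\, p(x,t)\,\bigl(\partial_t \ln p(x,t)\bigr)^2 .
\]
% Cramér-Rao-type bound: the squared rate of change of the average of an
% observable A is bounded by its variance times I_F(t).
\[
  \left(\frac{\mathrm{d}\langle A\rangle_t}{\mathrm{d}t}\right)^{\!2}
  \;\le\; \mathrm{Var}_t(A)\, I_F(t) .
\]
% Integrating over [0,\tau] and applying the Cauchy-Schwarz inequality gives a
% speed limit, with the time-averaged variance
% \overline{\mathrm{Var}(A)} = \tau^{-1}\int_0^\tau \mathrm{Var}_t(A)\,\mathrm{d}t
% and the time-integrated Fisher information C = \int_0^\tau I_F(t)\,\mathrm{d}t
% taken here as the thermodynamic cost of the transformation:
\[
  \tau \;\ge\; \frac{\bigl(\langle A\rangle_\tau - \langle A\rangle_0\bigr)^2}
                    {\overline{\mathrm{Var}(A)}\; C} .
\]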