The mean square error between the recorded and reproduced signals is used as the error measure for determining the effects of time-base perturbations on an analog signal. The mean square error caused by time-base perturbations is shown to be proportional to the product of the square of the signal bandwidth and the time-base error variance for the case of a low-pass signal. When the signal is band-pass, an additional error term is shown to arise which is proportional to the product of the square of the signal center frequency and the time-base error variance.
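The stated scaling can be checked for the simplest band-pass case, a single sinusoid at the center frequency, with a small Monte-Carlo sketch. All parameter values below (frequency, jitter variance, sample count) are illustrative assumptions, not values from the thesis:

```python
import math, random

random.seed(0)

f0 = 10.0            # assumed signal (center) frequency, Hz -- illustrative
w0 = 2 * math.pi * f0
sigma = 1e-3         # assumed time-base error standard deviation, seconds

# Monte-Carlo estimate of E[(s(t + d) - s(t))^2] for s(t) = cos(w0 t),
# with timing jitter d ~ N(0, sigma^2) and t uniform over one period.
n = 200_000
mse = 0.0
for _ in range(n):
    t = random.uniform(0.0, 1.0 / f0)
    d = random.gauss(0.0, sigma)
    mse += (math.cos(w0 * (t + d)) - math.cos(w0 * t)) ** 2
mse /= n

# Small-jitter theory: s(t + d) - s(t) ~ -d * w0 * sin(w0 t), so the mean
# square error is w0^2 * sigma^2 * E[sin^2] = w0^2 * sigma^2 / 2, i.e.
# proportional to the squared frequency times the time-base error variance.
theory = 0.5 * w0 ** 2 * sigma ** 2
print(mse, theory)
```

For small jitter the simulated mean square error tracks the quadratic law closely; doubling either `f0` or `sigma` quadruples it.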
Calculations are also carried out to determine the relative effects of external additive noise introduced before and after the recorder. This external noise is found to add a term to the mean square error approximately equal to the noise power.
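The additive decomposition can be illustrated by extending the jittered-sinusoid sketch with independent external noise; because the noise is independent of the timing error, its power simply adds to the jitter term. Parameter values are again illustrative assumptions:

```python
import math, random

random.seed(1)

f0 = 10.0            # illustrative signal frequency, Hz
w0 = 2 * math.pi * f0
sigma = 1e-3         # illustrative time-base error standard deviation, s
noise_power = 1e-3   # illustrative external additive noise power

# Reproduced signal = time-base-perturbed signal plus independent additive
# noise; the error is measured against the recorded signal cos(w0 t).
n = 200_000
mse = 0.0
for _ in range(n):
    t = random.uniform(0.0, 1.0 / f0)
    d = random.gauss(0.0, sigma)
    noise = random.gauss(0.0, math.sqrt(noise_power))
    mse += (math.cos(w0 * (t + d)) + noise - math.cos(w0 * t)) ** 2
mse /= n

# Independence kills the cross term, so the total mean square error is
# approximately (jitter term) + (noise power).
jitter_term = 0.5 * w0 ** 2 * sigma ** 2
print(mse, jitter_term + noise_power)
```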
An analysis is made to determine the error reduction which is possible by the use of the optimum linear filter. It is shown that a significant improvement is possible for the case where the signal bandwidth is less than the time-base error bandwidth. Practical approximations to the optimum linear filter are also considered and, in some cases, are found to give a reduction in the mean square error approximately equal to that given by the optimum filter. --Abstract, pages ii-iii
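The bandwidth condition can be illustrated with a per-frequency-bin sketch of the optimum linear (Wiener) filter. The rectangular unit-density spectra below are illustrative assumptions, not the spectra analyzed in the thesis:

```python
# Per-bin sketch of the optimum linear (Wiener) filter gain
# H_k = S_k / (S_k + N_k), where S_k is the signal power spectrum and N_k
# the time-base-error-induced noise spectrum in bin k.  The residual mean
# square error of the Wiener gain in bin k is S_k * N_k / (S_k + N_k).
def mse_without_and_with_wiener(signal_band, noise_band, n_bins=1000):
    # Signal occupies the first `signal_band` fraction of the bins, noise
    # the first `noise_band` fraction, both with unit power density.
    unfiltered = 0.0
    wiener = 0.0
    for k in range(n_bins):
        f = k / n_bins
        S = 1.0 if f < signal_band else 0.0
        N = 1.0 if f < noise_band else 0.0
        unfiltered += N                  # no filtering: all noise passes
        if S + N > 0.0:
            wiener += S * N / (S + N)    # residual MSE under Wiener gain
    return unfiltered / n_bins, wiener / n_bins

# Narrow signal, wide time-base-error spectrum: most noise is rejected.
narrow = mse_without_and_with_wiener(signal_band=0.1, noise_band=0.8)
# Signal as wide as the noise: the filter gains much less.
wide = mse_without_and_with_wiener(signal_band=0.8, noise_band=0.8)
print(narrow, wide)
```

In the narrow-signal case the filter removes the out-of-band noise entirely and halves the in-band portion, so the improvement is large; when the bands coincide, only the uniform halving remains.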