Digital background calibration algorithm and its FPGA implementation for timing mismatch correction of time-interleaved ADC

Abstract

Sampling-time error can degrade the performance of time-interleaved analog-to-digital converters (TIADCs). This paper presents a fully digital background algorithm, together with its hardware implementation, that estimates and corrects the timing mismatch errors between four interleaved channels. The proposed algorithm combines a low computational burden with high performance and is based on a simplified representation of the coefficients of the Lagrange interpolator. Simulation results show that it suppresses error tones across the entire Nyquist band. For a four-channel TIADC with 10-bit resolution and an input signal frequency of 0.09fs, the proposed algorithm improves the signal-to-noise-and-distortion ratio (SNDR) and the spurious-free dynamic range (SFDR) by 19.27 dB and 35.2 dB, respectively. For an input signal frequency of 0.45fs, the SNDR and SFDR improve by 33.06 dB and 43.14 dB, respectively. In addition to the simulations, the algorithm was implemented in hardware for real-time evaluation. Its low computational burden allowed an FPGA implementation with low logic-resource usage and a high system clock speed (926.95 MHz for the four-channel implementation). The proposed architecture can therefore be used as a post-processing algorithm in host processors of data acquisition systems to improve the performance of TIADCs.
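For intuition about the correction step, the following is a minimal, self-contained Python sketch: each skewed sub-channel is resampled back onto the ideal time grid with a Lagrange fractional-delay FIR filter. This is not the paper's implementation (which estimates the skews in the background and uses a simplified coefficient representation suited to FPGA hardware); the skew values, filter order, and test-tone frequency below are assumptions chosen for illustration, and the skews are treated as known.

import numpy as np

def lagrange_fd_taps(delay, order):
    # Closed-form Lagrange fractional-delay FIR taps:
    # h[n] = prod_{k != n} (delay - k) / (n - k), for n = 0..order.
    n = np.arange(order + 1)
    taps = np.ones(order + 1)
    for k in range(order + 1):
        mask = n != k
        taps[mask] *= (delay - k) / (n[mask] - k)
    return taps

fs = 1.0                                     # normalized overall sampling rate
M = 4                                        # number of interleaved channels
N = 4096                                     # samples per channel
f_in = 0.09 * fs                             # test tone, matching the reported 0.09fs case
skew = np.array([0.0, 0.02, -0.015, 0.01])   # per-channel skew in fractions of Ts (assumed known)

# Ideal interleaved time grid, then perturb each channel's sampling instants.
n_all = np.arange(M * N)
t_ideal = n_all / fs
t_skewed = t_ideal + skew[n_all % M] / fs
x_skewed = np.sin(2 * np.pi * f_in * t_skewed)
x_ideal = np.sin(2 * np.pi * f_in * t_ideal)

# Correct each channel with an order-6 Lagrange fractional-delay FIR.
# A skew of +delta*Ts at the overall rate is delta/M samples at the channel
# rate, so the total filter delay is the bulk integer delay plus skew/M.
order = 6
x_corr = np.empty_like(x_skewed)
for m in range(M):
    ch = x_skewed[m::M]                      # channel m runs at fs / M
    d = order // 2 + skew[m] / M             # bulk delay 3 plus fractional part
    h = lagrange_fd_taps(d, order)
    # Compensate the integer bulk delay when slicing the convolution output.
    x_corr[m::M] = np.convolve(ch, h)[order // 2 : order // 2 + N]

sl = slice(16 * M, -16 * M)                  # ignore filter edge effects
print("RMS error before:", np.sqrt(np.mean((x_skewed - x_ideal)[sl] ** 2)))
print("RMS error after :", np.sqrt(np.mean((x_corr - x_ideal)[sl] ** 2)))

Running the sketch shows the residual error after correction dropping well below the skew-induced error, which is the same effect the paper quantifies via SNDR and SFDR improvements.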
