Performance of a mixed Lagrange time delay estimation autoregressive (MLTDEAR) model for single-image signal-to-noise ratio estimation in scanning electron microscopy
A novel technique based on the statistical autoregressive (AR) model has recently been developed to estimate the signal-to-noise ratio (SNR) of scanning electron microscope (SEM) images. In a related study, the authors also developed an algorithm that cascades the AR model with the Lagrange time delay (LTD) estimator; this technique is named the mixed Lagrange time delay estimation autoregressive (MLTDEAR) model. In this paper, the fundamental performance limits for single-image SNR estimation, as derived from the Cramér-Rao inequality, are presented. We compared the experimental performance of several existing methods (the simple method, the first-order linear interpolator, the AR-based estimator, and the MLTDEAR method) against this performance bound. In a few test cases involving different images, the efficiency of the MLTDEAR single-image estimation technique proved significantly better than that of the other three methods. The effect of different SEM setting conditions on the autocorrelation function curve is also discussed.
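As context for the comparison, the first-order linear interpolator mentioned above can be sketched roughly as follows. The abstract does not give the exact formulation, so the row-wise autocorrelation estimate, the choice of lags, and the assumption of white (spatially uncorrelated) noise are all illustrative assumptions, not details taken from the paper; the helper names `acf_row` and `snr_first_order` are hypothetical.

```python
import numpy as np


def acf_row(image, max_lag=2):
    """Row-wise autocovariance of an image at lags 0..max_lag.

    Hypothetical helper: assumes a mean-removed, biased estimate
    averaged over all image rows.
    """
    x = image.astype(float) - image.mean()
    n = x.shape[1]
    return np.array([
        np.mean(x[:, : n - lag] * x[:, lag:]) for lag in range(max_lag + 1)
    ])


def snr_first_order(image):
    """Single-image SNR via first-order linear extrapolation of the ACF.

    Assumption (not stated in the abstract): the noise is white, so it
    inflates only the zero-lag autocorrelation value R(0). The noise-free
    peak is estimated by extrapolating R(1) and R(2) back to lag 0 with a
    straight line, and the SNR is the ratio of the estimated signal
    variance to the excess noise variance at zero lag.
    """
    r = acf_row(image, max_lag=2)
    r0_noise_free = 2.0 * r[1] - r[2]   # linear extrapolation to lag 0
    noise_var = r[0] - r0_noise_free    # excess at zero lag = noise power
    return r0_noise_free / noise_var    # SNR = signal variance / noise variance
```

The AR-based and MLTDEAR estimators replace the straight-line extrapolation with model-based estimates of the noise-free zero-lag value, which is where the performance differences reported against the Cramér-Rao bound arise.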