Pfanzagl and Wefelmeyer (1978) show that bias-corrected maximum likelihood (ML) estimators are higher-order efficient. Their procedure, however, is computationally demanding because it requires integrating complicated functions over the distribution of the ML estimator. The purpose of this paper is to show that these integrals can be replaced by sample averages without affecting the higher-order variance. We focus on bootstrap- and jackknife-based bias correction as ways to implement the correction nonparametrically. We find that our bootstrap and jackknife bias-corrected ML estimators have the same higher-order variance as the efficient estimator of Pfanzagl and Wefelmeyer. Bias-corrected ML estimators are therefore higher-order efficient even if the bias function is estimated from the data rather than computed analytically.
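To illustrate the idea of replacing an analytic bias calculation with a sample average, the following is a minimal sketch of jackknife bias correction applied to a simple ML estimator. The example (the MLE of an exponential rate, which is biased upward in finite samples) and all function names are illustrative choices, not the paper's own implementation; the jackknife formula itself is the standard one, θ̂_jack = θ̂ − (n−1)(mean of leave-one-out estimates − θ̂).

```python
import numpy as np

def mle_rate(x):
    # MLE of the exponential rate lambda is 1 / sample mean;
    # its finite-sample bias is lambda / (n - 1), i.e. of order 1/n.
    return 1.0 / x.mean()

def jackknife_bias_corrected(x, estimator):
    # Estimate the O(1/n) bias from leave-one-out recomputations of the
    # estimator (a sample average), instead of an analytic integral,
    # and subtract it from the original estimate.
    n = len(x)
    theta_hat = estimator(x)
    loo = np.array([estimator(np.delete(x, i)) for i in range(n)])
    bias_estimate = (n - 1) * (loo.mean() - theta_hat)
    return theta_hat - bias_estimate

# Illustrative simulation: true rate lambda = 1, sample size n = 50.
rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=50)
print("raw MLE:      ", mle_rate(x))
print("bias-corrected:", jackknife_bias_corrected(x, mle_rate))
```

Averaged over repeated samples, the corrected estimator removes the leading 1/n bias term of the raw MLE while leaving its first-order variance unchanged, which is the sense in which a data-based bias correction can match an analytic one.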