2 research outputs found

    The weighted Hellinger distance in the multivariate kernel density estimation

    Kernel density estimation is an important technique for estimating a multivariate density function. In this investigation we use the Hellinger distance as the error criterion for evaluating the estimator, derive the mean weighted Hellinger distance of the estimator, and obtain the bandwidth that is optimal with respect to the Hellinger distance. We also propose and study a new technique for selecting the bandwidth matrix based on the Hellinger distance, and compare it with the plug-in and least squares techniques.
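
    As an illustration, the sketch below computes a weighted Hellinger distance between a Gaussian-kernel density estimate and a known bivariate normal target, then picks a bandwidth matrix by brute-force search over that criterion. The weight function (taken here to be the true density itself), the Gaussian kernel, the target density, and the grid-based numerical integration are all assumptions made for the sketch, not details stated in the abstract.

    import numpy as np
    from scipy.stats import multivariate_normal

    rng = np.random.default_rng(0)
    n, d = 300, 2
    true_dist = multivariate_normal(mean=np.zeros(d), cov=np.eye(d))
    sample = true_dist.rvs(size=n, random_state=rng)

    # Grid for numerical integration of the distance.
    axes = [np.linspace(-4.0, 4.0, 61)] * d
    grid = np.stack(np.meshgrid(*axes, indexing="ij"), axis=-1).reshape(-1, d)
    cell = (axes[0][1] - axes[0][0]) ** d
    f_true = true_dist.pdf(grid)

    def kde(points, H):
        # Gaussian-kernel density estimate with bandwidth matrix H.
        dens = np.zeros(len(points))
        for xi in sample:
            dens += multivariate_normal(mean=xi, cov=H).pdf(points)
        return dens / n

    def weighted_hellinger(f_hat, f, w):
        # Riemann-sum approximation of the integral of (sqrt(f_hat) - sqrt(f))^2 * w(x).
        return np.sum((np.sqrt(f_hat) - np.sqrt(f)) ** 2 * w) * cell

    # Brute-force search over bandwidth matrices of the form H = h^2 I
    # (the paper's selector chooses a full matrix; this is only a sketch).
    best = None
    for h in np.linspace(0.15, 0.8, 14):
        H = (h ** 2) * np.eye(d)
        dist = weighted_hellinger(kde(grid, H), f_true, w=f_true)
        if best is None or dist < best[1]:
            best = (h, dist)
    print(f"Hellinger-optimal h = {best[0]:.2f}, weighted Hellinger distance = {best[1]:.2e}")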

    Mean Hellinger Distance as an Error Criterion in Univariate and Multivariate Kernel Density Estimation

    Ever since the pioneering work of Parzen, the mean squared error (MSE) and its integrated form (MISE) have been the standard error criteria for choosing the bandwidth matrix in multivariate kernel density estimation. More recently, other criteria, such as the mean absolute error, have been advocated as competitors to the MISE. In this study we define a weighted version of the Hellinger distance for multivariate densities and show that, under weak smoothness conditions on the multivariate density f, its asymptotic form is one-fourth the asymptotic MISE. The proposed criterion also gives rise to a new data-dependent bandwidth matrix selector, whose performance is compared through simulation with well-known selectors such as least squares cross-validation (LSCV) and the plug-in selector (HPI). We derive a closed-form formula for the mean Hellinger distance (MHD) in the univariate case. We also compare, via simulation, the mean weighted Hellinger distance (MWHD) with its asymptotic form, and the MISE with its asymptotic form, in both the univariate and bivariate cases for various densities and sample sizes.
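
    As an illustration of the one-fourth relation stated above, the sketch below estimates the MISE and the mean weighted Hellinger distance by Monte Carlo for a univariate Gaussian-kernel estimator of a standard normal density. The choice of weight (the true density f), the normal-reference bandwidth, and the Riemann-sum integration are assumptions made for the sketch; the paper's exact definitions may differ.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(1)
    n, reps = 500, 200
    x = np.linspace(-4.0, 4.0, 801)
    dx = x[1] - x[0]
    f = norm.pdf(x)                     # true density (standard normal)
    h = 1.06 * n ** (-1 / 5)            # normal-reference bandwidth

    ise, whd = [], []
    for _ in range(reps):
        sample = rng.standard_normal(n)
        # Gaussian-kernel density estimate evaluated on the grid.
        f_hat = norm.pdf((x[:, None] - sample[None, :]) / h).mean(axis=1) / h
        ise.append(np.sum((f_hat - f) ** 2) * dx)
        whd.append(np.sum((np.sqrt(f_hat) - np.sqrt(f)) ** 2 * f) * dx)

    print(f"MISE estimate  : {np.mean(ise):.2e}")
    print(f"MWHD estimate  : {np.mean(whd):.2e}")
    print(f"4 * MWHD / MISE: {4 * np.mean(whd) / np.mean(ise):.2f} (close to 1 for large n)")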