Asymptotic estimate of probability of misclassification for discriminant rules based on density estimates

Abstract

Let X1, ..., Xl and Y1, ..., Yn be independent random samples from the distribution functions (d.f.) F and G respectively. Assume that F' = f and G' = g. The discriminant rule that classifies an independently sampled observation Z to F if f̂(Z) > ĝ(Z) and to G otherwise, where f̂ and ĝ are estimates of f and g based on a common kernel function and the training X- and Y-samples, is considered optimal in some sense. Let Pf denote the probability measure under the assumption that Z ~ F, and set P0 = Pf(f(Z) > g(Z)) and PN = Pf(f̂(Z) > ĝ(Z)). In this article we derive the rate at which PN → P0 as N = l + n → ∞ for the situation where l = n, F(x) = M(x − θ2) and G(x) = M(x − θ1) for some symmetric d.f. M and parameters θ1, θ2. We examine a few special cases of M and establish that the rate of convergence of PN to P0 depends critically on the tail behaviour of m = M'.

Keywords: optimal classification rule, probability of misclassification, kernel function, density estimates
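
The following is a minimal sketch of the discriminant rule described above, not taken from the article itself. It assumes M is the standard normal d.f. with θ1 = 0 and θ2 = 1 (so F and G are unit-variance normals), uses a Gaussian kernel via scipy.stats.gaussian_kde with its default bandwidth, and estimates PN by Monte Carlo; the names f_hat, g_hat, and classify are illustrative.

```python
import numpy as np
from scipy.stats import gaussian_kde, norm

rng = np.random.default_rng(0)

# Assumed setup: M = standard normal d.f., theta1 = 0, theta2 = 1,
# so F = N(1, 1) and G = N(0, 1); l = n = 200 training observations each.
theta1, theta2, n = 0.0, 1.0, 200
x_train = rng.normal(theta2, 1.0, size=n)   # X-sample from F
y_train = rng.normal(theta1, 1.0, size=n)   # Y-sample from G

# Kernel density estimates f-hat and g-hat built from a common
# (Gaussian) kernel and the training samples.
f_hat = gaussian_kde(x_train)
g_hat = gaussian_kde(y_train)

def classify(z):
    """Assign z to F when f-hat(z) > g-hat(z), and to G otherwise."""
    return "F" if f_hat(z)[0] > g_hat(z)[0] else "G"

# Monte Carlo estimate of PN = Pf(f-hat(Z) > g-hat(Z)): the probability
# that the estimated rule classifies a fresh Z ~ F to F.
z_new = rng.normal(theta2, 1.0, size=10_000)
p_N = np.mean(f_hat(z_new) > g_hat(z_new))

# Limiting value P0 = Pf(f(Z) > g(Z)); in this normal case the ideal rule
# classifies to F exactly when Z exceeds the midpoint (theta1 + theta2) / 2.
p_0 = 1.0 - norm.cdf((theta1 + theta2) / 2, loc=theta2, scale=1.0)

print(f"P_N (Monte Carlo) = {p_N:.4f},  P_0 (exact) = {p_0:.4f}")
```

Rerunning the simulation with larger l = n shows PN drifting toward P0, which is the convergence whose rate the article quantifies.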
