Approximating conditional density functions using dimension reduction

Abstract

We propose to approximate the conditional density function of a random variable Y given a dependent random d-vector X by that of Y given θ^τ X, where the unit vector θ is selected so that the average Kullback–Leibler discrepancy between the two conditional density functions is minimized. Our approach is nonparametric as far as the estimation of the conditional density functions is concerned. We show that this nonparametric estimator is asymptotically adaptive to the unknown index θ, in the sense that its first-order asymptotic mean squared error is the same as if θ were known. The proposed method is illustrated using both simulated and real-data examples.
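To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of the approach described above: the conditional density of Y given θ^τ X is estimated with kernel smoothers, and θ is chosen on the unit sphere to maximize the average fitted log-density, one empirical surrogate for minimizing the average Kullback–Leibler discrepancy. The function names, bandwidths, and the Nelder–Mead optimizer are illustrative assumptions, not part of the paper.

```python
# Sketch: approximate f(y | x) by f(y | theta' x), choosing the unit vector theta
# that maximizes the average fitted log conditional density (an empirical
# surrogate for minimizing the average Kullback-Leibler discrepancy).
import numpy as np
from scipy.optimize import minimize

def kernel_cond_density(y_grid, y, z, h_y, h_z):
    """Nadaraya-Watson estimate of f(y_grid | z_i) for each observation z_i.

    y, z: 1-d arrays of length n; returns an (n, len(y_grid)) array.
    """
    # Gaussian kernel weights in the conditioning variable z
    w = np.exp(-0.5 * ((z[:, None] - z[None, :]) / h_z) ** 2)        # (n, n)
    w /= w.sum(axis=1, keepdims=True)
    # Gaussian kernel in y, evaluated on the grid
    k_y = np.exp(-0.5 * ((y_grid[None, :] - y[:, None]) / h_y) ** 2) / (
        np.sqrt(2 * np.pi) * h_y
    )                                                                # (n, grid)
    return w @ k_y

def neg_avg_log_density(theta, X, y, y_grid, h_y, h_z):
    """Minus the average log f_hat(y_i | theta' x_i).

    (A leave-one-out version of the weights would be more careful; this keeps
    the sketch short.)
    """
    theta = theta / np.linalg.norm(theta)          # keep theta on the unit sphere
    z = X @ theta
    dens = kernel_cond_density(y_grid, y, z, h_y, h_z)
    # evaluate each fitted conditional density at the observed y_i
    at_obs = np.array([np.interp(y[i], y_grid, dens[i]) for i in range(len(y))])
    return -np.mean(np.log(np.clip(at_obs, 1e-12, None)))

# Simulated example: Y depends on X only through a single index.
rng = np.random.default_rng(0)
n, d = 300, 3
X = rng.normal(size=(n, d))
theta_true = np.array([1.0, 2.0, 0.0]) / np.sqrt(5.0)
y = np.sin(X @ theta_true) + 0.3 * rng.normal(size=n)

y_grid = np.linspace(y.min() - 1.0, y.max() + 1.0, 200)
res = minimize(neg_avg_log_density, x0=np.ones(d) / np.sqrt(d),
               args=(X, y, y_grid, 0.3, 0.3), method="Nelder-Mead")
theta_hat = res.x / np.linalg.norm(res.x)
print("estimated index direction (up to sign):", theta_hat)
```

In this toy setting the recovered direction should be close to theta_true up to sign; bandwidth choice and the optimization over the sphere are handled only crudely here.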

This paper was published in LSE Research Online.
