
Approximating conditional density functions using dimension reduction

By Jian-qing Fan, Liang Peng, Qiwei Yao and Wenyang Zhang


We propose to approximate the conditional density function of a random variable Y given a dependent random d-vector X by that of Y given θᵀX, where the unit vector θ is selected so that the average Kullback-Leibler discrepancy between the two conditional density functions is minimized. Our approach is nonparametric as far as the estimation of the conditional density functions is concerned. We show that this nonparametric estimator is asymptotically adaptive to the unknown index θ, in the sense that its first-order asymptotic mean squared error is the same as when θ is known. The proposed method is illustrated with both simulated and real-data examples.
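The idea in the abstract can be sketched numerically: since minimizing the average Kullback-Leibler discrepancy between f(y | x) and f(y | θᵀx) amounts to maximizing the expected log conditional density of Y given the index θᵀX, one can estimate θ by maximizing a leave-one-out kernel-based log-likelihood over the unit sphere. The following is a minimal illustrative sketch, not the authors' estimator: the data-generating model, the bandwidths, and the optimizer are all assumptions made here for illustration.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical single-index data: Y depends on the d-vector X only
# through the projection theta_true' X (an assumption for this demo).
n, d = 400, 3
theta_true = np.array([1.0, 2.0, -1.0])
theta_true /= np.linalg.norm(theta_true)
X = rng.normal(size=(n, d))
Y = np.sin(X @ theta_true) + 0.3 * rng.normal(size=n)

def neg_loglik(theta, h=0.3, b=0.3):
    """Negative leave-one-out log-likelihood of a kernel estimator
    of the conditional density of Y given theta' X (bandwidths h, b
    are fixed here for simplicity)."""
    theta = theta / np.linalg.norm(theta)  # restrict to the unit sphere
    Z = X @ theta
    dz = (Z[:, None] - Z[None, :]) / h
    dy = (Y[:, None] - Y[None, :]) / b
    Kz = np.exp(-0.5 * dz**2)                               # kernel in the index
    Ky = np.exp(-0.5 * dy**2) / (b * np.sqrt(2 * np.pi))    # kernel in y
    np.fill_diagonal(Kz, 0.0)                               # leave-one-out
    fhat = (Kz * Ky).sum(axis=1) / Kz.sum(axis=1)           # f^(Y_i | Z_i)
    return -np.mean(np.log(np.clip(fhat, 1e-12, None)))

# Minimizing the negative log-likelihood approximates minimizing the
# average Kullback-Leibler discrepancy over unit vectors theta.
res = minimize(neg_loglik, x0=np.ones(d), method="Nelder-Mead")
theta_hat = res.x / np.linalg.norm(res.x)
if theta_hat @ theta_true < 0:   # theta is identified only up to sign
    theta_hat = -theta_hat
print("estimated index:", theta_hat)
```

In practice the bandwidths would be chosen data-dependently and the conditional density would then be re-estimated nonparametrically at the fitted index; this sketch only illustrates the index-selection step.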

Topics: QA Mathematics
Publisher: Springer
Year: 2009
DOI identifier: 10.1007/s10255-008-8815-1
Provided by: LSE Research Online