In this paper, we propose a new and unified approach for nonparametric
regression and conditional distribution learning. Our approach simultaneously
estimates a regression function and a conditional generator using a generative
learning framework, where a conditional generator is a function that can
generate samples from a conditional distribution. The main idea is to estimate
a conditional generator subject to the constraint that it yields a good
estimator of the regression function. We use deep neural networks to model the
conditional generator. Our approach can handle problems with multivariate
outcomes and covariates, and can be used to construct prediction intervals. We
provide theoretical guarantees by deriving non-asymptotic error bounds and
establishing the distributional consistency of our approach under suitable
assumptions. We also
perform numerical experiments with simulated and real data to demonstrate the
effectiveness of our approach and its advantages over several existing methods
in various scenarios.

Comment: 50 pages, including appendix. 5 figures and 6 tables in the main
text. 1 figure and 7 tables in the appendix.
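
To make the idea concrete, the following is a minimal, illustrative sketch of a generator-based estimator of this kind. It is not the algorithm or loss used in the paper: the energy-score distribution-matching term, the squared-error penalty on the conditional mean, the network sizes, and all function and class names are assumptions introduced here for illustration only.

```python
import torch
import torch.nn as nn

class CondGenerator(nn.Module):
    # Maps a covariate x and a noise draw eta to a sample from the
    # conditional distribution of Y given X = x.
    def __init__(self, x_dim, noise_dim, y_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(x_dim + noise_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, y_dim),
        )

    def forward(self, x, eta):
        return self.net(torch.cat([x, eta], dim=-1))

def regression_estimate(gen, x, noise_dim, n_mc=100):
    # Point prediction: average n_mc generator draws per covariate,
    # approximating E[Y | X = x].
    b = x.shape[0]
    x_rep = x.repeat_interleave(n_mc, dim=0)
    eta = torch.randn(b * n_mc, noise_dim)
    return gen(x_rep, eta).view(b, n_mc, -1).mean(dim=1)

def training_step(gen, opt, x, y, noise_dim, lam=1.0):
    # Illustrative objective (an assumption, not the paper's loss):
    # an energy-score term matches the conditional distribution, while a
    # squared-error term constrains the generator's conditional mean to
    # fit the observed responses; lam trades off the two terms.
    opt.zero_grad()
    eta1 = torch.randn(x.shape[0], noise_dim)
    eta2 = torch.randn(x.shape[0], noise_dim)
    g1, g2 = gen(x, eta1), gen(x, eta2)
    energy = 0.5 * ((g1 - y).norm(dim=-1) + (g2 - y).norm(dim=-1)).mean() \
             - 0.5 * (g1 - g2).norm(dim=-1).mean()
    mean_fit = ((0.5 * (g1 + g2) - y) ** 2).sum(dim=-1).mean()
    loss = energy + lam * mean_fit
    loss.backward()
    opt.step()
    return loss.item()

# Toy usage: univariate response with heteroscedastic noise.
gen = CondGenerator(x_dim=5, noise_dim=3, y_dim=1)
opt = torch.optim.Adam(gen.parameters(), lr=1e-3)
x = torch.randn(256, 5)
y = x[:, :1] ** 2 + 0.5 * (1 + x[:, :1].abs()) * torch.randn(256, 1)
for _ in range(500):
    training_step(gen, opt, x, y, noise_dim=3)
y_hat = regression_estimate(gen, x, noise_dim=3)  # point predictions
draws = gen(x.repeat_interleave(200, dim=0),
            torch.randn(256 * 200, 3)).view(256, 200, 1)
lo, hi = draws.quantile(0.05, dim=1), draws.quantile(0.95, dim=1)  # 90% interval
```

As the last two lines of the usage example suggest, prediction intervals of the kind mentioned in the abstract can be read off as empirical quantiles of the generator draws at each covariate value, and the same construction applies with multivariate outcomes and covariates.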