We consider the situation when there is a large number of series, N, each with T observations, and each series has some predictive ability for the variable of interest, y. A methodology of growing interest is to first estimate common factors from the panel of data by the method of principal components, and then augment an otherwise standard regression or forecasting equation with the estimated factors. In this paper, we show that the least squares estimates obtained from these factor-augmented regressions are √T consistent if √T/N → 0. The factor forecasts for the conditional mean are min[√T, √N] consistent, but the effect of "estimated regressors" is asymptotically negligible when T/N goes to zero. We present analytical formulas for prediction intervals that take into account the sampling variability of the factor estimates. These formulas are valid regardless of the magnitude of N/T, and can also be used when the factors are non-stationary. The generality of these results is made possible by a covariance matrix estimator that is robust to weak cross-section correlation and heteroskedasticity in the idiosyncratic errors. We provide a consistency proof for this CS-HAC estimator.

Keywords: Panel data, common factors, generated regressors, cross-section dependence, robust covariance matrix
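To make the two-step methodology concrete, here is a minimal Python sketch (not the paper's code): factors are estimated from a standardized T × N panel X by principal components, and y is then regressed by least squares on the estimated factors. The function names pc_factors and factor_augmented_ols, the simulated data, and the choice of r = 2 factors are illustrative assumptions rather than objects defined in the paper.

```python
# Illustrative sketch of factor-augmented regression via principal components.
import numpy as np

def pc_factors(X, r):
    """Estimate r factors from a T x N panel X by principal components,
    normalized so that F'F/T = I_r (a standard convention for large-N panels)."""
    T, N = X.shape
    # Eigen-decomposition of XX'/(T*N); eigenvectors are returned column-wise.
    eigvals, eigvecs = np.linalg.eigh(X @ X.T / (T * N))
    order = np.argsort(eigvals)[::-1][:r]      # indices of the r largest eigenvalues
    F = np.sqrt(T) * eigvecs[:, order]         # estimated factors, T x r
    Lam = X.T @ F / T                          # estimated loadings, N x r
    return F, Lam

def factor_augmented_ols(y, F, W=None):
    """OLS of y on estimated factors F (plus optional observed regressors W)."""
    Z = F if W is None else np.hstack([F, W])
    Z = np.hstack([np.ones((len(y), 1)), Z])   # add an intercept
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return beta, Z @ beta                      # coefficients and fitted conditional mean

# Purely simulated example: a factor panel and a y driven by the same factors.
rng = np.random.default_rng(0)
T, N, r = 200, 100, 2
F0 = rng.standard_normal((T, r))
Lam0 = rng.standard_normal((N, r))
X = F0 @ Lam0.T + rng.standard_normal((T, N))
y = F0 @ np.array([1.0, -0.5]) + 0.5 * rng.standard_normal(T)
F_hat, _ = pc_factors(X, r)
beta_hat, y_hat = factor_augmented_ols(y, F_hat)
```

The sketch stops at point estimates; the prediction intervals discussed in the paper would additionally require an estimate of the sampling variability of F_hat, e.g. via the CS-HAC covariance estimator, which is not reproduced here.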