
    The Tracy--Widom limit for the largest eigenvalues of singular complex Wishart matrices

    This paper extends the work of El Karoui [Ann. Probab. 35 (2007) 663--714], which finds the Tracy--Widom limit for the largest eigenvalue of a nonsingular $p$-dimensional complex Wishart matrix $W_{\mathbb{C}}(\Omega_p, n)$, to the case of several of the largest eigenvalues of the possibly singular ($n < p$) matrix $W_{\mathbb{C}}(\Omega_p, n)$. As a byproduct, we extend all results of Baik, Ben Arous and Péché [Ann. Probab. 33 (2005) 1643--1697] to the singular Wishart matrix case. We apply our findings to obtain a 95% confidence set for the number of common risk factors in excess stock returns. Published in the Annals of Applied Probability (http://www.imstat.org/aap/) by the Institute of Mathematical Statistics; DOI: http://dx.doi.org/10.1214/07-AAP454
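The singular case the abstract studies can be illustrated numerically. The sketch below (Python/NumPy; all variable names and the identity-covariance setup are illustrative choices, not taken from the paper) draws a standard complex Wishart matrix with n < p and verifies that at most n of its p eigenvalues are nonzero, which is why only the top eigenvalues carry information:

```python
import numpy as np

rng = np.random.default_rng(0)
p, n = 50, 20  # n < p: the singular Wishart case

# Complex Gaussian data matrix X (p x n); W = X X^* is a complex Wishart matrix
X = (rng.standard_normal((p, n)) + 1j * rng.standard_normal((p, n))) / np.sqrt(2)
W = X @ X.conj().T

eigvals = np.linalg.eigvalsh(W)  # real eigenvalues, ascending order
largest = eigvals[-1]

# W is singular: its rank, and hence the number of nonzero eigenvalues,
# is at most n
num_nonzero = int(np.sum(eigvals > 1e-10))
```

After suitable centering and scaling, the paper shows the fluctuations of the largest eigenvalues follow the Tracy--Widom law even in this rank-deficient regime.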

    Minimax Analysis of Monetary Policy Under Model Uncertainty

    Recently there have been several studies that examined monetary policy under model uncertainty. These studies formulated uncertainty in a number of different ways. One prominent way to formulate model uncertainty is to form a non-parametric set of perturbations around some nominal model, where the set is structured so that the uncertainty is focused on potentially important weaknesses of the model. Unfortunately, previous efforts were unable to compute exact optimal policy rules under this general formulation of uncertainty. Moreover, for those special cases when the robust rules were computed, the degree of their aggressiveness was often counterintuitive in light of the conventional Brainard/Bayesian wisdom that policy under uncertainty should be conservative. This paper, therefore, consists of three different exercises concerning minimax analysis of policy rules under model uncertainty. First, the minimax approach is compared with the Bayesian one in a stylized Brainard (1967) setting. Strong similarities between the recommendations of the two approaches are found. Next, a more realistic setting, as in Onatski and Stock (1999), is considered. A characterization of the worst possible models corresponding to the max part of the minimax scheme is given. It is shown that the worst possible models for very aggressive rules, such as the H-infinity rule, have realistic economic structure, whereas those for passive rules, such as the Fed's actual policy, do not. Thus, the results of minimax analysis presented in Onatski and Stock (1999) might be biased against the passive rules. Finally, exact optimal minimax policy rules are computed for the case of slowly time-varying uncertainty in Rudebusch and Svensson's (1998) model. The optimal rule under certainty turns out to be robust to moderate deviations from Rudebusch and Svensson's model.
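The first exercise, comparing Bayesian and minimax recommendations in a Brainard-style setting, can be sketched in a stylized one-dimensional model (a hypothetical illustration, not the paper's actual specification): the outcome is y = k*r + e with uncertain policy multiplier k, and the policymaker chooses the instrument r to offset the shock e:

```python
import numpy as np

kbar, e = 1.0, 1.0          # mean policy multiplier, shock to be offset
sigma, delta = 0.3, 0.3     # Bayesian std. dev. vs. minimax interval half-width

# Bayesian (Brainard) rule: minimize E[(k r + e)^2] with k ~ N(kbar, sigma^2);
# the closed form shows the familiar attenuation relative to certainty
r_bayes = -kbar * e / (kbar**2 + sigma**2)

# Minimax rule: minimize the worst-case loss over k in [kbar - delta, kbar + delta],
# computed here by brute-force grid search over candidate rules r
rs = np.linspace(-2.0, 0.0, 2001)
ks = np.linspace(kbar - delta, kbar + delta, 201)
worst = np.max((ks[None, :] * rs[:, None] + e) ** 2, axis=1)
r_minimax = rs[np.argmin(worst)]

# Certainty-equivalence benchmark
r_certainty = -e / kbar
```

In this toy setting the two robust recommendations come out close to each other (the Bayesian rule is attenuated, while the minimax rule here coincides with the certainty rule), echoing the "strong similarities" the abstract reports for the stylized Brainard comparison.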

    Dynamics of Interest Rate Curve by Functional Auto-regression

    The paper applies methods of functional data analysis – functional auto-regression, principal components and canonical correlations – to the study of the dynamics of the interest rate curve. In addition, it introduces a novel statistical tool based on the singular value decomposition of the functional cross-covariance operator. This tool is better suited for prediction purposes than either principal components or canonical correlations. Based on this tool, the paper provides a consistent method for estimating the functional auto-regression of the interest rate curve. The theory is applied to estimating the dynamics of Eurodollar futures rates. The results suggest that future movements of interest rates are predictable only at very short and very long horizons. Keywords: functional auto-regression, term structure dynamics, principal components, canonical correlations, singular value decomposition
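The SVD-based tool described here can be sketched on discretized curves (a minimal Python/NumPy illustration; the simulated data-generating process and all names are assumptions, not the paper's). The lag-1 cross-covariance operator becomes an m x m matrix, and its singular value decomposition yields the predictive directions:

```python
import numpy as np

rng = np.random.default_rng(1)
T, m = 200, 10  # T curve observations, each discretized at m maturities

# Simulate a simple curve-valued AR(1) as stand-in data
A = 0.5 * np.eye(m)
curves = np.zeros((T, m))
for t in range(1, T):
    curves[t] = curves[t - 1] @ A.T + 0.1 * rng.standard_normal(m)

X, Y = curves[:-1], curves[1:]          # regressor curves and next-period curves
Xc, Yc = X - X.mean(0), Y - Y.mean(0)   # center both samples

# Empirical lag-1 cross-covariance operator (an m x m matrix after discretization)
C1 = Yc.T @ Xc / (T - 1)

# SVD of the cross-covariance: right singular vectors give directions in the
# regressor space ordered by their covariance with the response
U, s, Vt = np.linalg.svd(C1)
```

Truncating this SVD at a small rank, rather than truncating the covariance of X alone (principal components) or normalizing both sides (canonical correlations), is the sense in which the tool targets prediction directly.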

    Dynamics of Interest Rate Curve by Functional Auto-Regression

    The paper uses functional auto-regression to predict the dynamics of the interest rate curve. It estimates the auto-regressive operator by extending methods of reduced-rank auto-regression to functional data. Such an estimation technique is better suited for prediction purposes than methods based on either principal components or canonical correlations. The consistency of the estimator is proved using methods of operator theory. The estimation method is used to analyze the dynamics of Eurodollar futures rates. The results suggest that future movements of interest rates are predictable at 1-year horizons. Keywords: functional data analysis, term structure, principal components, canonical correlations, singular value decomposition
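A reduced-rank estimate of the auto-regressive operator, in the spirit described here, can be sketched as follows (a hypothetical Python/NumPy toy on a discretized grid; the paper works in function space, so this only conveys the idea). The full least-squares estimate is computed first and then truncated to rank k via its SVD:

```python
import numpy as np

rng = np.random.default_rng(2)
T, m, k = 300, 8, 2  # sample size, grid size, assumed rank of the AR operator

# True operator has rank 2, so a rank-2 estimate should suffice
A_true = np.zeros((m, m))
A_true[:2, :2] = [[0.6, 0.1], [0.0, 0.5]]

curves = np.zeros((T, m))
for t in range(1, T):
    curves[t] = A_true @ curves[t - 1] + 0.1 * rng.standard_normal(m)

X, Y = curves[:-1].T, curves[1:].T  # columns are observations

# Full least-squares estimate of the AR operator, then rank-k SVD truncation
A_ls = Y @ X.T @ np.linalg.pinv(X @ X.T)
U, s, Vt = np.linalg.svd(A_ls)
A_rr = U[:, :k] @ np.diag(s[:k]) @ Vt[:k]
```

The truncation discards directions of the fitted operator with small singular values, which in the functional setting is what keeps the estimator consistent despite the infinite-dimensional regressor.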

    "Set Coverage and Robust Policy"

    We show that confidence regions covering the identified set may be preferable to confidence regions covering each of its points in robust control applications.

    Modeling model uncertainty

    Recently there has been much interest in studying monetary policy under model uncertainty. We develop methods to analyze different sources of uncertainty in one coherent structure useful for policy decisions. We show how to estimate the size of the uncertainty based on time series data, and incorporate this uncertainty in policy optimization. We propose two different approaches to modeling model uncertainty. The first is model error modeling, which imposes additional structure on the errors of an estimated model, and builds a statistical description of the uncertainty around a model. The second is set membership identification, which uses a deterministic approach to find a set of models consistent with data and prior assumptions. The center of this set becomes a benchmark model, and the radius measures model uncertainty. Using both approaches, we compute the robust monetary policy under different model uncertainty specifications in a small model of the US economy. JEL Classification: E52, C32, D81. Keywords: estimation, model uncertainty, monetary policy

    Modeling Model Uncertainty

    Recently there has been a great deal of interest in studying monetary policy under model uncertainty. We point out that different assumptions about the uncertainty may result in drastically different 'robust' policy recommendations. Therefore, we develop new methods to analyze uncertainty about the parameters of a model, the lag specification, the serial correlation of shocks, and the effects of real time data in one coherent structure. We consider both parametric and nonparametric specifications of this structure and use them to estimate the uncertainty in a small model of the US economy. We then use our estimates to compute robust Bayesian and minimax monetary policy rules, which are designed to perform well in the face of uncertainty. Our results suggest that the aggressiveness recently found in robust policy rules is likely to be caused by overemphasizing uncertainty about economic dynamics at low frequencies.