
    Performance of Empirical Risk Minimization for Linear Regression with Dependent Data

    This paper establishes bounds on the performance of empirical risk minimization for large-dimensional linear regression. We generalize existing results by allowing the data to be dependent and heavy-tailed. The analysis covers both identically and heterogeneously distributed observations, and it is nonparametric in the sense that the relationship between the regressand and the regressors is left unspecified. The main results show that the empirical risk minimizer achieves optimal performance (up to a logarithmic factor) in a dependent data setting.
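    The abstract's novelty lies in the dependence and heavy-tail assumptions, but the estimator itself is standard: for the linear class with squared loss, the empirical risk minimizer coincides with ordinary least squares. A minimal sketch (function names are illustrative, not from the paper):

    ```python
    import numpy as np

    def empirical_risk_minimizer(X, y):
        """Minimize the empirical risk (1/n) * sum_i (y_i - x_i'b)^2 over b.

        For the linear class with squared loss this is ordinary least
        squares, so np.linalg.lstsq returns the minimizer directly.
        """
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return beta

    def empirical_risk(X, y, beta):
        """Average squared loss of a candidate coefficient vector."""
        resid = y - X @ beta
        return float(np.mean(resid ** 2))
    ```

    By definition, any other coefficient vector attains an empirical risk at least as large as the minimizer's; the paper's contribution is bounding the gap between this empirical optimum and the population risk under dependence.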

    Evaluating the Accuracy of Tail Risk Forecasts for Systemic Risk Measurement

    In this paper we address how to evaluate tail risk forecasts for systemic risk measurement. We propose two loss functions, the Tail Tick Loss and the Tail Mean Square Error, to evaluate CoVaR and MES forecasts, respectively. We then analyse CoVaR and MES forecasts for a panel of top US financial institutions between 2000 and 2012, constructed using a set of bivariate DCC-GARCH-type models. The empirical results highlight the importance of using an appropriate loss function for the evaluation of such forecasts. Among other findings, the analysis confirms that the DCC-GJR specification provides accurate predictions for both CoVaR and MES, in particular for the riskiest group of institutions in the panel (Broker-Dealers).
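    The abstract does not define the Tail Tick Loss, so the following is only an assumed reading: the standard tick (pinball) loss for quantile forecasts, averaged over the tail events on which CoVaR conditions (days when the conditioning firm's return falls below its VaR). All function and argument names here are illustrative:

    ```python
    import numpy as np

    def tick_loss(y, q, alpha):
        """Standard tick (pinball) loss of a level-alpha quantile forecast q
        for realization y: (alpha - 1{y < q}) * (y - q)."""
        y, q = np.asarray(y, float), np.asarray(q, float)
        return (alpha - (y < q)) * (y - q)

    def tail_tick_loss(y_system, covar_forecast, y_firm, var_firm, alpha):
        """Average tick loss of a CoVaR forecast, computed only on distress
        days (firm return at or below its VaR) -- an assumed sketch of the
        paper's Tail Tick Loss, not its exact definition."""
        distress = np.asarray(y_firm) <= np.asarray(var_firm)
        if not distress.any():
            return float("nan")
        return float(np.mean(tick_loss(np.asarray(y_system)[distress],
                                       np.asarray(covar_forecast)[distress],
                                       alpha)))
    ```

    Restricting the average to distress days matters because CoVaR is a conditional quantile: scoring it on calm days would evaluate a forecast of an event that did not occur.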

    Non-Standard Errors

    In statistics, samples are drawn from a population in a data-generating process (DGP). Standard errors measure the uncertainty in estimates of population parameters. In science, evidence is generated to test hypotheses in an evidence-generating process (EGP). We claim that EGP variation across researchers adds uncertainty: non-standard errors (NSEs). We study NSEs by letting 164 teams test the same hypotheses on the same data. NSEs turn out to be sizable, but smaller for more reproducible or higher-rated research. Adding peer-review stages reduces NSEs. We further find that participants underestimate this type of uncertainty.

    Detecting granular time series in large panels

    Large economic and financial panels can include time series that influence the entire cross-section. We name such series granular. In this paper we introduce a panel data model that formalizes the notion of a granular time series. We then propose a methodology, inspired by the network literature in statistics and econometrics, to detect the set of granular series when that set is unknown. The influence of the i-th series in the panel is measured by the norm of the i-th column of the inverse covariance matrix. We show that a detection procedure based on these column norms consistently selects the granular series when the cross-section and time series dimensions are large. Importantly, the methodology also detects granular series consistently when the series in the panel are influenced by common factors. A simulation study shows that the proposed procedures perform satisfactorily in finite samples. Our empirical study shows the granular influence of the automobile sector in US industrial production.
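    The core of the detection device described above can be sketched in a few lines: estimate the covariance of the panel, invert it, and score each series by the Euclidean norm of its column of the precision matrix. This sketch omits the paper's thresholding rule and its adjustment for common factors, and uses a plain sample covariance (names are illustrative):

    ```python
    import numpy as np

    def granular_scores(panel):
        """Score each series in a (T x n) panel by the Euclidean norm of its
        column of the inverse covariance (precision) matrix; larger norms
        indicate more influential (granular) series.

        Minimal sketch: uses the plain sample covariance and no threshold,
        so it requires T comfortably larger than n.
        """
        S = np.cov(panel, rowvar=False)        # n x n sample covariance
        K = np.linalg.inv(S)                   # precision matrix
        return np.linalg.norm(K, axis=0)       # column norms
    ```

    The intuition is that a granular series is partially correlated with many others, so its precision-matrix column accumulates many non-zero entries and its norm stands out from the rest of the cross-section.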

    Nets: Network estimation for time series

    This work proposes novel network analysis techniques for multivariate time series. We define the network of a multivariate time series as a graph whose vertices denote the components of the process and whose edges denote non-zero long-run partial correlations. We then introduce a two-step lasso procedure, called NETS, to estimate high-dimensional sparse long-run partial correlation networks. The approach is based on a VAR approximation of the process and allows us to decompose the long-run linkages into the contributions of the dynamic and contemporaneous dependence relations of the system. We analyse the large sample properties of the estimator and establish conditions for consistent selection and estimation of the non-zero long-run partial correlations. The methodology is illustrated with an application to a panel of U.S. blue chips.
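    The VAR-based decomposition above can be illustrated with a simplified sketch that replaces the paper's two lasso steps with plain OLS (so it is not sparse and not NETS itself, just the same pipeline on a small panel): fit a VAR(1), form the long-run covariance (I - A)^{-1} Sigma_e (I - A)^{-T} from the dynamic part A and the contemporaneous part Sigma_e, and convert its inverse into partial correlations. Names are illustrative:

    ```python
    import numpy as np

    def long_run_partial_correlations(x):
        """Simplified sketch of the NETS pipeline with OLS instead of lasso.

        x : (T x n) array of observations.
        Returns the n x n matrix of long-run partial correlations implied
        by a VAR(1) approximation of the process.
        """
        T, n = x.shape
        Y, X = x[1:], x[:-1]
        A = np.linalg.lstsq(X, Y, rcond=None)[0].T    # VAR(1): x_t = A x_{t-1} + e_t
        E = Y - X @ A.T                               # residuals (contemporaneous part)
        Sigma_e = np.cov(E, rowvar=False)
        B = np.linalg.inv(np.eye(n) - A)
        Sigma_lr = B @ Sigma_e @ B.T                  # long-run covariance
        K = np.linalg.inv(Sigma_lr)                   # long-run precision
        d = np.sqrt(np.diag(K))
        P = -K / np.outer(d, d)                       # partial correlations
        np.fill_diagonal(P, 1.0)
        return P
    ```

    In the high-dimensional setting the paper targets, both the VAR coefficients and the contemporaneous precision matrix are estimated with lasso penalties, which is what makes the recovered network sparse and the edge selection consistent.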