
    Wavelet-based detection of outliers in volatility models

    Outliers in financial data can bias model parameter estimates, invalidate inferences and degrade volatility forecasts. Their detection and correction should therefore be taken seriously when modeling financial data. This paper focuses on these issues and proposes a general wavelet-based detection and correction method that can be applied to a large class of volatility models. The effectiveness of our proposal is tested in an intensive Monte Carlo study for six well-known volatility models and compared to alternative proposals in the literature, before applying it to three daily stock market indexes. The Monte Carlo experiments show that our method is very effective in detecting both isolated outliers and outlier patches, and much more reliable than other wavelet-based procedures, since it detects a significantly smaller number of false outliers.
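The core idea behind wavelet-based detection can be illustrated in a few lines: large detail (high-frequency) coefficients of a wavelet transform point at abrupt, localized departures from the surrounding series. The sketch below is a minimal numpy-only illustration of that mechanism using a level-1 Haar transform and a robust MAD threshold; it is not the paper's actual procedure, which works within a fitted volatility model and also corrects the flagged observations. The function name and threshold constant `k` are illustrative choices.

```python
import numpy as np

def haar_outlier_flags(x, k=4.0):
    """Flag points whose level-1 Haar detail coefficient is unusually large.
    A minimal sketch of wavelet-based outlier detection only -- no volatility
    model, no iterative correction step as in the paper."""
    x = np.asarray(x, dtype=float)
    n = len(x) - len(x) % 2                       # truncate to even length
    d = (x[0:n:2] - x[1:n:2]) / np.sqrt(2.0)      # level-1 Haar detail coeffs
    sigma = np.median(np.abs(d)) / 0.6745         # robust scale via MAD
    big = np.abs(d) > k * sigma                   # unusually large details
    flags = np.zeros(len(x), dtype=bool)
    idx = np.nonzero(big)[0]
    flags[2 * idx] = True                         # each detail coefficient
    flags[2 * idx + 1] = True                     # covers a pair of points
    return flags

rng = np.random.default_rng(0)
r = rng.normal(0, 1, 512)
r[100] = 12.0                                     # inject an isolated outlier
flags = haar_outlier_flags(r)
print(flags[100])
```

Because the Haar detail coefficient is a scaled local difference, a single spike produces one dominant coefficient while genuine volatility clustering spreads energy across scales; thresholding the details against a robust scale estimate is what keeps the false-detection count low.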

    Robust and Sparse Regression via γ-divergence

    In high-dimensional data analysis, many sparse regression methods have been proposed. However, they may not be robust against outliers. Recently, the use of the density power weight has been studied for robust parameter estimation, and the corresponding divergences have been discussed. One such divergence is the γ-divergence, and the robust estimator based on it is known for its strong robustness. In this paper, we consider robust and sparse regression based on the γ-divergence. We extend the γ-divergence to the regression problem and show that it retains strong robustness under heavy contamination even when the outliers are heterogeneous. The loss function is constructed from an empirical estimate of the γ-divergence with sparse regularization, and the parameter estimate is defined as the minimizer of this loss. To obtain the robust and sparse estimate, we propose an efficient update algorithm with a monotone decreasing property of the loss function. In particular, we discuss a linear regression problem with L1 regularization in detail. In numerical experiments and real data analyses, the proposed method outperforms previous robust and sparse methods. Comment: 25 pages
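The robustness mechanism of the γ-divergence under normal errors can be seen directly: each observation's influence is scaled by its likelihood raised to the power γ, so gross outliers receive exponentially small weight. The snippet below sketches only that weighting step, not the authors' full sparse estimation algorithm; the function name and the choice γ = 0.5 are illustrative assumptions.

```python
import numpy as np

def gamma_weights(residuals, sigma, gamma=0.5):
    """Per-observation weights implied by the gamma-divergence under normal
    errors: w_i proportional to the normal likelihood to the power gamma,
    i.e. exp(-gamma * r_i^2 / (2 sigma^2)).  A sketch of the downweighting
    mechanism only -- the paper's method also adds sparse regularization."""
    r = np.asarray(residuals, dtype=float)
    w = np.exp(-gamma * r**2 / (2.0 * sigma**2))
    return w / w.sum()

res = np.array([0.1, -0.3, 0.2, 8.0])   # the last residual is a gross outlier
w = gamma_weights(res, sigma=1.0)
print(w.round(4))                        # the outlier's weight is near zero
```

Because the weight decays exponentially in the squared residual, even a single wildly contaminated point contributes essentially nothing to the estimating equations, which is the sense in which the estimator stays stable under heavy contamination.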

    Log-Regularly Varying Scale Mixture of Normals for Robust Regression

    Linear regression under the classical normality assumption for the error distribution may lead to undesirable posterior inference on the regression coefficients due to potential outliers. This paper considers a finite mixture of two components, one thin-tailed and one heavy-tailed, as the error distribution, a form routinely employed in applied statistics. For the heavy-tailed component, we introduce a novel class of distributions whose densities are log-regularly varying and have heavier tails than the Cauchy distribution, yet are expressed as scale mixtures of normal distributions and thus enable efficient posterior inference via a Gibbs sampler. We prove the robustness to outliers of the posterior distributions under the proposed models with a minimal set of assumptions, which justifies the use of shrinkage priors with unbounded densities for the coefficient vector in the presence of outliers. An extensive comparison with existing methods via simulation shows the improved performance of our model in point and interval estimation, as well as its computational efficiency. Further, we confirm the posterior robustness of our method in an empirical study with shrinkage priors for the regression coefficients. Comment: 62 pages
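The computational appeal of a scale-mixture-of-normals error distribution is that it yields a simple Gibbs sampler: conditional on per-observation latent scales, the regression coefficient update is conjugate. The sketch below uses the standard t/Cauchy scale mixture (ν = 1) on a toy one-covariate problem to show the augmentation pattern; the paper's novel error class is heavier-tailed still but admits the same kind of scheme. All names, priors, and hyperparameters here are illustrative assumptions, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: y = 2*x + noise, with two grossly contaminated observations.
n = 50
x = rng.normal(size=n)
y = 2.0 * x + rng.normal(scale=0.5, size=n)
y[:2] += 15.0

# Gibbs sampler for y_i ~ N(b*x_i, 1/lam_i), lam_i ~ Gamma(nu/2, nu/2):
# the classical scale-mixture representation of t errors (nu = 1 is Cauchy).
nu, b = 1.0, 0.0
draws = []
for it in range(2000):
    r = y - b * x
    lam = rng.gamma((nu + 1) / 2.0, 2.0 / (nu + r**2))  # latent scales | b
    prec = np.sum(lam * x**2) + 1e-2                    # vague N(0, 100) prior
    mean = np.sum(lam * x * y) / prec
    b = rng.normal(mean, 1.0 / np.sqrt(prec))           # b | latent scales
    if it >= 500:
        draws.append(b)

print(round(float(np.mean(draws)), 1))   # posterior mean near the true slope 2
```

The outlying observations get tiny sampled scales `lam`, so they barely enter the weighted sums that drive the coefficient update; that is the mechanical face of the posterior robustness the paper proves for its heavier-tailed class.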

    Robust graphical modeling of gene networks using classical and alternative T-distributions

    Graphical Gaussian models have proven to be useful tools for exploring network structures based on multivariate data. Applications to studies of gene expression have generated substantial interest in these models, and recent progress includes the development of fitting methodology involving penalization of the likelihood function. In this paper we advocate the use of multivariate t-distributions for more robust inference of graphs. In particular, we demonstrate that penalized likelihood inference combined with an application of the EM algorithm provides a computationally efficient approach to model selection in the t-distribution case. We consider two versions of the multivariate t-distribution, one of which requires the use of approximation techniques. For this distribution, we describe a Markov chain Monte Carlo EM algorithm based on a Gibbs sampler, as well as a simple variational approximation that makes the resulting method feasible in large problems. Comment: Published at http://dx.doi.org/10.1214/10-AOAS410 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org)
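For the classical multivariate t, the EM iteration is explicit: the E-step computes a weight (ν + p)/(ν + δ_i) from each point's Mahalanobis distance δ_i, and the M-step recomputes a weighted mean and scatter. The sketch below implements that unpenalized core with numpy; the paper's method additionally penalizes the inverse scatter (an l1/graphical-lasso-type step) to select graph edges, which is omitted here. The function name and ν = 4 are illustrative assumptions.

```python
import numpy as np

def t_em_cov(X, nu=4.0, iters=50):
    """EM for the mean and scatter of a multivariate t-distribution -- the
    unpenalized core of the robust graphical-model fit described above.
    The paper adds a likelihood penalty on the inverse scatter for edge
    selection; that step is omitted in this sketch."""
    n, p = X.shape
    mu = X.mean(axis=0)
    S = np.cov(X, rowvar=False)
    for _ in range(iters):
        d = X - mu
        delta = np.einsum('ij,jk,ik->i', d, np.linalg.inv(S), d)  # Mahalanobis
        u = (nu + p) / (nu + delta)          # E-step: outliers get low weight
        mu = (u[:, None] * X).sum(axis=0) / u.sum()
        d = X - mu
        S = (u[:, None] * d).T @ d / n       # M-step: weighted scatter
    return mu, S

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 3))
X[:5] += 25.0                                # a patch of gross outliers
mu, S = t_em_cov(X)
print(np.abs(mu).max() < 0.5)                # estimates stay near N(0, I) truth
```

Each EM pass costs one matrix inverse plus weighted moments, which is why the t-based fit remains computationally comparable to the Gaussian one while being far less sensitive to contaminated samples.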

    On Business Cycle Asymmetries in G7 Countries

    We investigate whether business cycle dynamics in seven industrialized countries (the G7) are characterized by asymmetries in the conditional mean. We provide evidence on this issue using a variety of time series models. Our approach is fully parametric, and our testing strategy is robust to any conditional heteroskedasticity, outliers, and/or long memory that may be present. Our results indicate fairly strong evidence of nonlinearities in the conditional mean dynamics of the GDP growth rates for Canada, Germany, Italy, Japan, and the US. For France and the UK, the conditional mean dynamics appear to be largely linear. Our study shows that while the presence of conditional heteroskedasticity and long memory does not have much effect on testing for linearity in the conditional mean, accounting for outliers does reduce the evidence against linearity.
    Keywords: business cycles, asymmetries, nonlinearities, conditional heteroskedasticity, long memory, outliers, real GDP, stable distributions
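A common parametric way to test for nonlinearity in the conditional mean is a Teräsvirta-style auxiliary regression: fit a linear AR(p), then test whether squares and cubes of the lags explain the residuals via an F statistic. The sketch below shows that generic approach on simulated linear and nonlinear AR(1) series; it is only an illustration of the idea, since the paper's tests are additionally robustified to heteroskedasticity, outliers and long memory. The function name and simulation design are assumptions.

```python
import numpy as np

def taylor_linearity_test(y, p=1):
    """F statistic for a Taylor-expansion (Terasvirta-style) linearity check:
    fit AR(p), then ask whether squared and cubed lags explain the residuals.
    Illustrative only -- not the paper's robustified testing strategy."""
    y = np.asarray(y, dtype=float)
    Y = y[p:]
    lags = np.column_stack([y[p - j - 1: len(y) - j - 1] for j in range(p)])
    X1 = np.column_stack([np.ones(len(Y)), lags])
    e = Y - X1 @ np.linalg.lstsq(X1, Y, rcond=None)[0]        # AR residuals
    X2 = np.column_stack([X1, lags**2, lags**3])              # added terms
    u = e - X2 @ np.linalg.lstsq(X2, e, rcond=None)[0]
    ssr0, ssr1 = e @ e, u @ u
    q = 2 * p                                                  # restrictions
    return ((ssr0 - ssr1) / q) / (ssr1 / (len(Y) - X2.shape[1]))

rng = np.random.default_rng(3)
e = rng.normal(size=500)
lin, nl = np.zeros(500), np.zeros(500)
for t in range(1, 500):
    lin[t] = 0.5 * lin[t - 1] + e[t]             # linear AR(1)
    nl[t] = 1.5 * np.tanh(nl[t - 1]) + e[t]      # smoothly nonlinear AR(1)
F_lin, F_nl = taylor_linearity_test(lin), taylor_linearity_test(nl)
print(F_nl > F_lin)                              # nonlinear series: larger F
```

The paper's finding that outlier correction weakens the evidence against linearity is easy to rationalize in this framework: a few extreme points can inflate the auxiliary-regression fit of the polynomial terms and hence the test statistic.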