
    On Geometric Ergodicity of Skewed-SVCHARME models

    Markov chain Monte Carlo is routinely used as a convenient way to analyze the properties of intractable distributions. In this paper we derive conditions for geometric ergodicity of a general class of nonparametric stochastic volatility models with skewness driven by a hidden Markov chain with switching
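    To make the switching mechanism concrete, below is a minimal simulation sketch of a stochastic volatility model whose log-volatility dynamics switch with a two-state hidden Markov chain. All names and parameter values (the transition matrix P, the per-regime persistences phi, the innovation scales sigma_eta) are illustrative assumptions, not quantities from the paper; geometric ergodicity results of this kind typically hinge on conditions such as the persistence parameters lying inside the unit interval.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sketch (not the paper's model): returns y_t with stochastic
# log-volatility h_t whose AR(1) dynamics switch with a hidden Markov chain.
T = 1000
P = np.array([[0.95, 0.05],          # regime transition probabilities
              [0.10, 0.90]])
phi = np.array([0.90, 0.98])         # per-regime AR(1) persistence (|phi| < 1)
sigma_eta = np.array([0.30, 0.15])   # per-regime volatility-of-volatility

s = np.zeros(T, dtype=int)           # hidden regime path
h = np.zeros(T)                      # log-volatility
y = np.zeros(T)                      # observed returns
for t in range(1, T):
    s[t] = rng.choice(2, p=P[s[t - 1]])
    h[t] = phi[s[t]] * h[t - 1] + sigma_eta[s[t]] * rng.standard_normal()
    y[t] = np.exp(h[t] / 2.0) * rng.standard_normal()
```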

    Dynamics and sparsity in latent threshold factor models: A study in multivariate EEG signal processing

    We discuss Bayesian analysis of multivariate time series with dynamic factor models that exploit time-adaptive sparsity in model parametrizations via the latent threshold approach. One central focus is on the transfer responses of multiple interrelated series to underlying, dynamic latent factor processes. Structured priors on model hyper-parameters are key to the efficacy of dynamic latent thresholding, and MCMC-based computation enables model fitting and analysis. A detailed case study of electroencephalographic (EEG) data from experimental psychiatry highlights the use of latent threshold extensions of time-varying vector autoregressive and factor models. This study explores a class of dynamic transfer response factor models, extending prior Bayesian modeling of multiple EEG series and highlighting the practical utility of the latent thresholding concept in multivariate, non-stationary time series analysis.
    Comment: 27 pages, 13 figures; link to external web site for supplementary animated figure
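    The latent threshold idea is easy to state in code: a time-varying coefficient follows its own dynamic process, but its effective value is set exactly to zero whenever its magnitude falls below a threshold, which induces the time-adaptive sparsity described above. A minimal sketch follows; the AR(1) dynamics and all parameter values are illustrative assumptions, not estimates from the EEG study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Minimal sketch of latent thresholding: beta_t evolves as an AR(1), but the
# coefficient that enters the model is zeroed out whenever |beta_t| < d.
T, mu, phi, sigma, d = 500, 0.4, 0.98, 0.05, 0.3   # illustrative values

beta = np.zeros(T)
beta[0] = mu
for t in range(1, T):
    beta[t] = mu + phi * (beta[t - 1] - mu) + sigma * rng.standard_normal()

b_eff = np.where(np.abs(beta) >= d, beta, 0.0)      # time-adaptive sparsity
print(f"fraction of time thresholded to zero: {np.mean(b_eff == 0.0):.2f}")
```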

    Bayesian Hypothesis Testing in Latent Variable Models

    Hypothesis testing using Bayes factors (BFs) is known not to be well defined under improper priors. In the context of latent variable models, an additional problem with BFs is that they are difficult to compute. In this paper, a new Bayesian method, based on decision theory and the EM algorithm, is introduced to test a point hypothesis in latent variable models. The new statistic is a by-product of Bayesian MCMC output and, hence, easy to compute. It is shown that the new statistic is easy to interpret and appropriately defined under improper priors because the method employs a continuous loss function. The method is illustrated using a one-factor asset pricing model and a stochastic volatility model with jumps.
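    As a rough illustration of a test statistic that comes as a by-product of MCMC output (a generic stand-in, not the paper's loss-based statistic), one can average a log-likelihood ratio against the point null over posterior draws. The conjugate-normal model, the prior, and all values below are synthetic assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Synthetic setup: y_i ~ N(theta, 1) with a N(0, 1) prior on theta,
# testing the point null H0: theta = 0.
y = rng.normal(loc=0.3, scale=1.0, size=200)
theta0 = 0.0

# Stand-ins for MCMC draws: here the posterior is available in closed form.
n = len(y)
draws = rng.normal(y.sum() / (n + 1), np.sqrt(1.0 / (n + 1)), size=5000)

def loglik(theta):
    return stats.norm.logpdf(y, loc=theta, scale=1.0).sum()

# Posterior-averaged log-likelihood ratio against the null: unlike a Bayes
# factor, it involves no marginal likelihood, so it remains usable when the
# posterior is proper even though the prior is improper.
T_stat = np.mean([loglik(th) - loglik(theta0) for th in draws])
print(f"posterior expected log-likelihood ratio: {T_stat:.2f}")
```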

    Semi-parametric estimation of joint large movements of risky assets

    The classical approach to modelling the occurrence of joint large movements of asset returns is to assume multivariate normality for the distribution of asset returns. This implies independence between large returns. However, it is now recognised by both academics and practitioners that large movements of asset returns do not occur independently. This encourages modelling joint large movements of asset returns as non-normal, a non-trivial task mainly due to the natural scarcity of such extreme events. This paper shows how to estimate the probability of joint large movements of asset prices using a semi-parametric approach borrowed from extreme value theory (EVT). It helps to understand the contribution of individual assets to large portfolio losses in terms of joint large movements. The advantages of this approach are that it does not require the assumption of a specific parametric form for the dependence structure of the joint large movements, avoiding model misspecification; it specifically addresses the scarcity of data, which is a problem for the reliable fitting of fully parametric models; and it is applicable to portfolios of many assets: there is no dimension explosion. The paper includes an empirical analysis of international equity data showing how to implement semi-parametric EVT modelling and how to exploit its strengths to help understand the probability of joint large movements. We estimate the probability of joint large losses in a portfolio composed of the FTSE 100, Nikkei 225 and S&P 500 indices. Each of the index returns is found to be heavy tailed. The S&P 500 index has a much stronger effect on large portfolio losses than the FTSE 100, despite having similar univariate tail heaviness.
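    Two of the semi-parametric ingredients can be sketched directly: a Hill estimator for the univariate tail index of each loss series, and an empirical estimate of a joint exceedance probability. This is a minimal sketch assuming positive loss series (e.g. negated returns); the function names and the choice of k upper order statistics are illustrative, not the paper's implementation.

```python
import numpy as np

def hill_tail_index(losses, k):
    """Hill estimator of the tail index alpha from the k largest losses."""
    x = np.sort(losses)[::-1]                 # descending order statistics
    gamma = np.mean(np.log(x[:k] / x[k]))     # Hill estimate of 1 / alpha
    return 1.0 / gamma

def joint_exceedance_prob(x, y, qx=0.99, qy=0.99):
    """Empirical probability that both loss series exceed their quantiles."""
    return np.mean((x > np.quantile(x, qx)) & (y > np.quantile(y, qy)))

# Check on synthetic heavy-tailed data: Pareto losses with true alpha = 3.
rng = np.random.default_rng(3)
losses = rng.pareto(3.0, 50_000) + 1.0
print(f"estimated tail index: {hill_tail_index(losses, k=500):.2f}")
```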

    Forecasting Value-at-Risk Using Block Structure Multivariate Stochastic Volatility Models

    Most multivariate variance or volatility models suffer from a common problem, the “curse of dimensionality”. For this reason, most are fitted under strong parametric restrictions that reduce the interpretation and flexibility of the models. Recently, the literature has focused on multivariate models with milder restrictions, whose purpose is to balance the interpretability and efficiency needed by model users against the computational problems that emerge when the number of assets is quite large. We contribute to this strand of the literature by proposing a block-type parameterization for multivariate stochastic volatility models. An empirical analysis of stock returns in the US market shows that 1% and 5% Value-at-Risk thresholds based on one-step-ahead covariance forecasts from the new specification are satisfactory over a period that includes the global financial crisis.
    Keywords: block structures; multivariate stochastic volatility; curse of dimensionality; leverage effects; multi-factors; heavy-tailed distribution
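    Whatever model supplies the one-step-ahead covariance forecast, the VaR threshold itself follows mechanically from it. Below is a minimal sketch assuming Gaussian returns and a hypothetical forecast matrix Sigma; in the paper's setting the block-structured MSV forecast would take its place.

```python
import numpy as np
from scipy import stats

def portfolio_var(Sigma, w, alpha):
    """One-step-ahead Value-at-Risk (reported as a positive loss number)."""
    sigma_p = np.sqrt(w @ Sigma @ w)        # portfolio volatility forecast
    return -stats.norm.ppf(alpha) * sigma_p

Sigma = np.array([[0.04, 0.01],             # hypothetical covariance forecast
                  [0.01, 0.09]])
w = np.array([0.6, 0.4])                    # portfolio weights
print(f"1% VaR: {portfolio_var(Sigma, w, 0.01):.4f}")   # z approx 2.33
print(f"5% VaR: {portfolio_var(Sigma, w, 0.05):.4f}")   # z approx 1.64
```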

    Volatility forecasting

    Volatility has been one of the most active and successful areas of research in time series econometrics and economic forecasting in recent decades. This chapter provides a selective survey of the most important theoretical developments and empirical insights to emerge from this burgeoning literature, with a distinct focus on forecasting applications. Volatility is inherently latent, and Section 1 begins with a brief intuitive account of various key volatility concepts. Section 2 then discusses a series of different economic situations in which volatility plays a crucial role, ranging from the use of volatility forecasts in portfolio allocation to density forecasting in risk management. Sections 3, 4 and 5 present a variety of alternative procedures for univariate volatility modeling and forecasting based on the GARCH, stochastic volatility and realized volatility paradigms, respectively. Section 6 extends the discussion to the multivariate problem of forecasting conditional covariances and correlations, and Section 7 discusses volatility forecast evaluation methods in both univariate and multivariate cases. Section 8 concludes briefly.
    JEL classification: C10, C53, G1
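    As a small concrete anchor for the GARCH paradigm surveyed in Section 3, the sketch below implements the standard GARCH(1,1) multi-step variance forecast recursion, which reverts geometrically to the unconditional variance omega / (1 - alpha - beta); the parameter values are illustrative assumptions, not estimates.

```python
import numpy as np

def garch11_forecast(omega, alpha, beta, sigma2_next, horizon):
    """h-step-ahead conditional variance forecasts for a GARCH(1,1)."""
    uncond = omega / (1.0 - alpha - beta)           # unconditional variance
    steps = np.arange(horizon)
    return uncond + (alpha + beta) ** steps * (sigma2_next - uncond)

# Starting above the long-run level, forecasts decay toward 1.0 here.
fc = garch11_forecast(omega=0.02, alpha=0.08, beta=0.90,
                      sigma2_next=1.5, horizon=10)
print(np.round(fc, 3))
```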

    The Ten Commandments for Optimizing Value-at-Risk and Daily Capital Charges

    Credit risk is the most important type of risk in terms of monetary value. Another key type of risk is market risk, which is concerned with stocks and bonds, and related financial derivatives, as well as exchange rates and interest rates. This paper is concerned with market risk management and monitoring under the Basel II Accord, and presents Ten Commandments for optimizing Value-at-Risk (VaR) and daily capital charges, based on choosing wisely from: (1) conditional, stochastic and realized volatility; (2) symmetry, asymmetry and leverage; (3) dynamic correlations and dynamic covariances; (4) single index and portfolio models; (5) parametric, semiparametric and nonparametric models; (6) estimation, simulation and calibration of parameters; (7) assumptions, regularity conditions and statistical properties; (8) accuracy in calculating moments and forecasts; (9) optimizing threshold violations and economic benefits; and (10) optimizing private and public benefits of risk management. For practical purposes, it is found that the Basel II Accord would seem to encourage excessive risk taking at the expense of providing accurate measures and forecasts of risk and VaR.
    Keywords: daily capital charges; excessive risk taking; market risk; risk management; Value-at-Risk; violations
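    The daily capital charge being optimized can be sketched from the Basel II market-risk rule: capital is the maximum of the latest VaR and a multiplier times the 60-day average VaR, where the multiplier is 3 plus a penalty that grows with the number of violations over the past 250 trading days. The penalty table below follows the usual "traffic light" increments but is an assumption of this sketch, not a quotation from the Accord.

```python
import numpy as np

def basel_penalty(violations):
    """Penalty added to the base multiplier of 3 (assumed traffic-light table)."""
    table = {5: 0.40, 6: 0.50, 7: 0.65, 8: 0.75, 9: 0.85}
    if violations <= 4:
        return 0.0                        # green zone: no penalty
    return table.get(violations, 1.00)    # red zone at 10+ violations

def daily_capital_charge(var_history, violations):
    """Max of the latest VaR and the multiplier times the 60-day average."""
    k = 3.0 + basel_penalty(violations)
    return max(var_history[-1], k * np.mean(var_history[-60:]))

var_history = np.full(60, 0.025)          # hypothetical flat 2.5% daily VaR
print(f"charge, 0 violations: {daily_capital_charge(var_history, 0):.4f}")
print(f"charge, 6 violations: {daily_capital_charge(var_history, 6):.4f}")
```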