
    Low-frequency components and the Weekend effect revisited: Evidence from Spectral Analysis

    We revisit the well-known weekend anomaly (Gibbons and Hess, 1981; Harris, 1986; Smirlock and Starks, 1986; Connolly, 1989; Giovanis, 2010) using an established macroeconometric technique known as spectral analysis (Granger, 1964; Sargent, 1987). Our findings show that, alongside regression analysis with dichotomous variables, spectral analysis helps establish the robustness of the estimated parameters, based on a sample of the S&P500 for the 1972-1973 period. As further evidence of cycles in financial time series, we relate our application of spectral analysis to the recent literature on low-frequency components in asset returns (Barberis et al., 2001; Grüne and Semmler, 2008; Semmler et al., 2009). We suggest that investment practitioners consider using spectral analysis to establish the ‘stylized facts’ of the financial time series under scrutiny and to validate regression models.
    Keywords: Spectral analysis; Weekend anomaly; Financial cycles; Low-frequency components; Asset returns.
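The core of the spectral approach can be illustrated with a short, self-contained sketch: compute the periodogram of a daily return series and look for a power spike at the weekly frequency of 0.2 cycles per trading day. The data below are simulated (white noise plus a small 5-day cycle); the paper itself works with S&P500 returns.

```python
# A minimal sketch of the spectral-analysis idea: estimate the periodogram of
# simulated daily returns and check for power at the weekly (5-day) cycle.
# All data here are synthetic, purely for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 500  # trading days
t = np.arange(n)
# white noise plus a small 5-day (weekend-effect-like) cycle
returns = rng.normal(0, 1e-2, n) + 5e-3 * np.cos(2 * np.pi * t / 5)

# periodogram via the FFT
freqs = np.fft.rfftfreq(n, d=1.0)                              # cycles per day
power = np.abs(np.fft.rfft(returns - returns.mean())) ** 2 / n

peak_freq = freqs[np.argmax(power[1:]) + 1]                    # skip the zero frequency
print(f"dominant frequency: {peak_freq:.3f} cycles/day (weekly = 0.200)")
```

A periodogram peak at 0.2 cycles/day is the spectral signature of a weekly (day-of-the-week) regularity that dummy-variable regressions try to capture in the time domain.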

    Forecasting Stochastic Volatility Using the Kalman Filter: An Application to Canadian Interest Rates and Price-Earnings Ratio

    In this paper, we aim to forecast the stochastic volatility of key financial market variables with the Kalman filter, using stochastic models developed by Taylor (1986, 1994) and Nelson (1990). First, we compare a stochastic volatility model relying on the Kalman filter to the conditional volatility estimated with the GARCH model. We apply our models to Canadian short-term interest rates. When comparing the profile of the interest rate stochastic volatility to the conditional one, we find that the omission of a constant term in the stochastic volatility model might have a perverse effect leading to a scaling problem, a problem often overlooked in the literature. Stochastic volatility seems to be a better forecasting tool than GARCH(1,1) since it is less conditioned by autoregressive past information. Second, we filter the S&P500 price-earnings (P/E) ratio in order to forecast its value. To make this forecast, we postulate a rational expectations process, but our method may accommodate other data generating processes. We find that our forecast is close to a GARCH(1,1) profile.
    Keywords: Stochastic volatility; Kalman filter; P/E ratio forecast; Interest rate forecast.
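The filtering step can be sketched for a Taylor-type stochastic volatility model. The linearized observation equation log r_t² = h_t + log ε_t² (with E[log ε_t²] ≈ -1.27 and variance π²/2) is the standard quasi-likelihood device; the AR(1) parameters and data below are illustrative, not the paper's estimates. Note the constant term ω in the state equation, whose omission causes the scaling problem the abstract warns about.

```python
# A minimal sketch of filtering log-volatility with the Kalman filter, in the
# spirit of the Taylor stochastic-volatility model. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)
T, omega, phi, q = 1000, -0.2, 0.95, 0.05      # state: h_t = omega + phi*h_{t-1} + eta_t
h = np.zeros(T)
for t in range(1, T):
    h[t] = omega + phi * h[t - 1] + rng.normal(0, np.sqrt(q))
r = np.exp(h / 2) * rng.normal(0, 1, T)        # simulated returns
y = np.log(r**2 + 1e-12) + 1.27                # observation, mean-adjusted
R = np.pi**2 / 2                               # observation noise variance

# scalar Kalman filter for the latent log-volatility h_t
h_f, P = 0.0, 1.0
filtered = np.empty(T)
for t in range(T):
    h_pred = omega + phi * h_f                 # predict
    P_pred = phi**2 * P + q
    K = P_pred / (P_pred + R)                  # Kalman gain
    h_f = h_pred + K * (y[t] - h_pred)         # update
    P = (1 - K) * P_pred
    filtered[t] = h_f

corr = np.corrcoef(filtered, h)[0, 1]
print(f"correlation between filtered and true log-volatility: {corr:.2f}")
```

Even with the very noisy log-squared-return observation, the persistence of the state lets the filter recover a usable volatility path.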

    Optimal Instrumental Variables Generators Based on Improved Hausman Regression, with an Application to Hedge Funds Returns

    This paper proposes new Hausman-based estimators relying on optimal instruments built from cumulants. Using these newly generated strong instruments in a GMM setting, we obtain new GMM estimators, which we call the GMM-C and its counterpart, the GMM-hm. This procedure improves the method of moments for identifying the parameters of a model. Our study also yields a new indicator signalling the presence of specification errors in financial models. We apply our battery of tests and estimators to a sample of 22 HFR hedge fund indices observed monthly over the period 1990-2005. Our tests reveal that specification errors corrupt the parameter estimates of financial models of returns. It is therefore not surprising that the ranking of hedge funds is very sensitive to the choice of estimators. Our new indicator proves very powerful in detecting such errors.
    Keywords: Asset pricing models; Specification errors; Hausman test; GMM; Optimal instruments.
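The Hausman logic underlying such estimators can be sketched with an artificial (control-function) regression: regress the dependent variable on the observed regressor plus the first-stage residual from an instrument. Algebraically, the coefficient on the regressor then equals the IV estimate, while a nonzero residual coefficient signals errors in the variables. Everything below is simulated for illustration and is not the paper's GMM-C/GMM-hm implementation.

```python
# A minimal sketch of the Hausman artificial regression: a significant
# coefficient on the first-stage residual flags errors in the variables.
# Simulated data; the instrument z and all coefficients are illustrative.
import numpy as np

rng = np.random.default_rng(2)
n, beta = 5000, 1.5
x_true = rng.normal(0, 1, n)
z = x_true + rng.normal(0, 0.5, n)       # valid instrument (correlated with x_true)
x_obs = x_true + rng.normal(0, 1, n)     # regressor observed with error
y = beta * x_true + rng.normal(0, 1, n)

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

# first stage: project the observed regressor on the instrument
X1 = np.column_stack([np.ones(n), z])
resid = x_obs - X1 @ ols(X1, x_obs)

# augmented (artificial) regression: y on [1, x_obs, first-stage residual]
Xa = np.column_stack([np.ones(n), x_obs, resid])
b = ols(Xa, y)
print(f"beta (IV, bias-corrected) = {b[1]:.2f}, residual coef = {b[2]:.2f}")
```

Here the coefficient on `x_obs` recovers the true beta of 1.5 despite the measurement error, and the clearly nonzero residual coefficient is the Hausman-style specification-error signal.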

    Risk Procyclicality and Dynamic Hedge Fund Strategies

    It is well known that traditional financial institutions like banks follow procyclical risk strategies (Rajan, 2005, 2009; Shin, 2009; Jacques, 2010), in the sense that they increase their leverage in economic expansions and reduce it in recessions, which leads to procyclical behaviour in their betas and other risk and financial performance measures. It is less well known that the spectrum of the returns of many hedge fund strategies displays high volatility at business cycle frequencies. In this paper, we study this little-known stylized fact using two procedures: conditional modelling and Kalman filtering of funds' alphas and betas. We find that hedge fund betas are usually procyclical. Regarding the alpha, it is often high at the beginning of a market upside cycle, but as demand pressure from investors builds, it eventually fades away, which suggests that the alpha puzzle documented in the financial literature is questionable when cast in a dynamic setting.
    Keywords: Risk measures; Aggregate risk; Financial stability; Conditional models; Kalman filter; Spectral analysis.
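The second procedure can be sketched as a Kalman filter with a random-walk beta in the observation equation r_t = α + β_t m_t + e_t. The market series and the cyclical beta path below are simulated purely for illustration; the paper estimates such dynamics on actual hedge fund returns.

```python
# A minimal sketch of a time-varying (here cyclically varying) beta recovered
# by a scalar Kalman filter with a random-walk state. All data are simulated.
import numpy as np

rng = np.random.default_rng(3)
T = 1000
m = rng.normal(0, 0.02, T)                                       # market returns
beta_true = 1.0 + 0.5 * np.sin(2 * np.pi * np.arange(T) / 500)   # "cyclical" beta
r = 0.001 + beta_true * m + rng.normal(0, 0.01, T)               # fund returns

# state: beta_t = beta_{t-1} + w_t; observation: r_t = alpha + beta_t * m_t + e_t
alpha, q, R = 0.001, 1e-4, 0.01**2
b, P = 1.0, 1.0
beta_f = np.empty(T)
for t in range(T):
    P_pred = P + q                                 # predict (random-walk state)
    K = P_pred * m[t] / (m[t]**2 * P_pred + R)     # gain for this observation
    b = b + K * (r[t] - alpha - m[t] * b)          # update
    P = (1 - K * m[t]) * P_pred
    beta_f[t] = b

err = np.mean(np.abs(beta_f[200:] - beta_true[200:]))
print(f"mean absolute filtering error after burn-in: {err:.2f}")
```

Plotting `beta_f` against a business-cycle indicator is then the natural way to check for the procyclicality documented in the paper.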

    Separating decision tree complexity from subcube partition complexity

    The subcube partition model of computation is at least as powerful as decision trees, but no separation between these models was known. We show that there exists a function whose deterministic subcube partition complexity is asymptotically smaller than its randomized decision tree complexity, resolving an open problem of Friedgut, Kahn, and Wigderson (2002). Our lower bound is based on the information-theoretic techniques first introduced to lower bound the randomized decision tree complexity of the recursive majority function. We also show that the public-coin partition bound, the best known lower bound method for randomized decision tree complexity, subsuming other general techniques such as block sensitivity, approximate degree, randomized certificate complexity, and the classical adversary bound, also lower bounds randomized subcube partition complexity. This shows that all these lower bound techniques cannot prove optimal lower bounds for randomized decision tree complexity, which answers an open question of Jain and Klauck (2010) and Jain, Lee, and Vishnoi (2014).
    Comment: 16 pages, 1 figure.
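For readers unfamiliar with the model: the deterministic decision tree complexity D(f) is the worst-case number of input bits an optimal query strategy must read, and it satisfies the recursion D(f) = 0 for constant f, else min over variables i of 1 + max over the two restrictions. For tiny functions it can be computed by brute force, as in the sketch below on 3-bit majority (the paper's separation concerns far larger recursive-majority instances).

```python
# A minimal brute-force evaluation of deterministic decision tree depth D(f),
# directly from the minimax recursion over variable restrictions.
import itertools

def dt_depth(f, n, fixed=None):
    """Decision tree depth of f restricted to inputs consistent with `fixed`."""
    fixed = fixed or {}
    inputs = [x for x in itertools.product([0, 1], repeat=n)
              if all(x[i] == v for i, v in fixed.items())]
    if len({f(x) for x in inputs}) == 1:
        return 0                                   # constant: no queries needed
    best = n
    for i in range(n):
        if i in fixed:
            continue                               # already queried
        worst = max(dt_depth(f, n, {**fixed, i: b}) for b in (0, 1))
        best = min(best, 1 + worst)                # query i, adversary answers worst
    return best

maj3 = lambda x: int(sum(x) >= 2)
print(dt_depth(maj3, 3))  # -> 3: majority on 3 bits is evasive
```

This exhaustive recursion is exponential in n, which is precisely why the lower-bound machinery (information-theoretic arguments, the partition bound) discussed in the abstract is needed for large functions.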

    The Third Republic and Bismarck: The Role of the Opportunists in the Franco-German Compromise

    The European concert, at least for the years 1871 to 1890, is often perceived as the unmistakable work of the German Chancellor Otto von Bismarck and the major statesmen of his time. Bismarckian policy indeed coloured most of the interactions between rival countries of the period, particularly between France and Germany. Its bearing on France was such that it affected both domestic and foreign policy. The opportunist republicans adopted a pragmatic attitude toward Bismarckian policy that allowed them, between 1878 and 1885, to recover their former role and to channel a strong nationalist sentiment. While the opportunists are often blamed for preferring the expedient to the planned, the governments of Ferry, Gambetta, Waddington and Freycinet in fact managed to play the Bismarckian game skilfully. Familiar with its ephemeral character, they knew how to draw advantages from it, securing for France a prime diplomatic position and the acquisition of new colonial territories while preserving a certain independence from the Iron Chancellor.

    A New Approach Based on Cumulants for Estimating Financial Regression Models with Errors in the Variables: the Fama and French Model Revisited

    This paper revisits both the CAPM and the three-factor model of Fama and French (1993) in the presence of errors in the variables. To reduce the bias induced by measurement and specification errors, we transpose to the cost of equity an estimator based on cumulants of order three and four, initially developed by Dagenais and Dagenais (1997) and later generalized to financial models by Racicot (2003). Our results show that our technique has significant consequences for the measurement of the cost of equity. We obtain ipso facto a new estimator of the Jensen alpha.
    Keywords: Errors in the variables; Cumulants; Higher moments; Instrumental variables; Cost of equity; Jensen alpha.
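The cumulant-instrument idea can be sketched in a few lines: build instruments from higher moments of the observed regressor and use two-stage least squares to undo the attenuation bias that errors in the variables impose on OLS. The data below are simulated for illustration (in the spirit of Dagenais and Dagenais, 1997, not their exact estimator), and the true regressor must be non-Gaussian for higher-moment instruments to have power.

```python
# A minimal sketch of instruments built from higher moments of the observed
# regressor, used in 2SLS to correct errors-in-variables bias. Simulated data.
import numpy as np

rng = np.random.default_rng(4)
n, beta = 20000, 1.0
x_true = rng.chisquare(3, n) - 3             # skewed "true" factor, mean zero
x = x_true + rng.normal(0, 1.5, n)           # observed with measurement error
y = beta * x_true + rng.normal(0, 1, n)

xc = x - x.mean()
z1 = xc**2 - xc.var()                        # third-moment instrument
z2 = xc**3 - 3 * xc * xc.var()               # fourth-moment instrument
# (the -3*x*var term makes z2 uncorrelated with Gaussian measurement error)

def two_sls(y, x, z_list):
    Z = np.column_stack([np.ones(len(y))] + z_list)
    X = np.column_stack([np.ones(len(y)), x])
    X_hat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]   # first stage
    return np.linalg.lstsq(X_hat, y, rcond=None)[0]    # second stage

b_ols = np.polyfit(x, y, 1)[0]
b_iv = two_sls(y, x, [z1, z2])[1]
print(f"OLS (attenuated): {b_ols:.2f}   cumulant-IV: {b_iv:.2f}   true: {beta}")
```

OLS shrinks toward zero by the classic attenuation factor, while the higher-moment instruments recover a beta close to 1.0, which is the mechanism behind the corrected cost-of-equity and Jensen alpha estimates in the paper.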

    Forecasting Irregularly Spaced UHF Financial Data: Realized Volatility vs UHF-GARCH Models

    A very promising literature has recently been devoted to the modeling of ultra-high-frequency (UHF) data. Our first aim is to develop an empirical application of autoregressive conditional duration (ACD) GARCH models and realized volatility to forecast future volatilities on irregularly spaced data. We also compare the out-of-sample performance of ACD-GARCH models with that of the realized volatility method. We propose a procedure to take the time deformation into account and show how to use these models for computing daily VaR.
    Keywords: Realized volatility; Ultra-high-frequency GARCH; Time deformation; Financial markets; Daily VaR.
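The realized-volatility side of the comparison is simple to sketch: sum the squared tick-by-tick returns within each day, scaling each increment by the square root of its duration. Tick times below follow a Poisson-like irregular arrival process and the price is a simulated Brownian motion; the ACD-GARCH machinery itself is beyond this illustration.

```python
# A minimal sketch of realized volatility on irregularly spaced ticks:
# sum squared intra-day returns, with increments scaled by their durations.
# All data are simulated for illustration.
import numpy as np

rng = np.random.default_rng(5)
sigma_daily = 0.01                     # true daily volatility of log-price
n_days, ticks_per_day = 20, 200

rv = np.empty(n_days)
for d in range(n_days):
    n_ticks = rng.poisson(ticks_per_day)
    # irregular arrival times within the day, then Brownian price increments
    times = np.sort(rng.uniform(0.0, 1.0, n_ticks))
    dt = np.diff(np.concatenate([[0.0], times]))           # durations
    dlogp = rng.normal(0, sigma_daily * np.sqrt(dt))       # sqrt-of-duration scaling
    rv[d] = np.sum(dlogp**2)                               # realized variance

print(f"mean realized vol: {np.sqrt(rv.mean()):.4f} (true daily vol: {sigma_daily})")
```

With enough ticks per day the realized variance converges to the integrated variance, which is why it serves as the nonparametric benchmark against which the ACD-GARCH forecasts are judged.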

    What's Past Is Prologue
