
    Semi-Parametric Indirect Inference

    We develop in this paper a generalization of Indirect Inference (II) to semi-parametric settings, termed Semi-parametric Indirect Inference (SII). We introduce a new notion of Partial Encompassing, which places the emphasis on pseudo-true values of interest. The main difference from the older notion of encompassing is that some components of the pseudo-true value of interest associated with the structural parameters do correspond to true unknown values. This enables us to produce a theory of robust estimation despite mis-specifications in the structural model used as a simulator. We also provide the asymptotic probability distributions of our SII estimators as well as Wald Encompassing Tests (WET), and we advocate the use of Hausman-type tests of the assumptions required for the consistency of the SII estimators. We illustrate our theory with examples based on semi-parametric stochastic volatility models.
    Keywords: Indirect inference, partial encompassing, pseudo-true value of interest, structural models, instrumental models, Wald encompassing tests.
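
    A minimal sketch of the indirect-inference step that SII generalizes may help fix ideas: structural parameters are chosen so that auxiliary (instrumental-model) statistics computed on simulated data match those computed on the observed data. The AR(1) simulator, the auxiliary statistics and all names below are illustrative assumptions, not the paper's semi-parametric stochastic volatility setting.

```python
# Minimal indirect-inference sketch (illustrative, not the paper's SII setting):
# match auxiliary statistics of simulated data to those of observed data.
import numpy as np
from scipy.optimize import minimize

def simulate_ar1(theta, eps):
    """Structural simulator (hypothetical): AR(1) driven by pre-drawn shocks eps."""
    mu, rho, sigma = theta
    y = np.empty(len(eps))
    y[0] = mu
    for t in range(1, len(eps)):
        y[t] = mu + rho * (y[t - 1] - mu) + sigma * eps[t]
    return y

def auxiliary_stats(y):
    """Instrumental-model statistics: mean, variance, lag-1 autocovariance."""
    yc = y - y.mean()
    return np.array([y.mean(), yc.var(), np.mean(yc[1:] * yc[:-1])])

def ii_objective(theta, beta_hat, shock_paths):
    """Distance between observed and average simulated auxiliary statistics."""
    sims = [auxiliary_stats(simulate_ar1(theta, eps)) for eps in shock_paths]
    diff = beta_hat - np.mean(sims, axis=0)
    return diff @ diff  # identity weighting, for simplicity

rng = np.random.default_rng(0)
y_obs = simulate_ar1([0.1, 0.8, 0.5], rng.standard_normal(500))  # stand-in for real data
beta_hat = auxiliary_stats(y_obs)
shock_paths = rng.standard_normal((10, 500))   # shocks held fixed across evaluations
fit = minimize(ii_objective, x0=[0.0, 0.5, 1.0],
               args=(beta_hat, shock_paths), method="Nelder-Mead")
print("Indirect-inference estimate of (mu, rho, sigma):", fit.x)
```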

    On Portfolio Separation Theorems with Heterogeneous Beliefs and Attitudes towards Risk

    The early work of Tobin (1958) showed that portfolio allocation decisions can be reduced to a two-stage process: first decide the relative allocation across the risky assets, and second decide how to divide total wealth between the risky assets and the safe asset. This so-called two-fund separation relies on special assumptions on either returns or preferences. Tobin (1958) analyzed portfolio demand in a mean-variance setting. We revisit fund separation in settings that allow not only for heterogeneity of preferences for higher-order moments, but also for heterogeneity of beliefs among agents. To handle the various sources of heterogeneity, in beliefs and in preferences, we follow the framework of Samuelson (1970) and its recent generalization by Chabi-Yo, Leisen, and Renault (2006). This generic approach allows us to derive, for risks that are infinitely small, optimal shares of wealth invested in each security that coincide with those of a Mean-Variance-Skewness-Kurtosis optimizing agent. Besides the standard Sharpe-Lintner CAPM mutual fund separation, we obtain additional mutual funds: beliefs portfolios, pertaining to heterogeneity of beliefs; a skewness portfolio similar to Kraus and Litzenberger (1976); beliefs-about-skewness portfolios, with a design quite similar to beliefs portfolios; a kurtosis portfolio; and, finally, portfolios capturing heterogeneity of preferences for skewness across investors in the economy as well as its covariation with heterogeneity of beliefs. These last two mutual funds are called the cross-co-skewness portfolio and the cross-co-skewness-beliefs portfolio. Under various circumstances related to return distribution characteristics, cross-agent heterogeneity and market incompleteness, some of these portfolios disappear.
    Keywords: Financial markets; Market structure and pricing
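
    The mean-variance benchmark behind Tobin's result is easy to illustrate numerically: optimal risky holdings are proportional to inv(Sigma)(mu - rf), so agents who differ only in risk aversion hold the same mix of risky assets. The sketch below uses made-up numbers and shows only this textbook case, not the heterogeneous-beliefs, higher-moment extension developed in the paper.

```python
# Two-fund separation in the mean-variance case: the optimal risky weights are
# (1/gamma) * inv(Sigma) @ (mu - rf), so investors with different risk aversion
# gamma hold the same risky portfolio (the tangency fund) and differ only in how
# much wealth they allocate to it. Numbers are made up for illustration.
import numpy as np

mu = np.array([0.08, 0.12, 0.10])            # expected returns of risky assets (assumed)
Sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.06]])        # covariance matrix (assumed)
rf = 0.02                                     # risk-free rate

tangency_direction = np.linalg.solve(Sigma, mu - rf)

for gamma in (2.0, 5.0, 10.0):               # heterogeneous risk aversion
    w_risky = tangency_direction / gamma      # wealth share in each risky asset
    composition = w_risky / w_risky.sum()     # relative mix across risky assets
    print(f"gamma={gamma}: risky share={w_risky.sum():.3f}, mix={np.round(composition, 3)}")
# The 'mix' is identical for every gamma: only the split between the risky fund
# and the safe asset changes, which is exactly two-fund separation.
```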

    The Stochastic Discount Factor: Extending the Volatility Bound and a New Approach to Portfolio Selection with Higher-Order Moments

    The authors extend the well-known Hansen and Jagannathan (HJ) volatility bound. HJ characterize the lower bound on the volatility of any admissible stochastic discount factor (SDF) that correctly prices a set of primitive asset returns. The authors characterize this lower bound for any admissible SDF that correctly prices both the primitive asset returns and quadratic payoffs of the same primitive assets. In particular, they aim at pricing derivatives whose payoffs are defined as non-linear functions of the underlying asset payoffs. The authors construct a new volatility surface frontier in a three-dimensional space by considering not only the expected asset payoffs and variances, but also asset skewness. The authors' portfolio selection is motivated by the duality between the HJ mean-variance frontier and the Markowitz mean-variance portfolio frontier. Their approach consists of minimizing portfolio risk subject not only to the usual constraints on portfolio cost and expected return, but also to an additional constraint that depends on the portfolio skewness. In this sense, the authors shed light on portfolio selection when asset returns exhibit skewness.
    Keywords: Financial markets; Market structure and pricing
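
    For reference, the classical HJ bound that the paper extends can be stated as follows, for the vector R of gross returns on the primitive assets priced by E[mR] = 1; the notation below is ours.

```latex
% Classical Hansen-Jagannathan bound (the starting point the paper extends).
% For a candidate SDF mean \nu = E[m], with \mu_R = E[R] and \Sigma_R = Var(R):
\[
  \sigma(m) \;\ge\;
  \sqrt{\bigl(\mathbf{1}_N - \nu\,\mu_R\bigr)^{\top}\,
        \Sigma_R^{-1}\,
        \bigl(\mathbf{1}_N - \nu\,\mu_R\bigr)}\,,
  \qquad \text{for every admissible } m \text{ with } E[mR]=\mathbf{1}_N .
\]
```

    The paper's extension additionally requires m to price quadratic payoffs of the same primitive assets, which is what brings skewness into the resulting frontier.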

    State Dependence in Fundamentals and Preferences Explains Risk-Aversion Puzzle

    The authors examine the ability of economic models with regime shifts to rationalize and explain the risk-aversion and pricing-kernel puzzles put forward in Jackwerth (2000). They build an economy in which investors' preferences or economic fundamentals are state-dependent, and they simulate prices for a market index and European options on that index. Applying the original nonparametric methodology to these artificial data, the risk-aversion and pricing-kernel functions obtained across wealth states exhibit the same puzzles found in the actual data, but within each regime the puzzles disappear. This suggests that state dependence potentially explains the puzzles.
    Keywords: Financial markets; Market structure and pricing
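
    As a reminder of the nonparametric methodology being referenced (Jackwerth, 2000), the pricing kernel and the absolute risk-aversion function across wealth states are recovered from the risk-neutral density q (backed out of option prices) and the subjective density p (estimated from index returns); the notation below is ours.

```latex
% Pricing kernel and absolute risk aversion recovered nonparametrically
% from the risk-neutral density q(W) and the subjective density p(W):
\[
  m(W) \;\propto\; \frac{q(W)}{p(W)},
  \qquad
  \mathrm{ARA}(W) \;=\; \frac{p'(W)}{p(W)} \;-\; \frac{q'(W)}{q(W)} .
\]
% The "puzzle" is that, estimated on index-option data, ARA(W) turns negative
% (and m(W) is locally increasing) over a range of wealth near the current level.
```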

    Implications of Asymmetry Risk for Portfolio Analysis and Asset Pricing

    Asymmetric shocks are common in markets: securities' payoffs are not normally distributed and exhibit skewness. This paper studies the portfolio holdings of heterogeneous agents with preferences over mean, variance and skewness, and derives equilibrium prices. A three-fund separation theorem holds, adding a skewness portfolio to the market portfolio, and the pricing kernel depends linearly only on the market return and its squared value. Our analysis extends Harvey and Siddique's (2000) conditional mean-variance-skewness asset pricing model to non-vanishing risk-neutral market variance. The empirical relevance of this extension is documented in the context of the asymmetric GARCH-in-mean model of Bekaert and Liu (2004).
    Keywords: Financial markets; Market structure and pricing
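
    Schematically, the statement that the pricing kernel depends linearly only on the market return and its squared value corresponds to a quadratic SDF of the form below; the coefficient names are ours, and the second display is just the resulting covariance/co-skewness pricing relation in the spirit of Harvey and Siddique (2000).

```latex
% Quadratic pricing kernel implied by mean-variance-skewness preferences:
\[
  m_{t+1} \;=\; a \;+\; b\,R_{m,t+1} \;+\; c\,R_{m,t+1}^{2},
\]
% so that, via E[m_{t+1} R_{i,t+1}^{e}] = 0, expected excess returns are priced by
% covariance with the market return and with its square (co-skewness):
\[
  E\!\left[R_{i,t+1}^{e}\right]
  \;=\; \lambda_{1}\,\mathrm{Cov}\!\left(R_{i,t+1}, R_{m,t+1}\right)
  \;+\; \lambda_{2}\,\mathrm{Cov}\!\left(R_{i,t+1}, R_{m,t+1}^{2}\right).
\]
```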

    Efficient Minimum Distance Estimation with Multiple Rates of Convergence

    This paper extends the asymptotic theory of GMM inference to allow sample counterparts of the estimating equations to converge at (multiple) rates different from the usual square root of the sample size. In this setting, we provide consistent estimation of the structural parameters. In addition, we define a convenient rotation in the parameter space (or reparametrization) to disentangle the different rates of convergence. More precisely, we identify special linear combinations of the structural parameters associated with a specific rate of convergence. Finally, we demonstrate the validity of usual inference procedures, like the overidentification test and the Wald test, with standard formulas. It is important to stress that both estimation and testing work without requiring knowledge of the various rates. However, the assessment of these rates is crucial for (asymptotic) power considerations. Possible applications include econometric problems with two dimensions of asymptotics, due to trimming, tail estimation, infill asymptotics, social interactions, kernel smoothing or any kind of regularization.
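
    The "standard formulas" in question are the usual GMM ones. As a reference point, the sketch below computes the textbook two-step GMM estimator and J (overidentification) statistic for a linear moment condition; the simulated data and variable names are illustrative and not taken from the paper.

```python
# Textbook two-step GMM estimator and J statistic for E[z_t (y_t - x_t' theta)] = 0.
# Illustrative only: the paper's point is that this formula remains valid even when
# the sample moments converge at several different rates.
import numpy as np

def gmm_j_statistic(y, X, Z):
    n, q = Z.shape
    def estimate(W):
        A = X.T @ Z @ W @ Z.T @ X
        b = X.T @ Z @ W @ Z.T @ y
        return np.linalg.solve(A, b)
    theta1 = estimate(np.eye(q))               # first step: identity weighting
    u = (y - X @ theta1)[:, None] * Z          # first-step moment contributions
    W2 = np.linalg.inv(u.T @ u / n)            # second step: efficient weighting
    theta2 = estimate(W2)
    gbar = Z.T @ (y - X @ theta2) / n
    J = n * gbar @ W2 @ gbar                   # ~ chi2(q - dim(theta)) if well specified
    return theta2, J

# Tiny simulated example: 4 instruments, 2 structural parameters, valid moments.
rng = np.random.default_rng(0)
n = 2000
Z = rng.standard_normal((n, 4))
X = Z[:, :2] + 0.3 * rng.standard_normal((n, 2))
y = X @ np.array([1.0, -0.5]) + rng.standard_normal(n)
theta_hat, J = gmm_j_statistic(y, X, Z)
print("theta_hat =", np.round(theta_hat, 3), " J =", round(float(J), 2),
      " (compare to chi2 with 2 degrees of freedom)")
```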

    Testing Identification Strength

    We consider models defined by a set of moment restrictions that may be subject to weak identification. Following the recent literature, the identification of the structural parameters is characterized by the Jacobian of the moment conditions. We unify several definitions of identification that have been used in the literature, and show how they are linked to the consistency and asymptotic normality of GMM estimators. We then develop two tests to assess the identification strength of the structural parameters. Both tests are straightforward to apply. In simulations, our tests are well behaved when compared to contenders, in terms of both size and power.
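
    In the linear IV special case, the Jacobian of the moment conditions E[z(y - x'theta)] reduces to -E[zx'], so identification strength is first-stage strength. The sketch below computes the familiar first-stage F-statistic as a point of comparison; it is not one of the two tests proposed in the paper, and the simulated design is illustrative.

```python
# First-stage F-statistic in a linear IV design with a single endogenous regressor.
# This is the standard strength diagnostic that Jacobian-based tests refine; the
# weak first-stage coefficients below are assumed for illustration.
import numpy as np

rng = np.random.default_rng(1)
n = 1000
z = rng.standard_normal((n, 3))            # instruments
pi = np.array([0.05, 0.03, 0.02])          # weak first-stage coefficients (assumed)
x = z @ pi + rng.standard_normal(n)        # single endogenous regressor

# First-stage regression of x on z (no constant, for simplicity).
pi_hat, *_ = np.linalg.lstsq(z, x, rcond=None)
fitted = z @ pi_hat
resid = x - fitted
k = z.shape[1]
F = (fitted @ fitted / k) / (resid @ resid / (n - k))
print(f"first-stage F = {F:.2f}")          # values near or below ~10 signal weak identification
```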

    Efficient Inference with Poor Instruments: a General Framework

    We consider a general framework where weaker patterns of identification may arise: typically, the data generating process is allowed to depend on the sample size. However, contrary to what is usually done in the literature on weak identification, we do not give up the efficiency goal of statistical inference: even fragile information should be processed optimally for the purposes of both efficient estimation and powerful testing. Our main contribution is to consider that several patterns of identification may arise simultaneously. This heterogeneity of identification schemes paves the way for the design of optimal strategies for inferential use of information of poor quality. More precisely, we focus on a case where the asymptotic efficiency of estimators is well defined through the variance of asymptotically normal distributions. Standard efficient estimation procedures still hold, albeit with rates of convergence slower than usual. We stress that these are feasible without requiring prior knowledge of the identification schemes.
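
    Letting "the data generating process depend on the sample size" is usually formalized with drifting parameters, e.g. a first-stage coefficient of the form c/sqrt(n) in a linear IV model, so that the identifying signal stays of the same order as the noise at every n. The Monte Carlo below illustrates that device and the resulting failure of the usual sqrt(n) concentration; it is an illustrative toy under assumed parameter values, not the paper's framework.

```python
# Drifting (local-to-zero) first stage: pi_n = c / sqrt(n), the standard device for a
# DGP that depends on the sample size. The spread of the IV estimator then fails to
# shrink at the usual 1/sqrt(n) rate. All parameter values are assumed.
import numpy as np

def iv_estimate(n, c, rng):
    pi_n = c / np.sqrt(n)                       # drifting first-stage coefficient
    z = rng.standard_normal(n)
    u = rng.standard_normal(n)
    v = 0.5 * u + rng.standard_normal(n)        # endogeneity: corr(u, v) != 0
    x = pi_n * z + v
    y = 1.0 * x + u                             # true structural coefficient = 1
    return (z @ y) / (z @ x)                    # simple IV estimator

rng = np.random.default_rng(0)
for n in (500, 5000, 50000):
    draws = [iv_estimate(n, c=2.0, rng=rng) for _ in range(500)]
    iqr = np.percentile(draws, 75) - np.percentile(draws, 25)
    print(f"n={n}: IQR of IV estimates = {iqr:.3f}")   # spread does not shrink with n
```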