Density functionals, with an option-pricing application
We present a method of estimating density-related functionals without prior knowledge of the density's functional form. The approach revolves around the specification of an explicit formula for a new class of distributions that encompasses many of the known cases in statistics, including the normal, gamma, inverse gamma, and mixtures thereof. The functionals are based on a couple of hypergeometric functions. Their parameters can be estimated, and the estimates then reveal both the functional form of the density and the parameters that determine centering, scaling, etc. By design, the function to be estimated always leads to a valid density, namely one that is nonnegative everywhere and integrates to 1. Unlike fully nonparametric methods, our approach can be applied to small datasets. To illustrate our methodology, we apply it to finding the risk-neutral densities associated with different types of financial options. We show that our approach fits the data uniformly well. We also find that the functional forms of our estimated densities vary over the dataset, so existing parametric methods will not do uniformly well.
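As a hedged illustration of the "valid density by design" point, the sketch below builds a toy flexible family and normalizes it numerically. The family f(x) ∝ x^(a−1) e^(−bx−cx²) is our own placeholder, not the paper's hypergeometric-based class, but it likewise nests familiar shapes (c = 0 recovers a gamma density) while remaining nonnegative and integrating to 1 by construction.

```python
import numpy as np
from scipy.integrate import quad

# Toy flexible family (NOT the paper's class): f(x) proportional to
# x^(a-1) * exp(-b*x - c*x^2) on (0, inf); c = 0 recovers a gamma density.
def unnormalized(x, a, b, c):
    return x ** (a - 1.0) * np.exp(-b * x - c * x ** 2)

def make_density(a, b, c):
    # Numerical normalization guarantees the result integrates to 1.
    norm, _ = quad(unnormalized, 0.0, np.inf, args=(a, b, c))
    return lambda x: unnormalized(x, a, b, c) / norm

f = make_density(a=2.5, b=1.0, c=0.3)
total, _ = quad(f, 0.0, np.inf)
print("integral of f:", total)  # ~ 1.0: nonnegative and integrates to 1 by design
```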
The joint moment generating function of quadratic forms in multivariate autoregressive series - The case with deterministic components
Let {X_t} follow a discrete Gaussian vector autoregression with deterministic components. We derive the exact finite-sample joint moment generating function (MGF) of the quadratic forms that form the basis for the sufficient statistic. The formula is then specialized to the limiting MGF of functionals involving multivariate and univariate Ornstein–Uhlenbeck processes, drifts, and time trends. Such processes arise asymptotically from more general non-Gaussian processes and also from the Gaussian {X_t}, and have also been used in areas other than time series, such as the "goodness of fit" literature.
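For context, the classical single-draw building block behind such derivations is the textbook MGF of a Gaussian quadratic form; the sketch below checks that identity by simulation. This is not the paper's finite-sample joint MGF for the autoregressive sufficient statistic, only the standard static case.

```python
import numpy as np

# Standard textbook identity (NOT the paper's finite-sample AR result): for
# x ~ N(mu, Sigma) and symmetric A,
#   E[exp(t*x'Ax)] = det(I - 2t*A*Sigma)^(-1/2) * exp(t*mu'A(I - 2t*Sigma*A)^(-1)mu),
# valid for t small enough that I - 2t*A*Sigma remains positive definite.
rng = np.random.default_rng(0)
n, t = 3, 0.05
A = np.array([[2.0, 0.3, 0.0], [0.3, 1.0, 0.2], [0.0, 0.2, 0.5]])
Sigma = np.array([[1.0, 0.4, 0.1], [0.4, 1.5, 0.2], [0.1, 0.2, 0.8]])
mu = np.array([0.5, -0.2, 0.1])

I = np.eye(n)
closed = (np.linalg.det(I - 2 * t * A @ Sigma) ** -0.5
          * np.exp(t * mu @ A @ np.linalg.solve(I - 2 * t * Sigma @ A, mu)))

x = rng.multivariate_normal(mu, Sigma, size=500_000)
mc = np.exp(t * np.einsum("ij,jk,ik->i", x, A, x)).mean()
print("closed form:", closed, " Monte Carlo:", mc)  # should agree closely
```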
Is the economic crisis over (and out)?
This note analyzes the recent global recession: its causes, the predictability of the timing of its start and of its end, and the implications for macro policy. These follow from the general-equilibrium macro model of Abadir and Talmain (2002) and its implications for a new type of macroeconometrics. The note also proposes some banking regulations and presents prospects for the future.
Keywords: recession, recovery, causes and symptoms, turning points, prediction, macro policy.
Explicit Solutions for the Asymptotically-Optimal Bandwidth in Cross Validation
Least squares cross-validation (CV) methods are often used for automated bandwidth selection. We show that they share a common structure which has an explicit asymptotic solution. Using the framework of density estimation, we consider unbiased, biased, and smoothed CV methods. We show that, with a Student t(nu) kernel, which includes the Gaussian as a special case, the CV criterion becomes asymptotically equivalent to a simple polynomial. This leads to optimal-bandwidth solutions that dominate the usual CV methods, definitely in terms of simplicity and speed of calculation, but also often in terms of integrated squared error, because of the robustness of our asymptotic solution. We present simulations to illustrate these features and to give practical guidance on the choice of nu.
Keywords: bandwidth choice; cross-validation; nonparametric density estimation; analytical solution.
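For reference, a minimal grid-search version of the unbiased (least-squares) CV criterion for a Gaussian kernel is sketched below; the paper's contribution is precisely to replace this numerical search with an explicit asymptotic solution, which is not reproduced here.

```python
import numpy as np

# Unbiased CV for a Gaussian-kernel density estimator:
#   UCV(h) = int fhat^2 - (2/n) * sum_i fhat_{-i}(x_i),
# using the fact that the convolution of two N(0,1) kernels is the N(0,2) density.
def ucv(h, x):
    n = x.size
    d = (x[:, None] - x[None, :]) / h            # pairwise scaled differences
    k2 = np.exp(-d**2 / 4) / np.sqrt(4 * np.pi)  # K*K evaluated at d (N(0,2) density)
    k1 = np.exp(-d**2 / 2) / np.sqrt(2 * np.pi)  # Gaussian kernel K
    int_fhat_sq = k2.sum() / (n**2 * h)          # integral of fhat^2
    np.fill_diagonal(k1, 0.0)                    # leave-one-out: drop i = j terms
    loo = 2 * k1.sum() / (n * (n - 1) * h)       # (2/n) * sum_i fhat_{-i}(x_i)
    return int_fhat_sq - loo

rng = np.random.default_rng(1)
x = rng.standard_normal(200)
grid = np.linspace(0.05, 1.0, 96)
h_cv = grid[np.argmin([ucv(h, x) for h in grid])]
print("UCV bandwidth:", h_cv)  # compare with Silverman's rule, 1.06*sd*n^(-1/5) ~ 0.37
```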
Aggregation, Persistence and Volatility in a Macromodel
Starting from microeconomic foundations, we derive a general formula for the aggregation of the outputs of heterogeneous firms (or sectors), and we solve explicitly for the fundamental intertemporal equilibrium path of the aggregate economy. The firms are subject to temporary technology shocks, but aggregate output has radically different dynamic properties, with a special form of long memory and nonlinearity not used hitherto. We study, analytically, the implied time series properties of the new process characterizing aggregate GDP per capita. This process is more persistent than any dynamically-stable linear process (e.g., autoregressions) and yet is mean-reverting (unlike unit-root processes), and its volatility is of a greater order of magnitude than that of any of its components. This amplification of volatility means that even small shocks at the micro level can lead to large fluctuations at the macro level. The process is also characterized by long cycles which have random lengths and which are asymmetric. Increased monopoly power will tend to reduce the amplitude and increase the persistence of business cycles. Strikingly, we find that the nonlinear aggregate process has an S-shaped decay of memory, similar to the data but unlike linear time series models such as the widely-used Auto-Regressive Integrated Moving-Average (ARIMA) processes and their special cases (including fractional integration).
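A classical linear analogue of this aggregation effect (Granger, 1980) can be simulated in a few lines: averaging many AR(1) outputs with heterogeneous persistence yields an aggregate whose autocorrelations decay far more slowly than any component's. This is only a hedged illustration of the mechanism, not the paper's nonlinear long-memory process.

```python
import numpy as np

# Granger-style aggregation: heterogeneous AR(1) units, slowly-decaying aggregate ACF.
rng = np.random.default_rng(2)
N, T = 2000, 4000
rho = np.sqrt(rng.beta(2.0, 1.0, size=N))   # heterogeneous AR coefficients in (0, 1)
y = np.zeros(N)
agg = np.empty(T)
for t in range(T):
    y = rho * y + rng.standard_normal(N)    # one AR(1) step for every micro unit
    agg[t] = y.mean()                       # aggregate output

agg = agg[500:] - agg[500:].mean()          # drop burn-in, demean
acf = [1.0] + [np.dot(agg[:-k], agg[k:]) / np.dot(agg, agg) for k in range(1, 61)]
print([round(a, 2) for a in acf[::10]])     # note the slow, non-geometric decay
```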
Macro and Financial Markets: The Memory of an Elephant?
Macroeconomic and aggregate financial series share an unconventional type of nonlinear dynamics. Existing techniques (like co-integration) model these dynamics incompletely, hence generating seemingly paradoxical results.
To avoid this, we provide a methodology to disentangle the long-run relation between variables from their own dynamics, and illustrate with two applications.
First, in the forward-premium puzzle, adding a component quantifying the persistent nonlinear dynamics of exchange rates yields substantial predictability and makes the forward-premium term insignificant. Second, the S&P 500 grows in a pattern of momentum followed by reversal, forming long cycles around a trend given by GDP, a stable non-breaking relation since WWII.
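As background, the baseline forward-premium (Fama) regression that the puzzle refers to is sketched below on simulated data; the paper's augmentation with a persistent nonlinear exchange-rate component is not reproduced.

```python
import numpy as np

# Baseline Fama regression: s_{t+1} - s_t = alpha + beta*(f_t - s_t) + e_{t+1},
# where s is the log spot rate and f the log forward rate. Uncovered interest
# parity predicts beta = 1; the "puzzle" is that estimates are typically far
# below 1 (often negative). The data below are simulated, for illustration only.
rng = np.random.default_rng(3)
T = 500
fp = 0.01 * rng.standard_normal(T)             # simulated forward premium f_t - s_t
ds = 0.5 * fp + 0.02 * rng.standard_normal(T)  # simulated depreciation s_{t+1} - s_t
X = np.column_stack([np.ones(T), fp])
alpha, beta = np.linalg.lstsq(X, ds, rcond=None)[0]
print("beta estimate:", round(beta, 3))        # ~0.5 here by construction
```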
On efficient simulation in dynamic models
Ways of improving the efficiency of Monte-Carlo (MC) techniques are studied for dynamic models. Such models cause the conventional Antithetic Variate (AV) technique to fail, and we prove that they reduce the benefit of using Control Variates with nearly nonstationary series. This paper suggests modifications of the two conventional variance reduction techniques to enhance their efficiency. New classes of AVs are also proposed. Methods of reordering innovations are found to do less well than others that rely on changing some signs, in the spirit of the traditional AV. Numerical and analytical calculations are given to investigate the features of the proposed techniques.
JEL classification code: C15.
Key words: Dynamic models, Monte-Carlo (MC), Variance Reduction Technique (VRT), Antithetic Variate (AV), Control Variate (CV), Efficiency Gain (EG), Response Surface (RS).
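The failure of the conventional AV in dynamic models can be seen directly: the least-squares AR(1) coefficient estimator is an even function of the innovations, so the sign-flipped antithetic path reproduces the same estimate and yields no variance reduction. A minimal sketch follows (the paper's modified AVs are not reproduced).

```python
import numpy as np

# The LS estimator of the AR(1) coefficient is an even function of the
# innovations: flipping the sign of every eps flips the sign of the whole
# path y (since y_0 = 0), leaving both sums below unchanged. Hence the
# traditional antithetic draw duplicates the estimate exactly: zero gain.
def ols_rho(eps, rho=0.9):
    y = np.empty(eps.size + 1)
    y[0] = 0.0
    for s, e in enumerate(eps):
        y[s + 1] = rho * y[s] + e
    return np.dot(y[:-1], y[1:]) / np.dot(y[:-1], y[:-1])

rng = np.random.default_rng(4)
eps = rng.standard_normal(100)
print(ols_rho(eps), ols_rho(-eps))  # identical values: the AV adds nothing
```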
Nelson-Plosser Revisited: The ACF Approach
We detect a new stylized fact about the common dynamics of macroeconomic and financial aggregates. The rate of decay of the memory (or persistence) of these series is depicted by their autocorrelation functions (ACFs), and these all closely fit a parsimonious four-parameter functional form that we present. Not only does our formula fit the data better than the ones arising from autoregressive models, but it also yields the correct shape of the ACF. This can help policymakers understand the lags with which an economy evolves, and its turning points.
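The mechanics of the exercise can be sketched as follows: compute a sample ACF and fit a smooth parametric curve to it by least squares. The four-parameter form used below, rho(k) = a·cos(bk)/(1 + ck)^d, is a hypothetical placeholder chosen only because it can cycle and decay slowly; it is not the paper's formula.

```python
import numpy as np
from scipy.optimize import curve_fit

# HYPOTHETICAL four-parameter ACF shape (placeholder, not the paper's formula).
def model(k, a, b, c, d):
    return a * np.cos(b * k) / (1.0 + c * k) ** d

rng = np.random.default_rng(5)
y = np.convolve(rng.standard_normal(1000), np.ones(20) / 20, mode="valid")
y = y - y.mean()                         # toy persistent series, demeaned
lags = np.arange(1, 41)
acf = np.array([np.dot(y[:-k], y[k:]) / np.dot(y, y) for k in lags])
params, _ = curve_fit(model, lags, acf, p0=[1.0, 0.05, 0.1, 1.0],
                      bounds=([0.0, 0.0, 1e-3, 0.0], [2.0, np.pi, 10.0, 10.0]))
print("fitted (a, b, c, d):", np.round(params, 3))
```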