Time varying VARs with inequality restrictions
In many applications involving time-varying parameter VARs, it is desirable to restrict the VAR coefficients at each point in time to be non-explosive. This is an example of a problem where inequality restrictions are imposed on states in a state space model. In this paper, we describe how existing MCMC algorithms for imposing such inequality restrictions can work poorly (or not at all) and suggest alternative algorithms which exhibit better performance. Furthermore, previous algorithms involve an approximation relating to a key integrating constant. Our algorithms are exact, not involving this approximation. In an application involving a commonly-used U.S. data set, we show how this approximation can be a poor one and present evidence that the algorithms proposed in this paper work well.
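To make the restriction concrete, the minimal sketch below (in Python, with illustrative names; it shows the restriction itself, not the paper's improved samplers) checks whether a draw of time-t VAR coefficients is non-explosive by testing whether the companion matrix has all eigenvalues inside the unit circle, which is the check a naive accept/reject MCMC step would apply.

```python
import numpy as np

def companion_matrix(coefs):
    """Stack VAR(p) coefficient matrices A_1, ..., A_p (each n x n)
    into the (n*p) x (n*p) companion form."""
    n, p = coefs[0].shape[0], len(coefs)
    top = np.hstack(coefs)               # [A_1 ... A_p], shape (n, n*p)
    bottom = np.eye(n * (p - 1), n * p)  # [I, 0]: shifts the lags down one slot
    return np.vstack([top, bottom])

def is_non_explosive(coefs):
    """The inequality restriction: all eigenvalues of the companion
    matrix lie strictly inside the unit circle."""
    return np.max(np.abs(np.linalg.eigvals(companion_matrix(coefs)))) < 1.0

# A naive MCMC sweep would simply redraw the time-t coefficient state
# whenever this check fails.
rng = np.random.default_rng(0)
draw = [rng.normal(scale=0.3, size=(2, 2)) for _ in range(2)]
print(is_non_explosive(draw))
```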
The vector floor and ceiling model
This paper motivates and develops a nonlinear extension of the Vector Autoregressive model which we call the Vector Floor and Ceiling model. Bayesian and classical methods for estimation and testing are developed and compared in the context of an application involving U.S. macroeconomic data. In terms of statistical significance, both classical and Bayesian methods indicate that the (Gaussian) linear model is inadequate. Using impulse response functions, we investigate the economic significance of the statistical analysis. We find evidence of strong nonlinearities in the contemporaneous relationships between the variables and milder evidence of nonlinearity in the conditional mean.
Are apparent findings of nonlinearity due to structural instability in economic time series?
Many modelling issues and policy debates in macroeconomics depend on whether macroeconomic time series are best characterized as linear or nonlinear. If departures from linearity exist, it is important to know whether these are endogenously generated (as in, e.g., a threshold autoregressive model) or whether they merely reflect changing structure over time. We advocate a Bayesian approach and show how such an approach can be implemented in practice. An empirical exercise involving several macroeconomic time series shows that apparent findings of threshold-type nonlinearities could be due to structural instability.
Prior elicitation in multiple change-point models
This paper discusses Bayesian inference in change-point models. Existing approaches involve placing a (possibly hierarchical) prior over a known number of change-points. We show how two popular priors have some potentially undesirable properties (e.g. allocating excessive prior weight to change-points near the end of the sample) and discuss how these properties relate to imposing a fixed number of change-points in-sample. We develop a new hierarchical approach which allows some of the change-points to occur out-of-sample. We show that this prior has desirable properties and handles the case where the number of change-points is unknown. Our hierarchical approach can be shown to nest a wide variety of change-point models, from time-varying parameter models to those with few (or no) breaks. Since our prior is hierarchical, the data inform the parameter which controls this variety.
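As a stylized illustration (not taken from the paper) of how a prior over a fixed number of ordered in-sample break dates concentrates the final break near the end of the sample, consider the following simulation of a uniform break-date prior:

```python
import numpy as np

rng = np.random.default_rng(1)
T, m = 200, 3   # sample size and (fixed) number of change-points

# A common prior: m ordered break dates drawn uniformly from {1, ..., T-1}.
dates = np.sort(rng.choice(np.arange(1, T), size=(50000, m)))

# Implied marginal prior that the LAST break falls in the final tenth of
# the sample: roughly 1 - 0.9**m (about 0.27 here), far above the 0.10
# a single uniformly placed break would receive.
print(np.mean(dates[:, -1] > 0.9 * T))
```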
Distribution-Preserving Statistical Disclosure Limitation
One approach to limiting disclosure risk in public-use microdata is to release multiply-imputed, partially synthetic data sets. These are data on actual respondents, but with confidential data replaced by multiply-imputed synthetic values. A mis-specified imputation model can invalidate inferences because the distribution of synthetic data is completely determined by the model used to generate them. We present two practical methods of generating synthetic values when the imputer has only limited information about the true data generating process. One is applicable when the true likelihood is known up to a monotone transformation. The second requires only limited knowledge of the true likelihood, but nevertheless preserves the conditional distribution of the confidential data, up to sampling error, on arbitrary subdomains. Our method maximizes data utility and minimizes incremental disclosure risk up to posterior uncertainty in the imputation model and sampling error in the estimated transformation. We validate the approach with a simulation and an application to a large linked employer-employee database.
Keywords: statistical disclosure limitation; confidentiality; privacy; multiple imputation; partially synthetic data
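The first method can be sketched as a normal-scores construction (a hedged illustration, not the authors' exact algorithm): transform the confidential variable to approximate normality through its estimated ranks, impute on the transformed scale, and map the synthetic draws back through the empirical quantile function.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
x = rng.lognormal(sigma=1.0, size=1000)       # stand-in confidential variable

# Estimated monotone transform: ranks -> normal scores.
ranks = (np.argsort(np.argsort(x)) + 0.5) / len(x)
z = norm.ppf(ranks)

# Impute with a simple parametric model on the transformed scale, then
# invert through the empirical quantile function of the observed data.
z_synth = rng.normal(z.mean(), z.std(), size=len(x))
x_synth = np.quantile(x, norm.cdf(z_synth))
print(x_synth[:5])
```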
Forecasting in Large Macroeconomic Panels using Bayesian Model Averaging
This paper considers the problem of forecasting in large macroeconomic panels using Bayesian model averaging. Theoretical justifications for averaging across models, as opposed to selecting a single model, are given. Practical methods for implementing Bayesian model averaging with factor models are described. These methods involve algorithms which simulate from the space defined by all possible models. We discuss how these simulation algorithms can also be used to select the model with the highest marginal likelihood (or highest value of an information criterion) in an efficient manner. We apply these methods to the problem of forecasting GDP and inflation using quarterly U.S. data on 162 time series. For both GDP and inflation, we find that the models which contain factors do out-forecast an AR(p), but only by a relatively small amount and only at short horizons. We attribute these findings to the presence of structural instability and the fact that lags of the dependent variable seem to contain most of the information relevant for forecasting. Relative to the small forecasting gains provided by including factors, the gains provided by using Bayesian model averaging over forecasting methods based on a single model are appreciable.
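The core averaging step can be illustrated with a short sketch (illustrative numbers only, not the paper's application): convert each model's log marginal likelihood, or an information-criterion approximation to it, into a posterior model probability and weight the forecasts accordingly rather than selecting the single best model.

```python
import numpy as np

def model_weights(log_marglik):
    """Convert (approximate) log marginal likelihoods into posterior
    model probabilities under equal prior model weights."""
    l = np.asarray(log_marglik)
    w = np.exp(l - l.max())     # subtract the max for numerical stability
    return w / w.sum()

# Weight each model's point forecast by its posterior probability.
log_ml = np.array([-512.3, -510.8, -511.5])   # illustrative values
forecasts = np.array([2.1, 1.8, 2.4])
print(model_weights(log_ml) @ forecasts)
```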
A semi-invertible Oseledets Theorem with applications to transfer operator cocycles
Oseledets' celebrated Multiplicative Ergodic Theorem (MET) is concerned with the exponential growth rates of vectors under the action of a linear cocycle on R^d. When the linear actions are invertible, the MET guarantees an almost-everywhere pointwise splitting of R^d into subspaces of distinct exponential growth rates (called Lyapunov exponents). When the linear actions are non-invertible, Oseledets' MET only yields the existence of a filtration of subspaces, the elements of which contain all vectors that grow no faster than exponential rates given by the Lyapunov exponents. The authors recently demonstrated that a splitting over R^d is guaranteed even without the invertibility assumption on the linear actions. Motivated by applications of the MET to cocycles of (non-invertible) transfer operators arising from random dynamical systems, we demonstrate the existence of an Oseledets splitting for cocycles of quasi-compact non-invertible linear operators on Banach spaces.
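The exponents that the MET describes can be estimated numerically for a concrete matrix cocycle on R^d with the standard QR method (a generic finite-dimensional illustration, not the paper's Banach-space setting):

```python
import numpy as np

rng = np.random.default_rng(3)
d, n_steps = 3, 5000
Q = np.eye(d)
log_r = np.zeros(d)

# Iterate the cocycle of i.i.d. random matrices, re-orthonormalizing with
# QR at each step; the time-averaged log diagonal of R estimates the
# Lyapunov spectrum whose distinctness underlies the Oseledets splitting.
for _ in range(n_steps):
    A = rng.normal(size=(d, d))   # one step of the linear cocycle
    Q, R = np.linalg.qr(A @ Q)
    log_r += np.log(np.abs(np.diag(R)))

print(np.sort(log_r / n_steps)[::-1])   # Lyapunov exponents, largest first
```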
Coherent sets for nonautonomous dynamical systems
We describe a mathematical formalism and numerical algorithms for identifying and tracking slowly mixing objects in nonautonomous dynamical systems. In the autonomous setting, such objects are variously known as almost-invariant sets, metastable sets, persistent patterns, or strange eigenmodes, and have proved to be important in a variety of applications. In this work, we explain how to extend existing autonomous approaches to the nonautonomous setting. We call the new time-dependent slowly mixing objects coherent sets, as they represent regions of phase space that disperse very slowly and remain coherent. The new methods are illustrated via detailed examples in both discrete and continuous time.
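A simplified numerical sketch of the transfer-operator approach (toy dynamics and an unweighted normalization; the paper's algorithms are more careful) estimates an Ulam-style transition matrix between boxes and reads a two-set partition off the sign structure of the second left singular vector:

```python
import numpy as np

rng = np.random.default_rng(4)

# Ulam-type estimate of a transition matrix between n boxes on [0, 1]
# from sample trajectories of a toy time-dependent map.
n, n_samples = 50, 20000
x0 = rng.uniform(0, 1, n_samples)
x1 = np.mod(x0 + 0.1 * rng.normal(size=n_samples), 1.0)  # stand-in dynamics

i = np.minimum((x0 * n).astype(int), n - 1)
j = np.minimum((x1 * n).astype(int), n - 1)
P = np.zeros((n, n))
np.add.at(P, (i, j), 1.0)
P /= P.sum(axis=1, keepdims=True)   # row-stochastic transition matrix

# In the transfer-operator approach, leading nontrivial singular vectors
# of (a suitably normalized) P flag sets that disperse slowly; the sign
# pattern of the second left singular vector gives a candidate partition.
U, s, Vt = np.linalg.svd(P)
print(s[:3])
print(np.sign(U[:, 1]))
```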