32,568 research outputs found

    Sequential Monte Carlo Methods for Estimating Dynamic Microeconomic Models

    This paper develops methods for estimating dynamic structural microeconomic models with serially correlated latent state variables. The proposed estimators are based on sequential Monte Carlo methods, or particle filters, and simultaneously estimate both the structural parameters and the trajectory of the unobserved state variables for each observational unit in the dataset. We focus on two important special cases: single-agent dynamic discrete choice models and dynamic games of incomplete information. The methods are applicable to both discrete and continuous state space models. We first develop a broad nonlinear state space framework which includes as special cases many dynamic structural models commonly used in applied microeconomics. Next, we discuss the nonlinear filtering problem that arises due to the presence of a latent state variable and show how it can be solved using sequential Monte Carlo methods. We then turn to estimation of the structural parameters and consider two approaches: an extension of the standard full-solution maximum likelihood procedure (Rust, 1987) and an extension of the two-step estimation method of Bajari, Benkard, and Levin (2007), in which the structural parameters are estimated using revealed preference conditions. Finally, we introduce an extension of the classic bus engine replacement model of Rust (1987) and use it both to carry out a series of Monte Carlo experiments and to provide empirical results using the original data.
    Keywords: dynamic discrete choice, latent state variables, serial correlation, sequential Monte Carlo methods, particle filtering
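    As a rough illustration of the filtering step the abstract describes, the sketch below runs a bootstrap particle filter for a toy model with an AR(1) latent state and a binary choice observation, and returns the simulated log-likelihood. It is not the authors' estimator; the model, function name, and parameter values are illustrative assumptions.

```python
import numpy as np

def particle_filter_loglik(choices, rho, sigma, n_particles=1000, seed=0):
    """Bootstrap particle filter log-likelihood for a toy model:
    latent state  x_t = rho * x_{t-1} + sigma * e_t,  e_t ~ N(0, 1)
    observation   d_t ~ Bernoulli(logistic(x_t))  (stand-in for a choice probability).
    Illustrative only; not the estimator developed in the paper."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, sigma / np.sqrt(1.0 - rho**2), n_particles)  # stationary initial draws
    loglik = 0.0
    for d in choices:
        x = rho * x + sigma * rng.normal(size=n_particles)           # propagate particles
        p = 1.0 / (1.0 + np.exp(-x))                                  # per-particle choice probability
        w = p if d == 1 else 1.0 - p                                  # observation weights
        loglik += np.log(w.mean())                                    # incremental likelihood
        x = rng.choice(x, size=n_particles, p=w / w.sum())            # multinomial resampling
    return loglik

# Usage: evaluate the simulated likelihood at a few parameter values.
data = [1, 0, 1, 1, 0, 0, 1, 1, 1, 0]
for rho in (0.5, 0.9):
    print(rho, particle_filter_loglik(data, rho=rho, sigma=0.5))
```

    In a full-solution approach the choice probability would come from solving the dynamic programming problem at each parameter guess; the point here is only how the particle swarm handles the serially correlated latent state.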

    Techno-economic comparison of operational aspects for direct drive and gearbox-driven wind turbines

    The majority of wind turbines currently in operation have the conventional Danish concept design; that is, the three-bladed rotor of such turbines is indirectly coupled with an electrical generator via a gearbox. Recent technological developments have made direct drive wind turbines economically feasible. Potentially, direct drive wind turbines may enjoy higher levels of availability because the gearbox is removed from the design. However, this expectation has so far not been substantiated by detailed analytic calculation. By providing such a calculation, this paper enables a quantitative evaluation of the technical and economic merits of direct drive and gearbox-driven wind turbines.
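    The kind of availability calculation at issue can be illustrated schematically by summing expected downtime over subassemblies and dropping the gearbox term for the direct drive design. The failure rates and repair times below are placeholders, not figures from the paper.

```python
# Placeholder failure rates (failures/year) and repair times (hours); not the paper's data.
failure_rate = {"gearbox": 0.10, "generator": 0.11, "converter": 0.20, "other": 0.50}
repair_hours = {"gearbox": 250.0, "generator": 80.0, "converter": 30.0, "other": 20.0}

def availability(components):
    """Approximate availability from expected downtime over one year (8760 h)."""
    downtime = sum(failure_rate[c] * repair_hours[c] for c in components)
    return 1.0 - downtime / 8760.0

geared = availability(["gearbox", "generator", "converter", "other"])
direct = availability(["generator", "converter", "other"])  # gearbox removed from the design
print(f"geared: {geared:.4f}, direct drive: {direct:.4f}")
```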

    Another Look at the Identification of Dynamic Discrete Decision Processes

    This paper presents an econometric approach to estimate the behavioral effects of counterfactual policy experiments in the context of dynamic decision models where the current utility function and the distribution of unobservables are nonparametrically specified. Previous studies have shown that the identification of the current utility function in dynamic decision models requires stronger assumptions than in static decision models. We show in this paper that knowledge of the current utility function (or of a 'normalized' utility function) is not necessary to identify counterfactual choice probabilities in dynamic models. To identify these counterfactuals we need the probability distribution of the unobservables and the difference between the present value of always choosing the same alternative and the present value of deviating from this strategy for one period. We show that both functions are identified from the factual choice probabilities under similar conditions as in static decision models. Based on this result we propose a nonparametric procedure to estimate the behavioral effects of counterfactual experiments in dynamic decision models. We apply this method to evaluate the effects of an investment subsidy program in the context of a model of machine replacement.
    Keywords: dynamic discrete decision processes, nonparametric identification, counterfactual experiments
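    The sketch below shows the flavour of a counterfactual experiment in a machine replacement model, using a full-solution toy version rather than the paper's nonparametric two-step procedure: solve for logit choice probabilities by value iteration, lower the replacement cost to mimic an investment subsidy, and compare the implied replacement probabilities. All functional forms and parameter values are made up for illustration.

```python
import numpy as np

def replacement_probs(replace_cost, beta=0.95, max_age=10, maint_slope=1.0, n_iter=500):
    """Toy machine replacement model with extreme-value shocks, solved by value iteration.
    'Keep' pays -maint_slope*age; 'replace' pays -replace_cost and resets age to 1.
    Illustrative only; not the paper's nonparametric estimator."""
    ages = np.arange(1, max_age + 1)
    next_idx = np.minimum(ages + 1, max_age) - 1                # next period's age index if kept
    V = np.zeros(max_age)
    for _ in range(n_iter):
        v_keep = -maint_slope * ages + beta * V[next_idx]       # choice-specific value of keeping
        v_repl = -replace_cost + beta * V[0]                    # replacing resets the machine age
        V = np.logaddexp(v_keep, v_repl)                        # expected max under EV1 shocks
    return 1.0 / (1.0 + np.exp(v_keep - v_repl))                # replacement probability by age

baseline = replacement_probs(replace_cost=8.0)
counterfactual = replacement_probs(replace_cost=8.0 - 2.0)      # an investment subsidy of 2
print(np.round(counterfactual - baseline, 3))                   # behavioural effect of the subsidy
```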

    Fusing Loop and GPS Probe Measurements to Estimate Freeway Density

    In an age of ever-increasing penetration of GPS-enabled mobile devices, the potential of real-time "probe" location information for estimating the state of transportation networks is receiving increasing attention. Much work has been done on using probe data to estimate the current speed of vehicle traffic (or equivalently, trip travel time). While travel times are useful to individual drivers, the state variable for a large class of traffic models and control algorithms is vehicle density. Our goal is to use probe data to supplement traditional, fixed-location loop detector data for density estimation. To this end, we derive a method based on Rao-Blackwellized particle filters, a sequential Monte Carlo scheme. We present a simulation where we obtain a 30% reduction in density mean absolute percentage error from fusing loop and probe data, vs. using loop data alone. We also present results using real data from a 19-mile freeway section in Los Angeles, California, where we obtain a 31% reduction. In addition, our method's estimate when using only the real-world probe data, and no loop data, outperformed the estimate produced when only loop data were used (an 18% reduction). These results demonstrate that probe data can be used for traffic density estimation.
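    A drastically simplified version of the fusion idea is sketched below: a plain bootstrap particle filter tracking the density of a single freeway cell from two noisy measurement streams, one of which (the probe stream) may be missing at some time steps. It is not the Rao-Blackwellized filter or the traffic model used in the paper; all noise levels and names are assumptions.

```python
import numpy as np

def fuse_density(loop_obs, probe_obs, loop_sd=8.0, probe_sd=15.0,
                 process_sd=5.0, n_particles=2000, seed=0):
    """Bootstrap particle filter for the density (veh/mile) of one cell, fusing loop
    and probe measurements. Simplified sketch; noise levels are assumed, not estimated."""
    rng = np.random.default_rng(seed)
    particles = rng.uniform(0.0, 200.0, n_particles)                  # initial density guesses
    estimates = []
    for y_loop, y_probe in zip(loop_obs, probe_obs):
        particles = np.clip(particles + process_sd * rng.normal(size=n_particles), 0.0, None)
        logw = -0.5 * ((y_loop - particles) / loop_sd) ** 2           # loop detector likelihood
        if not np.isnan(y_probe):                                     # probe measurement may be absent
            logw += -0.5 * ((y_probe - particles) / probe_sd) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        estimates.append(float(w @ particles))                        # posterior mean density
        particles = rng.choice(particles, size=n_particles, p=w)      # resample
    return estimates

print(fuse_density(loop_obs=[60.0, 65.0, 70.0], probe_obs=[55.0, np.nan, 75.0]))
```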

    Simulation in Statistics

    Simulation has become a standard tool in statistics because it may be the only tool available for analysing some classes of probabilistic models. We review in this paper simulation tools that have been specifically derived to address statistical challenges and, in particular, recent advances in the areas of adaptive Markov chain Monte Carlo (MCMC) algorithms and approximate Bayesian computation (ABC) algorithms.
    Comment: Draft of an advanced tutorial paper for the Proceedings of the 2011 Winter Simulation Conference
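    As a reminder of what the ABC side of the review refers to, here is a minimal rejection-ABC sketch on a toy problem (inferring a normal mean from its sample mean); the example is illustrative and not drawn from the paper.

```python
import numpy as np

def abc_rejection(observed, prior_sampler, simulator, distance, eps, n_draws=20000, seed=0):
    """Minimal ABC rejection sampler: keep prior draws whose simulated summary
    falls within distance eps of the observed summary."""
    rng = np.random.default_rng(seed)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sampler(rng)
        if distance(simulator(theta, rng), observed) < eps:
            accepted.append(theta)
    return np.array(accepted)

# Toy example: posterior for the mean of N(theta, 1) given the sample mean of 50 observations.
obs_summary = np.random.default_rng(1).normal(2.0, 1.0, size=50).mean()
draws = abc_rejection(
    observed=obs_summary,
    prior_sampler=lambda rng: rng.uniform(-5.0, 5.0),
    simulator=lambda theta, rng: rng.normal(theta, 1.0, size=50).mean(),
    distance=lambda a, b: abs(a - b),
    eps=0.1,
)
print(draws.mean(), draws.std())
```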

    Sequential Monte Carlo EM for multivariate probit models

    Multivariate probit models (MPM) have the appealing feature of capturing some of the dependence structure between the components of multidimensional binary responses. The key to the dependence modelling is the covariance matrix of an underlying latent multivariate Gaussian. Most approaches to MLE in multivariate probit regression rely on MCEM algorithms to avoid computationally intensive evaluations of multivariate normal orthant probabilities. As an alternative to the widely used Gibbs sampler, a new SMC sampler for truncated multivariate normals is proposed. The algorithm proceeds in two stages: samples are first drawn from truncated multivariate Student-t distributions and then further evolved towards a Gaussian. The sampler is then embedded in an MCEM algorithm. The sequential nature of SMC methods can be exploited to design a fully sequential version of the EM, where the samples are simply updated from one iteration to the next rather than resampled from scratch. Recycling the samples in this manner significantly reduces the computational cost. An alternative view of the standard conditional maximisation step provides the basis for an iterative procedure to fully perform the maximisation needed in the EM algorithm. The identifiability of MPM is also thoroughly discussed. In particular, the likelihood invariance can be embedded in the EM algorithm to ensure that constrained and unconstrained maximisation are equivalent. A simple iterative procedure is then derived for either maximisation, which takes effectively no computational time. The method is validated by applying it to the widely analysed Six Cities dataset and to a higher dimensional simulated example. Previous approaches to the Six Cities data overly restrict the parameter space but, by considering the correct invariance, the maximum likelihood is quite naturally improved when treating the full unrestricted model.
    Comment: 26 pages, 2 figures. In press, Computational Statistics & Data Analysis
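    To give a concrete feel for the sampling problem, the sketch below uses plain importance sampling with a heavier-tailed multivariate Student-t proposal to approximate expectations under a multivariate normal truncated to an orthant. This is a one-stage simplification of the two-stage SMC sampler described in the abstract, not the paper's algorithm; all dimensions and parameter values are made up.

```python
import numpy as np
from scipy import stats

def truncated_mvn_mean(mean, cov, lower, n_samples=20000, df=4, seed=0):
    """Importance sampling for a multivariate normal truncated to {x >= lower}
    componentwise, using a multivariate Student-t proposal. One-stage sketch of
    the move-from-t-towards-Gaussian idea; not the paper's SMC sampler."""
    rng = np.random.default_rng(seed)
    proposal = stats.multivariate_t(loc=mean, shape=cov, df=df, seed=rng)
    x = proposal.rvs(size=n_samples)
    inside = np.all(x >= lower, axis=1)                               # orthant indicator
    logw = stats.multivariate_normal(mean, cov).logpdf(x) - proposal.logpdf(x)
    w = np.where(inside, np.exp(logw - logw.max()), 0.0)
    w /= w.sum()
    return w @ x, 1.0 / np.sum(w ** 2)                                # weighted mean, effective sample size

mean = np.zeros(3)
cov = 0.5 * np.eye(3) + 0.5                                           # equicorrelated covariance
print(truncated_mvn_mean(mean, cov, lower=np.zeros(3)))
```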

    Testing for Common Values in Canadian Treasury Bill Auctions

    We develop a test for common values in auctions in which some bidders possess information about rivals’ bids. This information causes a bidder to bid differently when she has a private value than when her value depends on rivals’ information. In a divisible good setting, such as treasury bill auctions, bidders with private values who obtain information about rivals’ bids use this information only to update their prior about the distribution of residual supply. In the model with a common value component, they also update their prior about the value of the good being auctioned. We apply the test to data from the Canadian treasury bill market, where some bidders have to route their bids through dealers who also submit bids of their own. Furthermore, we use the structural model to estimate the value of customer order flow to a dealer. We find that the extra information contained in customers’ bids leads on average to an increase in payoff of about 0.5 of a basis point, or 32% of the expected surplus of dealers from participating in these auctions.
    Keywords: multiunit auctions, treasury auctions, structural estimation, nonparametric identification and estimation, test for common values