
Real Options using Markov Chains: an application to Production Capacity Decisions

Abstract

In this work we address investment decisions using real options. A standard numerical approach for valuing real options is dynamic programming, whose basic idea is to establish a discrete-valued lattice of possible future values of the underlying stochastic variable (demand, in our case). In most approaches in the literature, the stochastic variable is assumed to be normally distributed and is then approximated by a binomial distribution, resulting in a binomial lattice. In this work, we investigate the use of a sparse Markov chain to model this variable. The Markov approach is expected to perform better for three reasons: it does not assume any particular distribution for the demand variation; the probability of a change in demand depends on the current demand level and is therefore no longer constant; and it generalizes the binomial lattice, since the latter can be modelled as a Markov chain. We developed a stochastic dynamic programming model that has been implemented on both the binomial and the Markov models. A numerical example of a production capacity choice problem has been solved; the results show that the investment decisions differ and that, as expected, the Markov chain approach leads to a better investment policy.

Keywords: Flexible Capacity Investments, Real Options, Markov Chains, Dynamic Programming
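To make the approach concrete, the following is a minimal sketch, not the authors' implementation, of backward-induction stochastic dynamic programming over a Markov chain of demand states with a single capacity-expansion option. All numbers (demand levels, transition matrix, margin, expansion cost, horizon, discount rate) and the payoff structure are illustrative assumptions; a binomial lattice would correspond to a transition matrix restricted to up/down moves with constant probabilities.

```python
import numpy as np

# Demand lattice: discrete demand levels and a (sparse) row-stochastic transition matrix.
# These values are purely illustrative.
demand_levels = np.array([80.0, 100.0, 120.0, 140.0])   # units per period
P = np.array([[0.70, 0.30, 0.00, 0.00],
              [0.20, 0.60, 0.20, 0.00],
              [0.00, 0.25, 0.60, 0.15],
              [0.00, 0.00, 0.40, 0.60]])

T = 10                        # planning horizon (periods)
disc = 1.0 / 1.08             # per-period discount factor (assumed 8% rate)
margin = 5.0                  # profit per unit sold (assumed)
capacities = [100.0, 150.0]   # base capacity, expanded capacity (assumed)
expansion_cost = 800.0        # one-off cost of exercising the expansion option (assumed)

n_d, n_c = len(demand_levels), len(capacities)

# V[t, d, c]: value at period t, demand state d, capacity state c (zero salvage at T).
V = np.zeros((T + 1, n_d, n_c))

for t in range(T - 1, -1, -1):
    for d in range(n_d):
        for c, cap in enumerate(capacities):
            # Immediate profit: sell min(demand, capacity) at the unit margin.
            profit = margin * min(demand_levels[d], cap)
            # Expected continuation value under the Markov chain transitions,
            # one entry per capacity state.
            cont = disc * P[d] @ V[t + 1, :, :]
            # Option 1: keep the current capacity.
            keep = profit + cont[c]
            # Option 2: exercise the expansion option (only if not yet expanded).
            expand = profit - expansion_cost + cont[c + 1] if c < n_c - 1 else -np.inf
            V[t, d, c] = max(keep, expand)

print("Project value per initial demand state (base capacity):", V[0, :, 0])
```

The optimal policy is read off the same recursion by recording, for each (period, demand, capacity) state, whether expansion or continuation attains the maximum.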
