From Markov Chains to Stochastic Games
Markov chains and Markov decision processes (MDPs) are special cases of stochastic games. A Markov chain describes the dynamics of the states of a stochastic game in which each player has a single action in each state. Similarly, once the players' behavior is fixed, the dynamics of the states of a stochastic game form a Markov chain.
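The reduction can be illustrated with a small sketch (the two-state weather chain below is an assumed toy example, not from the source): when every player has exactly one action per state, the joint action is forced, so the next state depends only on the current state and the fixed transition probabilities.

```python
import random

# Toy transition kernel: with a single (forced) joint action in each
# state, the stochastic game degenerates to this Markov chain.
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def simulate(start, n_steps, seed=0):
    """Sample a trajectory: the next state depends only on the current
    state -- the defining Markov property."""
    rng = random.Random(seed)
    state, path = start, [start]
    for _ in range(n_steps):
        states = list(transitions[state])
        weights = [transitions[state][s] for s in states]
        state = rng.choices(states, weights=weights)[0]
        path.append(state)
    return path

path = simulate("sunny", 10)
```

An MDP is recovered the same way by giving one distinguished player several actions per state while all others remain single-action.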