13 research outputs found

    From Markov Chains to Stochastic Games

    No full text
    Markov chains and Markov decision processes (MDPs) are special cases of stochastic games. Markov chains describe the dynamics of the states of a stochastic game where each player has a single action in each state. Similarly, the dynamics of the states of a stochastic game form a Markov chain.
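    The relationship in the snippet above can be sketched in a few lines of Python: when every state offers exactly one action, the transition kernel depends on the state alone, and simulating the "game" reduces to sampling a Markov chain. The two-state weather chain and its probabilities below are hypothetical, purely for illustration.

    ```python
    import random

    # A Markov chain is a stochastic game / MDP in which each state has a
    # single action, so transitions depend only on the current state.
    # Hypothetical two-state chain; the probabilities are illustrative.
    P = {
        "sunny": {"sunny": 0.8, "rainy": 0.2},
        "rainy": {"sunny": 0.4, "rainy": 0.6},
    }

    def step(state, rng):
        """Sample the next state from the chain's transition distribution."""
        r = rng.random()
        cumulative = 0.0
        for nxt, p in P[state].items():
            cumulative += p
            if r < cumulative:
                return nxt
        return nxt  # guard against floating-point rounding at the boundary

    def simulate(start, n, seed=0):
        """Run the chain for n steps and return the visited state sequence."""
        rng = random.Random(seed)
        path = [start]
        state = start
        for _ in range(n):
            state = step(state, rng)
            path.append(state)
        return path
    ```

    With a fixed seed the sampled trajectory is reproducible, which makes the one-action-per-state reduction easy to check experimentally.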

    Membrane Assembly in Bacteria

    No full text