
Stochastically stable states for perturbed repeated play of coordination games

Posted on: 2003-05-23
Degree: Ph.D
Type: Dissertation
University: University of Illinois at Urbana-Champaign
Candidate: Anderson, Mark Daniel
GTID: 1466390011488260
Subject: Mathematics
Abstract/Summary:
Perturbed repeated play of a two-player, two-move coordination game is modeled as an irreducible Markov process on a set of history states describing the most recent m stages of play. At every stage, each player draws a sample from the history state to forecast his opponent's behavior and either moves to maximize his single-stage expected payoff or commits an error. As the error rate approaches zero, the stationary distributions converge to a stationary distribution for the unperturbed process. Stochastically stable states, which comprise the support of this limiting distribution, are identified by finding minimum-weight spanning trees of a weighted directed graph on the set of recurrent classes of the unperturbed process. When the sample size equals the memory length m, cycles may be present in the unperturbed process; necessary and sufficient conditions for the existence of stochastically stable cycle states are provided. For sufficiently large sample sizes and memory lengths, the stochastically stable states are identified as those states representing repeated play of the risk-dominant Nash equilibrium. Finally, five coordination conditions are introduced to characterize N-player coordination games.
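The perturbed adaptive-play dynamic summarized above can be illustrated with a short simulation. The Python sketch below is not taken from the dissertation: the payoff matrix, memory length m, sample size k, error rate, and run length are hypothetical choices made only for illustration. It records how often each joint history state is visited; under a small error rate the visit frequencies concentrate on the state in which both players repeatedly play the risk-dominant equilibrium.

```python
import random
from collections import Counter, deque

# Stage payoffs U[a][b] for a player choosing action a against action b.
# Hypothetical numbers: (0, 0) and (1, 1) are both strict Nash equilibria,
# (0, 0) is payoff dominant, and (1, 1) is risk dominant.
U = [[4, 0],
     [3, 2]]

def best_response(sample):
    """Best reply to the empirical distribution of a sampled opponent history."""
    counts = Counter(sample)
    payoffs = [sum(counts[b] * U[a][b] for b in (0, 1)) for a in (0, 1)]
    if payoffs[0] == payoffs[1]:
        return random.randint(0, 1)            # break ties uniformly
    return 0 if payoffs[0] > payoffs[1] else 1

def simulate(m=6, k=3, eps=0.05, stages=200_000, seed=0):
    """Perturbed adaptive play with memory m, sample size k, error rate eps.

    Returns a Counter of visit frequencies over joint history states
    (the last m moves of each player).
    """
    random.seed(seed)
    history = [deque([random.randint(0, 1) for _ in range(m)], maxlen=m)
               for _ in range(2)]
    visits = Counter()
    for _ in range(stages):
        moves = []
        for i in (0, 1):
            if random.random() < eps:
                moves.append(random.randint(0, 1))        # error (mutation)
            else:
                sample = random.sample(list(history[1 - i]), k)
                moves.append(best_response(sample))
        for i in (0, 1):
            history[i].append(moves[i])
        visits[(tuple(history[0]), tuple(history[1]))] += 1
    return visits

if __name__ == "__main__":
    visits = simulate()
    for state, count in visits.most_common(3):
        print(count, state)
    # With a small error rate the single most visited state is typically the
    # one in which both players repeat the risk-dominant action 1.
```

Note that this sketch uses a sample size k smaller than the memory length m; the dissertation's case k = m, where cycles can arise in the unperturbed process, can be explored by passing k equal to m.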
Keywords/Search Tags: Repeated play, Coordination, States, Process