Finite State Dynamic Games with Asymmetric Information: A Computational Framework
This paper develops a relatively simple method for computing the Markov Perfect Equilibria (MPE) of dynamic games with asymmetric information (see Maskin and Tirole (1992, 2001)). We consider a class of dynamic games in which there is a finite number of active players in each period, each characterized by a vector of state variables. Some of these state variables are publicly observable, while others are private information. In each period, players' strategies consist of a set of continuous controls and a set of discrete controls. Players' payoffs in each period depend on the players' characteristics in that period and on their choices of controls. We focus, however, only on finite state dynamic games, in which the sets of possible characteristics are finite. We use a reinforcement learning algorithm, similar to the one developed by Pakes and McGuire (2001) for complete information games. To illustrate the algorithm, we use it to compute an MPE of an oligopolistic industry organized as a legal cartel, in which firms know their own costs but do not observe the random outcomes of their competitors' investment processes.
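The Pakes and McGuire (2001) style of algorithm replaces exact value iteration over the full state space with asynchronous updates along a simulated path: each period, only the currently visited state has its value estimate nudged toward the realized payoff plus the discounted continuation value. The sketch below is a toy single-agent, finite-state version of that idea; the state space, payoff function `pi(s) = s`, and transition probabilities are hypothetical placeholders, not the model in the paper.

```python
import random

def learn_values(n_states=5, beta=0.9, n_iters=20000, seed=0):
    """Toy asynchronous stochastic-approximation update in the spirit of
    Pakes and McGuire (2001), for a single agent on a finite state space.

    States 0..n_states-1 are productivity levels; each period the agent
    earns pi(s) = s (placeholder payoff), then moves up with prob. 0.5,
    down with prob. 0.3, and stays put otherwise (hypothetical numbers).
    """
    rng = random.Random(seed)
    V = [0.0] * n_states        # current value estimates
    visits = [0] * n_states     # visit counts, used for the step size
    s = 0
    for _ in range(n_iters):
        profit = float(s)       # placeholder per-period payoff pi(s) = s
        u = rng.random()
        if u < 0.5:
            s_next = min(s + 1, n_states - 1)
        elif u < 0.8:
            s_next = max(s - 1, 0)
        else:
            s_next = s
        # Update only the visited state, with a sample-average step size.
        target = profit + beta * V[s_next]
        visits[s] += 1
        alpha = 1.0 / visits[s]
        V[s] += alpha * (target - V[s])
        s = s_next
    return V
```

Because updates happen only at visited states, memory and per-iteration cost scale with the simulated path rather than with the full state space, which is what makes this approach attractive for larger games with private state variables.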
Date of creation: 2004
RePEc handle: RePEc:red:sed004:41