Updating Strategies Through Observed Play - Optimization Under Bounded Rationality
Abstract: Individuals repeatedly face a multi-decision task with unknown payoff distributions. They have minimal memory and update their strategy by observing the previous play (but not the strategy) of someone else. We select behavior rules that increase average payoffs as often as possible in a large population where all use the same rule. Here imitation generalizes to a pasting procedure. When decisions within the task are unrelated, individuals eventually learn the efficient strategy, but the underlying dynamic is not monotone. However, when choices influence which decisions are subsequently faced in the task, play may not be efficient in the long run, as it approaches a Nash equilibrium of the agent normal form.
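The imitation dynamic described in the abstract can be illustrated with a minimal simulation. The sketch below is an assumption-laden stand-in, not the paper's exact behavior rule: a population of agents each pulls one arm of a Bernoulli multi-armed bandit, observes the realized play (arm and payoff) of one randomly sampled peer, and switches to the peer's arm with probability equal to the positive payoff difference (a proportional imitation rule, whose expected motion is a replicator dynamic). Payoffs are assumed to lie in [0, 1]; the arm means and population size are arbitrary illustrative values.

```python
import random

def simulate(n_agents=1000, means=(0.3, 0.5, 0.8), rounds=200, seed=0):
    """Population learning via play-wise imitation in a Bernoulli bandit.

    Each round, every agent pulls its current arm, then observes the
    play of a random peer and imitates with probability equal to the
    realized payoff difference when that difference is positive.
    Returns the final population share on the best arm.
    """
    rng = random.Random(seed)
    # Start agents on uniformly random arms (no prior knowledge of payoffs).
    arms = [rng.randrange(len(means)) for _ in range(n_agents)]
    for _ in range(rounds):
        # Realized Bernoulli payoffs from each agent's current arm.
        payoffs = [1.0 if rng.random() < means[a] else 0.0 for a in arms]
        new_arms = arms[:]
        for i in range(n_agents):
            j = rng.randrange(n_agents)           # observe a random peer's play
            diff = payoffs[j] - payoffs[i]        # realized payoff difference
            if diff > 0 and rng.random() < diff:  # imitate proportionally
                new_arms[i] = arms[j]
        arms = new_arms
    best = max(range(len(means)), key=means.__getitem__)
    return sum(a == best for a in arms) / n_agents

share_best = simulate()
```

With independent decisions, as in this sketch, the population share on the best arm grows roughly like a discrete replicator dynamic and most agents end up playing the efficient arm, consistent with the abstract's claim that individuals eventually learn the efficient strategy when decisions are unrelated.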
Bibliographic Info: Paper provided by University of Bonn, Germany, in its series Discussion Paper Serie B with number 432.
Date of creation: Apr 1998
Contact details of provider:
Postal: Bonn Graduate School of Economics, University of Bonn, Adenauerallee 24 - 26, 53113 Bonn, Germany
Fax: +49 228 73 6884
Web page: http://www.bgse.uni-bonn.de/index.php?id=517
Keywords: Multi-Armed Bandit; improving; undominated behavioral rule; play-wise imitating; replicator dynamic; monotone dynamics
JEL classification:
- C72 - Mathematical and Quantitative Methods - - Game Theory and Bargaining Theory - - - Noncooperative Games
- C79 - Mathematical and Quantitative Methods - - Game Theory and Bargaining Theory - - - Other