Which one should I imitate?
Abstract: We consider the model of social learning by Schlag (1996). Individuals must repeatedly choose an action in a multi-armed bandit. We assume that each individual observes the outcomes of two other individuals' choices before her own next choice must be made -- the original model only allows for one observation. Selection of optimal behavior yields a variant of the proportional imitation rule -- the optimal rule based on one observation. When each individual uses this rule, the adaptation of actions in an infinite population follows an aggregate monotone dynamic.
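The one-observation proportional imitation rule that the abstract takes as its benchmark can be sketched as follows. This is an illustrative implementation of the standard formulation (switch to the observed action with probability proportional to the positive part of the payoff difference, normalized by the payoff range); the function and variable names are our own, and the two-observation variant derived in this paper is not reproduced here.

```python
import random

def proportional_imitation(own_action, own_payoff, obs_action, obs_payoff,
                           min_payoff, max_payoff):
    """One-observation proportional imitation rule (illustrative sketch).

    Switch to the observed individual's action with probability
    proportional to the positive part of the payoff difference,
    normalized by the width of the payoff interval.
    """
    if obs_action == own_action:
        return own_action
    diff = obs_payoff - own_payoff
    # Never imitate an action that performed no better than one's own.
    switch_prob = max(diff, 0.0) / (max_payoff - min_payoff)
    return obs_action if random.random() < switch_prob else own_action
```

Under this rule an individual never switches to a worse-performing action, and switches for sure only when the observed payoff attains the maximum while her own attains the minimum.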
Bibliographic Info: Paper provided by University of Bonn, Germany in its series Discussion Paper Serie B with number 365.
Date of creation: Mar 1996
Contact details of provider:
Postal: Bonn Graduate School of Economics, University of Bonn, Adenauerallee 24 - 26, 53113 Bonn, Germany
Fax: +49 228 73 6884
Web page: http://www.bgse.uni-bonn.de
Keywords: social learning; multi-armed bandit; imitation; payoff increasing; proportional imitation rule; aggregate monotone dynamic.
JEL classification:
- C72 - Mathematical and Quantitative Methods - Game Theory and Bargaining Theory - Noncooperative Games
- C79 - Mathematical and Quantitative Methods - Game Theory and Bargaining Theory - Other
References:
- Samuelson, Larry & Zhang, Jianbo, 1992. "Evolutionary stability in asymmetric games," Journal of Economic Theory, Elsevier, vol. 57(2), pages 363-391, August.
- Samuelson, L., 1989. "Evolutionary Stability in Asymmetric Games," 11-8-2, Pennsylvania State - Department of Economics.
- Friedman, Daniel, 1991. "Evolutionary Games in Economics," Econometrica, Econometric Society, vol. 59(3), pages 637-66, May.
- Dan Friedman, 2010. "Evolutionary Games in Economics," Levine's Working Paper Archive 392, David K. Levine.
- Fudenberg, Drew & Ellison, Glenn, 1995. "Word-of-Mouth Communication and Social Learning," 3196300, Harvard University Department of Economics.
- Karl H. Schlag. "Why Imitate, and if so, How? A Bounded Rational Approach to Multi-Armed Bandits," ELSE working papers 028, ESRC Centre on Economics Learning and Social Evolution.
- Schlag, Karl H., 1998. "Why Imitate, and If So, How? A Boundedly Rational Approach to Multi-armed Bandits," Journal of Economic Theory, Elsevier, vol. 78(1), pages 130-156, January.
- Karl H. Schlag, 1995. "Why Imitate, and if so, How? A Bounded Rational Approach to Multi-Armed Bandits," Discussion Paper Serie B 361, University of Bonn, Germany, revised Mar 1996.
- Eshel, I. & Samuelson, L. & Shaked, A., 1996. "Altruists, Egoists and Hooligans in a Local Interaction Model," 9612r, Wisconsin Madison - Social Systems.
- Björnerstedt, Jonas & Karl H. Schlag, 1996. "On the Evolution of Imitative Behavior," Discussion Paper Serie B 378, University of Bonn, Germany.
- Rothschild, Michael, 1974. "A two-armed bandit theory of market pricing," Journal of Economic Theory, Elsevier, vol. 9(2), pages 185-202, October.
- L. Samuelson & J. Zhang, 2010. "Evolutionary Stability in Asymmetric Games," Levine's Working Paper Archive 453, David K. Levine.
This item has more than 25 citations. To prevent cluttering this page, these citations are listed on a separate page.