Strategic experimentation with Poisson bandits
We study a game of strategic experimentation with two-armed bandits where the risky arm distributes lump-sum payoffs according to a Poisson process. Its intensity is either high or low, and unknown to the players. We consider Markov perfect equilibria with beliefs as the state variable and show that all such equilibria exhibit an 'encouragement effect' relative to the single-agent optimum. There is no equilibrium in which all players use cut-off strategies. Owing to the encouragement effect, asymmetric equilibria in which players take turns playing the risky arm before all experimentation stops Pareto dominate the unique symmetric equilibrium. Rewarding the last experimenter with a higher continuation value increases the range of beliefs where players experiment, but may reduce the intensity of experimentation at more optimistic beliefs. This suggests that there is no equilibrium that uniformly maximizes the players' average payoff.
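The belief dynamics described above (players update a common belief that the risky arm's Poisson intensity is high, based on observed lump-sum arrivals) can be illustrated with a minimal Bayesian sketch. The parameter values below are purely illustrative and are not taken from the paper; the update rule is the standard likelihood-ratio formula for a Poisson process of unknown intensity.

```python
import math

def posterior_high(prior, arrivals, t, lam_high, lam_low):
    """Posterior probability that the risky arm is the high-intensity type,
    after observing `arrivals` lump-sum payoffs over total experimentation
    time t. Likelihoods are Poisson: P(n | lam) ∝ lam^n * exp(-lam * t)."""
    like_high = (lam_high ** arrivals) * math.exp(-lam_high * t)
    like_low = (lam_low ** arrivals) * math.exp(-lam_low * t)
    num = prior * like_high
    return num / (num + (1 - prior) * like_low)

# Illustrative intensities (hypothetical, not from the paper):
LAM_HIGH, LAM_LOW = 2.0, 0.5

# Starting from an even prior, a payoff-free spell pushes the belief down,
# while observed lump sums push it up.
p_after_silence = posterior_high(0.5, 0, 1.0, LAM_HIGH, LAM_LOW)
p_after_payoffs = posterior_high(0.5, 3, 1.0, LAM_HIGH, LAM_LOW)
```

In this sketch, `p_after_silence` falls below the prior and `p_after_payoffs` rises above it, which is the monotone belief drift that drives the cut-off logic (and its failure in equilibrium) discussed in the abstract.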