Achieving Pareto Optimality Through Distributed Learning
We propose a simple payoff-based learning rule that is completely decentralized and that leads to an efficient configuration of actions in any n-person finite strategic-form game with generic payoffs. The algorithm follows the theme of exploration versus exploitation and is hence stochastic in nature. We prove that if all agents adhere to this algorithm, then they select the action profile that maximizes the sum of the agents' payoffs a high proportion of the time. The algorithm requires no communication. Agents respond solely to changes in their own realized payoffs, which are affected by the actions of other agents in the system in ways that they do not necessarily understand. The method can be applied to the optimization of complex systems with many distributed components, such as the routing of information in networks and the design and control of wind farms. The proof of the proposed learning algorithm relies on the theory of large deviations for perturbed Markov chains.
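The abstract does not spell out the learning rule itself, so the following is only a deliberately simplified sketch of the exploration-versus-exploitation theme it describes, not the authors' actual algorithm: each agent occasionally experiments with a random action and adopts it only if its own realized payoff improves over its benchmark. The example game and all names are hypothetical.

```python
import random

# Hypothetical 2-player game: u_i(own, other) = 2*own + other, so
# action 1 is strictly dominant for both players and the profile
# (1, 1) maximizes the sum of payoffs.
def payoff(own, other):
    return 2 * own + other

def simulate(steps=4000, eps=0.1, seed=0):
    """Run a simple payoff-based trial-and-error dynamic.

    Each agent knows only its own realized payoff. With probability
    eps it experiments with the other action; the experimental action
    is kept only if the realized payoff beats the agent's benchmark
    (its last payoff from playing its benchmark action).
    """
    rng = random.Random(seed)
    actions = [0, 0]            # current benchmark actions
    benchmarks = [None, None]   # last realized payoff at the benchmark
    at_optimum = 0
    for t in range(steps):
        # each agent independently decides whether to experiment
        trial = [rng.random() < eps for _ in range(2)]
        played = [1 - actions[i] if trial[i] else actions[i]
                  for i in range(2)]
        pays = [payoff(played[0], played[1]),
                payoff(played[1], played[0])]
        for i in range(2):
            if trial[i]:
                # adopt the experimental action only if it improved
                if benchmarks[i] is None or pays[i] > benchmarks[i]:
                    actions[i] = played[i]
                    benchmarks[i] = pays[i]
            else:
                benchmarks[i] = pays[i]
        # count time spent at the welfare-maximizing profile
        # (second half only, to skip the transient)
        if t >= steps // 2 and played == [1, 1]:
            at_optimum += 1
    return at_optimum / (steps - steps // 2)
```

In this toy game the dynamic settles on (1, 1) and the agents play it a high proportion of the time thereafter, leaving only for brief experiments. The paper's result is far stronger: the welfare-maximizing profile is selected even when it is not an equilibrium of this naive better-reply dynamic, which requires a more elaborate rule than the one sketched here.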
Date of creation: 01 Jul 2011
Contact details of provider: Postal: Manor Rd. Building, Oxford, OX1 3UQ
Web page: https://www.economics.ox.ac.uk/