Efficient Markov perfect Nash equilibria: theory and application to dynamic fishery games
Abstract: In this paper, we present a method for characterizing Markov perfect Nash equilibria that are Pareto efficient in non-linear differential games. To that end, we use a new method for computing Nash equilibria in Markov strategies by means of a system of quasilinear partial differential equations. We apply the resulting necessary and sufficient conditions for efficient Markov perfect Nash equilibria to dynamic fishery games.
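The kind of fishery game the abstract refers to can be illustrated with a classic closed-form benchmark. The model below is a standard log-utility fishery game in the Clemhout–Wan tradition, not the paper's own specification: stock dynamics x' = x(a − e1 − e2), player i's payoff ∫ e^{−ρt} ln(e_i x) dt, and the conjectured Markov perfect equilibrium is the constant feedback effort e_i = ρ. The symbols a, ρ, and the conjectured value function are illustrative assumptions, verified symbolically:

```python
import sympy as sp

# Illustrative log-utility fishery game (Clemhout-Wan type), NOT the paper's model:
# state dynamics  x' = x*(a - e1 - e2);  player i maximizes  integral exp(-rho*t) ln(e_i x) dt.
x, rho, a = sp.symbols('x rho a', positive=True)
e = sp.Symbol('e', positive=True)  # player 1's effort (control)

# Conjecture V(x) = ln(x)/rho + B, with B chosen to match the constant terms of the HJB.
B = (sp.log(rho) + a / rho - 2) / rho
V = sp.log(x) / rho + B
Vp = sp.diff(V, x)

# Opponent plays the candidate Markov (feedback) strategy e2 = rho.
e2 = rho
hjb_rhs = sp.log(e * x) + Vp * x * (a - e - e2)

# First-order condition pins down player 1's best reply: e* = rho.
e_star = sp.solve(sp.diff(hjb_rhs, e), e)[0]
assert sp.simplify(e_star - rho) == 0

# The HJB equation rho*V = max_e {...} holds identically at e = rho.
residual = sp.simplify(sp.expand_log(rho * V - hjb_rhs.subs(e, rho)))
assert residual == 0
```

Because both players face the same problem, the symmetric pair (e1, e2) = (ρ, ρ) is a Markov perfect Nash equilibrium of this benchmark; whether it is Pareto efficient is exactly the kind of question the paper's PDE-based conditions are designed to answer.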
Bibliographic Info: Paper provided by Universidad Carlos III de Madrid in its series Open Access publications from Universidad Carlos III de Madrid with number info:hdl:10016/5577.
Length: 24 p. (pp. 1073-1096)
Date of creation: 2005
Publication status: Published in Journal of Economic Dynamics and Control (2005) v. 29, p.1073-1096
Contact details of provider:
Web page: http://www.uc3m.es
Keywords: Differential games; Markov-perfect Nash equilibria; Pareto optimum; Quasilinear partial differential equations; Fishery management
Other versions of this item:
- Martin-Herran, G. & Rincon-Zapatero, J.P., 2005. "Efficient Markov perfect Nash equilibria: theory and application to dynamic fishery games," Journal of Economic Dynamics and Control, Elsevier, vol. 29(6), pages 1073-1096, June.
- Chiarella, Carl, et al, 1984. "On the Economics of International Fisheries," International Economic Review, Department of Economics, University of Pennsylvania and Osaka University Institute of Social and Economic Research Association, vol. 25(1), pages 85-92, February.
- Rincon-Zapatero, J. P., 2004. "Characterization of Markovian equilibria in a class of differential games," Journal of Economic Dynamics and Control, Elsevier, vol. 28(7), pages 1243-1266, April.
- Rincón-Zapatero, Juan Pablo, 2004. "Characterization of Markovian equilibria in a class of differential games," Open Access publications from Universidad Carlos III de Madrid info:hdl:10016/5575, Universidad Carlos III de Madrid.
- Ehtamo, Harri & Hamalainen, Raimo P., 1993. "A cooperative incentive equilibrium for a resource management problem," Journal of Economic Dynamics and Control, Elsevier, vol. 17(4), pages 659-678, July.
- Dockner, Engelbert J. & Jorgensen, Steffen & Long, Ngo Van & Sorger, Gerhard, 2000. "Differential Games in Economics and Management Science," Cambridge Books, Cambridge University Press, number 9780521637329, October.
- Haurie, Alain & Pohjola, Matti, 1987. "Efficient equilibria in a differential game of capitalism," Journal of Economic Dynamics and Control, Elsevier, vol. 11(1), pages 65-78, March.
- Wang, Hefei, 2012. "Costly information transmission in continuous time with implications for credit rating announcements," Journal of Economic Dynamics and Control, Elsevier, vol. 36(9), pages 1402-1413.
- Ricardo Josa-Fombellida & Juan Pablo Rincón-Zapatero, 2008. "Markov Perfect Nash Equilibrium in stochastic differential games as solution of a generalized Euler Equations System," Economics Working Papers we086731, Universidad Carlos III, Departamento de Economía.
- Ngo Long, 2011. "Dynamic Games in the Economics of Natural Resources: A Survey," Dynamic Games and Applications, Springer, vol. 1(1), pages 115-148, March.
- Beard, Rodney, 2008. "A dynamic model of renewable resource harvesting with Bertrand competition," MPRA Paper 8916, University Library of Munich, Germany.