Pareto Frontier of a Dynamic Principal–Agent Model with Discrete Actions: An Evolutionary Multi-Objective Approach
In this article, a dynamic Principal–Agent model with discrete actions is analysed within a Multi-Objective optimization framework. As a result, a concave Pareto Frontier is numerically approximated. The concavity of the Pareto Frontier is a consequence of the information asymmetry between the Principal and the Agent. The underlying Multi-Objective framework allows us to consider more powerful assumptions than those used in the traditional Single-Objective optimization approach. As contracts move along the Pareto Frontier (trade-off surface) towards those that are more advantageous to the Agent, the prevalence of compensation plans in which the Principal assumes most of the risk of the productive activity is observed. When the Principal and the Agent are more patient, both obtain higher values of their discounted expected utilities, which generates a higher level of economic surplus. The Agent faces lower variability in future compensation when it is costlier for him to exert an additional unit of effort. Finally, a new Multi-Objective Evolutionary Algorithm (MOEA) is proposed in this article to approximate Pareto Frontiers; this algorithm incorporates an innovative ranking-mutation mechanism that promotes approximations with good spread, achieving results that improve on some obtained by well-known MOEAs. Copyright Springer Science+Business Media, LLC. 2012
Volume 40, Issue 4 (December 2012)
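The central object of the abstract is the Pareto Frontier: the set of contracts for which no other contract gives both the Principal and the Agent a higher discounted expected utility. As a minimal illustration of this idea (not the ranking-mutation MOEA proposed in the article), the sketch below filters a hypothetical set of candidate contracts, each summarised by an assumed (Principal utility, Agent utility) pair, down to its non-dominated subset:

```python
# Minimal sketch: extracting the Pareto frontier (non-dominated set) of a
# bi-objective maximization problem. This is a generic illustration, NOT the
# MOEA from the article; the candidate utility pairs below are hypothetical.

def dominates(a, b):
    """True if objective vector a weakly dominates b (maximization)
    and is strictly better in at least one objective."""
    return (all(x >= y for x, y in zip(a, b))
            and any(x > y for x, y in zip(a, b)))

def pareto_frontier(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical (Principal utility, Agent utility) pairs for candidate contracts.
candidates = [(1.0, 4.0), (2.0, 3.5), (3.0, 2.0), (2.5, 2.5), (0.5, 1.0)]
frontier = sorted(pareto_frontier(candidates))
print(frontier)  # the non-dominated contracts, ordered by Principal utility
```

Moving along the resulting frontier from high Agent utility to high Principal utility corresponds to the trade-off the article explores; a concave frontier means intermediate contracts can yield a larger joint surplus than the extremes.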
Handle: RePEc:kap:compec:v:40:y:2012:i:4:p:415-443