
Social Preferences and Rating Biases in Subjective Performance Evaluations

Author

  • Kusterer, David (University of Cologne)
  • Sliwka, Dirk (University of Cologne)

Abstract

We study the determinants of biases in subjective performance evaluations in an MTurk experiment designed to test the implications of a standard formal framework of rational subjective evaluations. In the experiment, subjects in the role of workers work on a real-effort task. Subjects in the role of supervisors observe subsamples of the workers' output and assess their performance. We conduct six experimental treatments varying (i) whether workers' pay depends on the performance evaluation, (ii) whether supervisors are paid for the accuracy of their evaluations, and (iii) the precision of the information available to supervisors. In line with the predictions of the model of optimal evaluations, we find that ratings are more lenient and less accurate when they determine bonus payments, and that rewards for accuracy reduce leniency. When supervisors have access to more detailed performance information, their ratings vary more strongly with observed performance. In contrast to the model's prediction, we do not find that more prosocial supervisors always provide more lenient ratings; rather, they invest more time in the rating task and achieve higher rating accuracy.

Suggested Citation

  • Kusterer, David & Sliwka, Dirk, 2022. "Social Preferences and Rating Biases in Subjective Performance Evaluations," IZA Discussion Papers 15496, Institute of Labor Economics (IZA).
  • Handle: RePEc:iza:izadps:dp15496

    Download full text from publisher

    File URL: https://docs.iza.org/dp15496.pdf
    Download Restriction: no

    References listed on IDEAS

    1. Alexander W Cappelen & Johanna Mollerstrom & Bjørn-Atle Reme & Bertil Tungodden, 2022. "A Meritocratic Origin of Egalitarian Behaviour," The Economic Journal, Royal Economic Society, vol. 132(646), pages 2101-2117.
    2. John Horton & David Rand & Richard Zeckhauser, 2011. "The online laboratory: conducting experiments in a real labor market," Experimental Economics, Springer;Economic Science Association, vol. 14(3), pages 399-425, September.
    3. Marchegiani, Lucia & Reggiani, Tommaso & Rizzolli, Matteo, 2016. "Loss averse agents and lenient supervisors in performance appraisal," Journal of Economic Behavior & Organization, Elsevier, vol. 131(PA), pages 183-197.
    4. Antonio A. Arechar & Simon Gächter & Lucas Molleman, 2018. "Conducting interactive experiments online," Experimental Economics, Springer;Economic Science Association, vol. 21(1), pages 99-131, March.
    5. Thomas Dohmen & Armin Falk & David Huffman & Uwe Sunde, 2009. "Homo Reciprocans: Survey Evidence on Behavioural Outcomes," Economic Journal, Royal Economic Society, vol. 119(536), pages 592-612, March.
    6. Markussen, Thomas & Putterman, Louis & Tyran, Jean-Robert, 2016. "Judicial error and cooperation," European Economic Review, Elsevier, vol. 89(C), pages 372-388.
    7. Alexander Sebald & Markus Walzl, 2014. "Subjective Performance Evaluations and Reciprocity in Principal–Agent Relations," Scandinavian Journal of Economics, Wiley Blackwell, vol. 116(2), pages 570-590, April.
    8. Sarah C. Rice, 2012. "Reputation and Uncertainty in Online Markets: An Experimental Study," Information Systems Research, INFORMS, vol. 23(2), pages 436-452, June.
    9. Charles Bellemare & Alexander Sebald, 2018. "Self-Confidence and Reactions to Subjective Performance Evaluations," CESifo Working Paper Series 7325, CESifo.
    10. Grosch, Kerstin & Rau, Holger, 2017. "Gender differences in honesty: The role of social value orientation," University of Göttingen Working Papers in Economics 308, University of Goettingen, Department of Economics.
    11. Chen, Daniel L. & Schonger, Martin & Wickens, Chris, 2016. "oTree—An open-source platform for laboratory, online, and field experiments," Journal of Behavioral and Experimental Finance, Elsevier, vol. 9(C), pages 88-97.
    12. Logan S. Casey & Jesse Chandler & Adam Seth Levine & Andrew Proctor & Dara Z. Strolovitch, 2017. "Intertemporal Differences Among MTurk Workers: Time-Based Sample Variations and Implications for Online Data Collection," SAGE Open, , vol. 7(2), pages 21582440177, June.
    13. Axel Ockenfels & Dirk Sliwka & Peter Werner, 2015. "Bonus Payments and Reference Point Violations," Management Science, INFORMS, vol. 61(7), pages 1496-1513, July.
    14. Alexander W. Cappelen & Astri Drange Hole & Erik Ø. Sørensen & Bertil Tungodden, 2007. "The Pluralism of Fairness Ideals: An Experimental Approach," American Economic Review, American Economic Association, vol. 97(3), pages 818-827, June.
    15. Bol, Jasmijn C. & Kramer, Stephan & Maas, Victor S., 2016. "How control system design affects performance evaluation compression: The role of information accuracy and outcome transparency," Accounting, Organizations and Society, Elsevier, vol. 51(C), pages 64-73.
    16. Prendergast, Canice & Topel, Robert H, 1996. "Favoritism in Organizations," Journal of Political Economy, University of Chicago Press, vol. 104(5), pages 958-978, October.
    17. Paolacci, Gabriele & Chandler, Jesse & Ipeirotis, Panagiotis G., 2010. "Running experiments on Amazon Mechanical Turk," Judgment and Decision Making, Cambridge University Press, vol. 5(5), pages 411-419, August.
    18. Chen, Chia-Ching & Chiu, I-Ming & Smith, John & Yamada, Tetsuji, 2013. "Too smart to be selfish? Measures of cognitive ability, social preferences, and consistency," Journal of Economic Behavior & Organization, Elsevier, vol. 90(C), pages 112-122.
    19. Kathrin Manthei & Dirk Sliwka, 2019. "Multitasking and Subjective Performance Evaluations: Theory and Evidence from a Field Experiment in a Bank," Management Science, INFORMS, vol. 65(12), pages 5861-5883, December.
    20. Canice Prendergast, 1999. "The Provision of Incentives in Firms," Journal of Economic Literature, American Economic Association, vol. 37(1), pages 7-63, March.
    21. Johannes Berger & Christine Harbring & Dirk Sliwka, 2013. "Performance Appraisals and the Impact of Forced Distribution--An Experimental Investigation," Management Science, INFORMS, vol. 59(1), pages 54-68, June.
    22. B. William Demeré & Karen L. Sedatole & Alexander Woods, 2019. "The Role of Calibration Committees in Subjective Performance Evaluation Systems," Management Science, INFORMS, vol. 65(4), pages 1562-1585, April.
    23. Grabner, Isabella & Künneke, Judith & Moers, Frank, 2020. "How calibration committees can mitigate performance evaluation bias: An analysis of implicit incentives," Department for Strategy and Innovation Working Paper Series 04/2020, WU Vienna University of Economics and Business.
    24. Golman, Russell & Bhatia, Sudeep, 2012. "Performance evaluation inflation and compression," Accounting, Organizations and Society, Elsevier, vol. 37(8), pages 534-543.
    25. Murphy, Ryan O. & Ackermann, Kurt A. & Handgraaf, Michel J. J., 2011. "Measuring Social Value Orientation," Judgment and Decision Making, Cambridge University Press, vol. 6(8), pages 771-781, December.
    26. Gary E. Bolton & David J. Kusterer & Johannes Mans, 2019. "Inflated Reputations: Uncertainty, Leniency, and Moral Wiggle Room in Trader Feedback Systems," Management Science, INFORMS, vol. 65(11), pages 5371-5391, November.
    27. Grosch, Kerstin & Rau, Holger A., 2017. "Gender differences in honesty: The role of social value orientation," Journal of Economic Psychology, Elsevier, vol. 62(C), pages 258-267.

    Citations

    Cited by:

    1. Grund, Christian & Soboll, Alexandra, 2023. "Monetary Rewards, Hierarchy Level and Working Hours as Drivers of Employees' Self-Evaluations," IZA Discussion Papers 16042, Institute of Labor Economics (IZA).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Ockenfels, Axel & Sliwka, Dirk & Werner, Peter, 2024. "Multi-Rater Performance Evaluations and Incentives," IZA Discussion Papers 16812, Institute of Labor Economics (IZA).
    2. Gary E. Bolton & David J. Kusterer & Johannes Mans, 2019. "Inflated Reputations: Uncertainty, Leniency, and Moral Wiggle Room in Trader Feedback Systems," Management Science, INFORMS, vol. 65(11), pages 5371-5391, November.
    3. Patrick Kampkötter & Dirk Sliwka, 2016. "The Complementary Use of Experiments and Field Data to Evaluate Management Practices: The Case of Subjective Performance Evaluations," Journal of Institutional and Theoretical Economics (JITE), Mohr Siebeck, Tübingen, vol. 172(2), pages 364-389, June.
    4. Hyndman, Kyle & Walker, Matthew J., 2022. "Fairness and risk in ultimatum bargaining," Games and Economic Behavior, Elsevier, vol. 132(C), pages 90-105.
    5. Kathrin Manthei & Dirk Sliwka, 2019. "Multitasking and Subjective Performance Evaluations: Theory and Evidence from a Field Experiment in a Bank," Management Science, INFORMS, vol. 65(12), pages 5861-5883, December.
    6. Benistant, Julien & Villeval, Marie Claire, 2019. "Unethical behavior and group identity in contests," Journal of Economic Psychology, Elsevier, vol. 72(C), pages 128-155.
    7. Wladislaw Mill & John Morgan, 2022. "The cost of a divided America: an experimental study into destructive behavior," Experimental Economics, Springer;Economic Science Association, vol. 25(3), pages 974-1001, June.
    8. De Chiara, Alessandro & Livio, Luca, 2017. "The threat of corruption and the optimal supervisory task," Journal of Economic Behavior & Organization, Elsevier, vol. 133(C), pages 172-186.
    9. Marco Kleine & Sebastian Kube, 2015. "Communication and Trust in Principal-Team Relationships: Experimental Evidence," Discussion Paper Series of the Max Planck Institute for Research on Collective Goods 2015_06, Max Planck Institute for Research on Collective Goods.
    10. Letina, Igor & Liu, Shuo & Netzer, Nick, 2020. "Delegating performance evaluation," Theoretical Economics, Econometric Society, vol. 15(2), May.
    11. Marchegiani, Lucia & Reggiani, Tommaso & Rizzolli, Matteo, 2016. "Loss averse agents and lenient supervisors in performance appraisal," Journal of Economic Behavior & Organization, Elsevier, vol. 131(PA), pages 183-197.
    12. Irene Trapp & Rouven Trapp, 2019. "The psychological effects of centrality bias: an experimental analysis," Journal of Business Economics, Springer, vol. 89(2), pages 155-189, March.
    13. Palan, Stefan & Schitter, Christian, 2018. "Prolific.ac—A subject pool for online experiments," Journal of Behavioral and Experimental Finance, Elsevier, vol. 17(C), pages 22-27.
    14. Fumagalli, Elena & Rezaei, Sarah & Salomons, Anna, 2022. "OK computer: Worker perceptions of algorithmic recruitment," Research Policy, Elsevier, vol. 51(2).
    15. Nana Adrian & Ann-Kathrin Crede & Jonas Gehrlein, 2019. "Market Interaction and the Focus on Consequences in Moral Decision Making," Diskussionsschriften dp1905, Universitaet Bern, Departement Volkswirtschaft.
    16. Grosch, Kerstin & Rau, Holger A., 2017. "Do discriminatory pay regimes unleash antisocial behavior?," University of Göttingen Working Papers in Economics 315, University of Goettingen, Department of Economics.
    17. Angelovski, Andrej & Brandts, Jordi & Sola, Carles, 2016. "Hiring and escalation bias in subjective performance evaluations: A laboratory experiment," Journal of Economic Behavior & Organization, Elsevier, vol. 121(C), pages 114-129.
    18. Gandullia, Luca & Lezzi, Emanuela & Parciasepe, Paolo, 2020. "Replication with MTurk of the experimental design by Gangadharan, Grossman, Jones & Leister (2018): Charitable giving across donor types," Journal of Economic Psychology, Elsevier, vol. 78(C).
    19. Prissé, Benjamin & Jorrat, Diego, 2022. "Lab vs online experiments: No differences," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 100(C).
    20. Yifei Huang & Matt Shum & Xi Wu & Jason Zezhong Xiao, 2019. "Discovery of Bias and Strategic Behavior in Crowdsourced Performance Assessment," Papers 1908.01718, arXiv.org, revised Oct 2019.

    More about this item

    Keywords

    subjective performance evaluation; bias; bonuses; differentiation; social preferences

    JEL classification:

    • J33 - Labor and Demographic Economics - - Wages, Compensation, and Labor Costs - - - Compensation Packages; Payment Methods
    • C91 - Mathematical and Quantitative Methods - - Design of Experiments - - - Laboratory, Individual Behavior
    • M52 - Business Administration and Business Economics; Marketing; Accounting; Personnel Economics - - Personnel Economics - - - Compensation and Compensation Methods and Their Effects


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:iza:izadps:dp15496. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Holger Hinte (email available below). General contact details of provider: https://edirc.repec.org/data/izaaade.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.