
Effects of pay rate and instructions on attrition in crowdsourcing research

Author

Listed:
  • Carolyn M Ritchey
  • Corina Jimenez-Gomez
  • Christopher A Podlesnik

Abstract

Researchers in the social sciences increasingly rely on crowdsourcing marketplaces such as Amazon Mechanical Turk (MTurk) and Prolific to facilitate rapid, low-cost data collection from large samples. However, crowdsourcing suffers from high attrition, threatening the validity of crowdsourced studies. Separate studies have demonstrated that (1) higher pay rates and (2) additional instructions (i.e., informing participants about task requirements, asking for personal information, and describing the negative impact of attrition on research quality) can reduce attrition rates with MTurk participants. The present study extended research on these possible remedies for attrition to Prolific, another crowdsourcing marketplace with strict requirements for participant pay. We randomly assigned 225 participants to one of four groups. Across groups, we evaluated effects of pay rates commensurate with or double the US minimum wage, expanding the upper range of this independent variable; two groups also received additional instructions. Higher pay reduced attrition and correlated with more accurate performance on experimental tasks, but we observed no effect of additional instructions. Overall, our findings suggest that effects of increased pay on attrition generalize to higher minimum pay rates and across crowdsourcing platforms. In contrast, effects of additional instructions might not generalize across task durations, task types, or crowdsourcing platforms.

Suggested Citation

  • Carolyn M Ritchey & Corina Jimenez-Gomez & Christopher A Podlesnik, 2023. "Effects of pay rate and instructions on attrition in crowdsourcing research," PLOS ONE, Public Library of Science, vol. 18(10), pages 1-10, October.
  • Handle: RePEc:plo:pone00:0292372
    DOI: 10.1371/journal.pone.0292372

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0292372
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0292372&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0292372?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
    ---><---

    References listed on IDEAS

    1. Antonio A. Arechar & Simon Gächter & Lucas Molleman, 2018. "Conducting interactive experiments online," Experimental Economics, Springer;Economic Science Association, vol. 21(1), pages 99-131, March.
    2. Matthew J C Crump & John V McDonnell & Todd M Gureckis, 2013. "Evaluating Amazon's Mechanical Turk as a Tool for Experimental Behavioral Research," PLOS ONE, Public Library of Science, vol. 8(3), pages 1-18, March.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Nevo, Saggi, 2025. "Atypical entrepreneurs in the venture idea elaboration phase," Journal of Business Venturing, Elsevier, vol. 40(2).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Dato, Simon & Feess, Eberhard & Nieken, Petra, 2019. "Lying and reciprocity," Games and Economic Behavior, Elsevier, vol. 118(C), pages 193-218.
    2. Gandullia, Luca & Lezzi, Emanuela & Parciasepe, Paolo, 2020. "Replication with MTurk of the experimental design by Gangadharan, Grossman, Jones & Leister (2018): Charitable giving across donor types," Journal of Economic Psychology, Elsevier, vol. 78(C).
    3. Bajoori, Elnaz & Peeters, Ronald & Wolk, Leonard, 2024. "Security auctions with cash- and equity-bids: An experimental study," European Economic Review, Elsevier, vol. 163(C).
    4. Yulin Hswen & Ulrich Nguemdjo & Elad Yom-Tov & Gregory M Marcus & Bruno Ventelou, 2022. "Individuals’ willingness to provide geospatial global positioning system (GPS) data from their smartphone during the COVID-19 pandemic," Palgrave Communications, Palgrave Macmillan, vol. 9(1), pages 1-8, December.
    5. Irene Maria Buso & Daniela Di Cagno & Sofia De Caprariis & Lorenzo Ferrari & Vittorio Larocca & Luisa Lorè & Francesca Marazzi & Luca Panaccione & Lorenzo Spadoni, 2020. "Lab-like Findings of Non-Lab Experiments: a Methodological Proposal and Validation," Working Papers CESARE 3/2020, Dipartimento di Economia e Finanza, LUISS Guido Carli.
    6. Irene Maria Buso & Daniela Di Cagno & Sofia De Caprariis & Lorenzo Ferrari & Vittorio Larocca & Francesca Marazzi & Luca Panaccione & Lorenzo Spadoni, 2020. "The Show Must Go On: How to Elicit Lablike Data on the Effects of COVID-19 Lockdown on Fairness and Cooperation," Working Papers CESARE 2/2020, Dipartimento di Economia e Finanza, LUISS Guido Carli.
    7. Palan, Stefan & Schitter, Christian, 2018. "Prolific.ac—A subject pool for online experiments," Journal of Behavioral and Experimental Finance, Elsevier, vol. 17(C), pages 22-27.
    8. Nathan W. Chan & Stephen Knowles & Ronald Peeters & Leonard Wolk, 2024. "Cost-(in)effective public good provision: an experimental exploration," Theory and Decision, Springer, vol. 96(3), pages 397-442, May.
    9. Burdea, Valeria & Woon, Jonathan, 2022. "Online belief elicitation methods," Journal of Economic Psychology, Elsevier, vol. 90(C).
    10. Chan, Nathan W. & Knowles, Stephen & Peeters, Ronald & Wolk, Leonard, 2024. "On generosity in public good and charitable dictator games," Journal of Economic Behavior & Organization, Elsevier, vol. 224(C), pages 624-640.
    11. Elnaz Bajoori & Leonard Wolk & Ronald Peeters, 2019. "Security auctions with cash- and equity-bids: An experimental study," Department of Economics Working Papers 58167, University of Bath, Department of Economics, revised 08 Mar 2023.
    12. Irene Maria Buso & Daniela Di Cagno & Lorenzo Ferrari & Vittorio Larocca & Luisa Lorè & Francesca Marazzi & Luca Panaccione & Lorenzo Spadoni, 2021. "Lab-like findings from online experiments," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 7(2), pages 184-193, December.
    13. Gary Bolton & Eugen Dimant & Ulrich Schmidt, 2018. "When a Nudge Backfires. Using Observation with Social and Economic Incentives to Promote Pro-Social Behavior," PPE Working Papers 0017, Philosophy, Politics and Economics, University of Pennsylvania.
    14. C. Mónica Capra & Bing Jiang & Yuxin Su, 2022. "Do pledges lead to more volunteering? An experimental study," Economic Inquiry, Western Economic Association International, vol. 60(1), pages 87-100, January.
    15. Maude Lavanchy & Patrick Reichert & Jayanth Narayanan & Krishna Savani, 2023. "Applicants’ Fairness Perceptions of Algorithm-Driven Hiring Procedures," Journal of Business Ethics, Springer, vol. 188(1), pages 125-150, November.
    16. Hindsley, Paul & McEvoy, David M. & Morgan, O. Ashton, 2020. "Consumer Demand for Ethical Products and the Role of Cultural Worldviews: The Case of Direct-Trade Coffee," Ecological Economics, Elsevier, vol. 177(C).
    17. Capraro, Valerio & Rodriguez-Lara, Ismael & Ruiz-Martos, Maria J., 2020. "Preferences for efficiency, rather than preferences for morality, drive cooperation in the one-shot Stag-Hunt game," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 86(C).
    18. Bernard, Kévin & Bonein, Aurélie & Bougherara, Douadia, 2020. "Consumer inequality aversion and risk preferences in community supported agriculture," Ecological Economics, Elsevier, vol. 175(C).
    19. Feess, Eberhard & Schilling, Thomas & Timofeyev, Yuriy, 2023. "Misreporting in teams with individual decision making: The impact of information and communication," Journal of Economic Behavior & Organization, Elsevier, vol. 209(C), pages 509-532.
    20. Grewenig, Elisabeth & Lergetporer, Philipp & Werner, Katharina & Woessmann, Ludger, 2022. "Incentives, search engines, and the elicitation of subjective beliefs: Evidence from representative online survey experiments," Journal of Econometrics, Elsevier, vol. 231(1), pages 304-326.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0292372. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link it to an item in RePEc, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.