IDEAS home Printed from https://ideas.repec.org/a/plo/pone00/0292372.html

Effects of pay rate and instructions on attrition in crowdsourcing research

Author

Listed:
  • Carolyn M Ritchey
  • Corina Jimenez-Gomez
  • Christopher A Podlesnik

Abstract

Researchers in the social sciences increasingly rely on crowdsourcing marketplaces such as Amazon Mechanical Turk (MTurk) and Prolific to facilitate rapid, low-cost data collection from large samples. However, crowdsourcing suffers from high attrition, threatening the validity of crowdsourced studies. Separate studies have demonstrated that (1) higher pay rates and (2) additional instructions (i.e., informing participants about task requirements, asking for personal information, and describing the negative impact of attrition on research quality) can reduce attrition rates with MTurk participants. The present study extended research on these possible remedies for attrition to Prolific, another crowdsourcing marketplace with strict requirements for participant pay. We randomly assigned 225 participants to one of four groups. Across groups, we evaluated effects of pay rates commensurate with or double the US minimum wage, expanding the upper range of this independent variable; two groups also received additional instructions. Higher pay reduced attrition and correlated with more accurate performance on experimental tasks, but we observed no effect of additional instructions. Overall, our findings suggest that effects of increased pay on attrition generalize to higher minimum pay rates and across crowdsourcing platforms. In contrast, effects of additional instructions might not generalize across task durations, task types, or crowdsourcing platforms.

Suggested Citation

  • Carolyn M Ritchey & Corina Jimenez-Gomez & Christopher A Podlesnik, 2023. "Effects of pay rate and instructions on attrition in crowdsourcing research," PLOS ONE, Public Library of Science, vol. 18(10), pages 1-10, October.
  • Handle: RePEc:plo:pone00:0292372
    DOI: 10.1371/journal.pone.0292372

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0292372
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0292372&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0292372?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to a location where you can use your library subscription to access this item.
    ---><---

    References listed on IDEAS

    1. Antonio A. Arechar & Simon Gächter & Lucas Molleman, 2018. "Conducting interactive experiments online," Experimental Economics, Springer;Economic Science Association, vol. 21(1), pages 99-131, March.
    2. Matthew J C Crump & John V McDonnell & Todd M Gureckis, 2013. "Evaluating Amazon's Mechanical Turk as a Tool for Experimental Behavioral Research," PLOS ONE, Public Library of Science, vol. 8(3), pages 1-18, March.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Nevo, Saggi, 2025. "Atypical entrepreneurs in the venture idea elaboration phase," Journal of Business Venturing, Elsevier, vol. 40(2).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Gandullia, Luca & Lezzi, Emanuela & Parciasepe, Paolo, 2020. "Replication with MTurk of the experimental design by Gangadharan, Grossman, Jones & Leister (2018): Charitable giving across donor types," Journal of Economic Psychology, Elsevier, vol. 78(C).
    2. Irene Maria Buso & Daniela Di Cagno & Sofia De Caprariis & Lorenzo Ferrari & Vittorio Larocca & Francesca Marazzi & Luca Panaccione & Lorenzo Spadoni, 2020. "The Show Must Go On: How to Elicit Lablike Data on the Effects of COVID-19 Lockdown on Fairness and Cooperation," Working Papers CESARE 2/2020, Dipartimento di Economia e Finanza, LUISS Guido Carli.
    3. Nathan W. Chan & Stephen Knowles & Ronald Peeters & Leonard Wolk, 2024. "Cost-(in)effective public good provision: an experimental exploration," Theory and Decision, Springer, vol. 96(3), pages 397-442, May.
    4. Dato, Simon & Feess, Eberhard & Nieken, Petra, 2019. "Lying and reciprocity," Games and Economic Behavior, Elsevier, vol. 118(C), pages 193-218.
    5. Burdea, Valeria & Woon, Jonathan, 2022. "Online belief elicitation methods," Journal of Economic Psychology, Elsevier, vol. 90(C).
    6. Yulin Hswen & Ulrich Nguemdjo & Elad Yom-Tov & Gregory M Marcus & Bruno Ventelou, 2022. "Individuals’ willingness to provide geospatial global positioning system (GPS) data from their smartphone during the COVID-19 pandemic," Palgrave Communications, Palgrave Macmillan, vol. 9(1), pages 1-8, December.
    7. Bajoori, Elnaz & Peeters, Ronald & Wolk, Leonard, 2024. "Security auctions with cash- and equity-bids: An experimental study," European Economic Review, Elsevier, vol. 163(C).
    8. Irene Maria Buso & Daniela Di Cagno & Sofia De Caprariis & Lorenzo Ferrari & Vittorio Larocca & Luisa Lorè & Francesca Marazzi & Luca Panaccione & Lorenzo Spadoni, 2020. "Lab-like Findings of Non-Lab Experiments: a Methodological Proposal and Validation," Working Papers CESARE 3/2020, Dipartimento di Economia e Finanza, LUISS Guido Carli.
    9. Palan, Stefan & Schitter, Christian, 2018. "Prolific.ac—A subject pool for online experiments," Journal of Behavioral and Experimental Finance, Elsevier, vol. 17(C), pages 22-27.
    10. Chan, Nathan W. & Knowles, Stephen & Peeters, Ronald & Wolk, Leonard, 2024. "On generosity in public good and charitable dictator games," Journal of Economic Behavior & Organization, Elsevier, vol. 224(C), pages 624-640.
    11. Elnaz Bajoori & Leonard Wolk & Ronald Peeters, 2019. "Security auctions with cash- and equity-bids: An experimental study," Department of Economics Working Papers 58167, University of Bath, Department of Economics, revised 08 Mar 2023.
    12. Irene Maria Buso & Daniela Di Cagno & Lorenzo Ferrari & Vittorio Larocca & Luisa Lorè & Francesca Marazzi & Luca Panaccione & Lorenzo Spadoni, 2021. "Lab-like findings from online experiments," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 7(2), pages 184-193, December.
    13. Feess, Eberhard & Schilling, Thomas & Timofeyev, Yuriy, 2023. "Misreporting in teams with individual decision making: The impact of information and communication," Journal of Economic Behavior & Organization, Elsevier, vol. 209(C), pages 509-532.
    14. Ronayne, David & Sgroi, Daniel & Tuckwell, Anthony, 2021. "Evaluating the sunk cost effect," Journal of Economic Behavior & Organization, Elsevier, vol. 186(C), pages 318-327.
    15. Prissé, Benjamin & Jorrat, Diego, 2022. "Lab vs online experiments: No differences," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 100(C).
    16. Kas, Judith, 2022. "The effect of online reputation systems on intergroup inequality," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 96(C).
    17. Paulina Krzywicka & Katarzyna Byrka, 2020. "The Effect of Animate-Inanimate Soundscapes and Framing on Environments' Evaluation and Predicted Recreation Time," IJERPH, MDPI, vol. 17(23), pages 1-16, December.
    18. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    19. Pablo Brañas-Garza & Diego Jorrat & Antonio M. Espín & Angel Sánchez, 2023. "Paid and hypothetical time preferences are the same: lab, field and online evidence," Experimental Economics, Springer;Economic Science Association, vol. 26(2), pages 412-434, April.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0292372. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. Doing so allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.