
An experimental characterization of workers’ behavior and accuracy in crowdsourced tasks

Authors
  • Evgenia Christoforou
  • Antonio Fernández Anta
  • Angel Sánchez

Abstract

Crowdsourcing systems are becoming a powerful tool of choice for handling repetitive or lengthy human-based tasks. Prominent among them is Amazon Mechanical Turk (MTurk), in which Human Intelligence Tasks (HITs) are posted by requesters and then selected and executed by subscribed (human) workers on the platform. These HITs often serve research purposes. In this context, a very important question is how reliable the results obtained through these platforms are, given the limited control a requester has over the workers’ actions. Various control techniques have been proposed, but they are not free of shortcomings, and their use must be accompanied by a deeper understanding of workers’ behavior. In this work, we attempt to interpret workers’ behavior and reliability in the absence of control techniques. To do so, we perform a series of experiments with 600 distinct MTurk workers, specifically designed to elicit each worker’s level of dedication to a task according to the task’s nature and difficulty. We show that the time a worker needs to carry out a task correlates with the task’s difficulty and with the quality of the outcome. We also find that there are different types of workers: some are willing to invest a significant amount of time to arrive at the correct answer, while a significant fraction reply with a wrong answer. For the latter, the difficulty of the task and the very short time they took to reply suggest that they intentionally did not even attempt to solve the task.
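To make the reported relationships concrete, below is a minimal sketch of how one might test the correlations the abstract describes on data of this kind. It assumes a hypothetical CSV export with columns response_time_s, difficulty, and correct; neither the file nor the column names nor the 5-second threshold come from the paper.

    # Minimal sketch: correlating per-task response time with task difficulty
    # and outcome quality, in the spirit of the analysis described above.
    # The data file and column names are hypothetical, not from the paper.
    import pandas as pd
    from scipy.stats import spearmanr, pointbiserialr

    df = pd.read_csv("worker_responses.csv")  # hypothetical export of HIT results

    # Spearman rank correlation: does response time grow with task difficulty?
    rho_diff, p_diff = spearmanr(df["response_time_s"], df["difficulty"])

    # Point-biserial correlation: is response time related to answering correctly?
    r_corr, p_corr = pointbiserialr(df["correct"], df["response_time_s"])

    print(f"time vs difficulty:  rho={rho_diff:.2f} (p={p_diff:.3g})")
    print(f"time vs correctness: r={r_corr:.2f} (p={p_corr:.3g})")

    # Flag suspiciously fast wrong answers, which the paper attributes to workers
    # who likely did not attempt the task (the 5 s threshold is an assumption).
    fast_wrong = df[(df["response_time_s"] < 5) & (df["correct"] == 0)]
    print(f"fast-and-wrong responses: {len(fast_wrong)} of {len(df)}")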

Suggested Citation

  • Evgenia Christoforou & Antonio Fernández Anta & Angel Sánchez, 2021. "An experimental characterization of workers’ behavior and accuracy in crowdsourced tasks," PLOS ONE, Public Library of Science, vol. 16(6), pages 1-14, June.
  • Handle: RePEc:plo:pone00:0252604
    DOI: 10.1371/journal.pone.0252604

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0252604
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0252604&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0252604?utm_source=ideas
LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Yamada, Katsunori & Sato, Masayuki, 2013. "Another avenue for anatomy of income comparisons: Evidence from hypothetical choice experiments," Journal of Economic Behavior & Organization, Elsevier, vol. 89(C), pages 35-57.
    2. Lechthaler, Wolfgang & Ring, Patrick, 2021. "Labor force participation, job search effort and unemployment insurance in the laboratory," Journal of Economic Behavior & Organization, Elsevier, vol. 189(C), pages 748-778.
    3. Heinicke, Franziska & Rosenkranz, Stephanie & Weitzel, Utz, 2019. "The effect of pledges on the distribution of lying behavior: An online experiment," Journal of Economic Psychology, Elsevier, vol. 73(C), pages 136-151.
    4. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
5. Jean-Marc Bourgeon & José de Sousa & Alexis Noir-Luhalwe, 2022. "Social Distancing and Risk Taking: Evidence from a Team Game Show," SciencePo Working papers Main hal-03792423, HAL.
    6. Mariconda, Simone & Lurati, Francesco, 2015. "Does familiarity breed stability? The role of familiarity in moderating the effects of new information on reputation judgments," Journal of Business Research, Elsevier, vol. 68(5), pages 957-964.
    7. Ingar Haaland & Christopher Roth & Johannes Wohlfart, 2023. "Designing Information Provision Experiments," Journal of Economic Literature, American Economic Association, vol. 61(1), pages 3-40, March.
    8. Simon Gächter & Lingbo Huang & Martin Sefton, 2016. "Combining “real effort” with induced effort costs: the ball-catching task," Experimental Economics, Springer;Economic Science Association, vol. 19(4), pages 687-712, December.
    9. Masha Shunko & Julie Niederhoff & Yaroslav Rosokha, 2018. "Humans Are Not Machines: The Behavioral Impact of Queueing Design on Service Time," Management Science, INFORMS, vol. 64(1), pages 453-473, January.
    10. L. Mundaca & H. Moncreiff, 2021. "New Perspectives on Green Energy Defaults," Journal of Consumer Policy, Springer, vol. 44(3), pages 357-383, September.
    11. Sandro Ambuehl & B. Douglas Bernheim & Annamaria Lusardi, 2022. "Evaluating Deliberative Competence: A Simple Method with an Application to Financial Choice," American Economic Review, American Economic Association, vol. 112(11), pages 3584-3626, November.
    12. Chen, Daniel L. & Schonger, Martin & Wickens, Chris, 2016. "oTree—An open-source platform for laboratory, online, and field experiments," Journal of Behavioral and Experimental Finance, Elsevier, vol. 9(C), pages 88-97.
    13. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell Us about Publication Bias and p-Hacking in Online Experiments," IZA Discussion Papers 15478, Institute of Labor Economics (IZA).
    14. Guenther, Isabel & Tetteh-Baah, Samuel Kofi, 2019. "The impact of discrimination on redistributive preferences and productivity: experimental evidence from the United States," VfS Annual Conference 2019 (Leipzig): 30 Years after the Fall of the Berlin Wall - Democracy and Market Economy 203652, Verein für Socialpolitik / German Economic Association.
    15. Matthew C. Weinzierl, 2016. "A Welfarist Role for Nonwelfarist Rules: An example with envy," Harvard Business School Working Papers 17-021, Harvard Business School, revised Jul 2017.
    16. Jeanette A.M.J. Deetlefs & Mathew Chylinski & Andreas Ortmann, 2015. "MTurk ‘Unscrubbed’: Exploring the good, the ‘Super’, and the unreliable on Amazon’s Mechanical Turk," Discussion Papers 2015-20, School of Economics, The University of New South Wales.
    17. Jérôme Hergueux & Nicolas Jacquemet, 2015. "Social preferences in the online laboratory: a randomized experiment," Experimental Economics, Springer;Economic Science Association, vol. 18(2), pages 251-283, June.
    18. Haas, Nicholas & Hassan, Mazen & Mansour, Sarah & Morton, Rebecca B., 2021. "Polarizing information and support for reform," Journal of Economic Behavior & Organization, Elsevier, vol. 185(C), pages 883-901.
    19. Cantarella, Michele & Strozzi, Chiara, 2019. "Workers in the Crowd: The Labour Market Impact of the Online Platform Economy," IZA Discussion Papers 12327, Institute of Labor Economics (IZA).
    20. Atalay, Kadir & Bakhtiar, Fayzan & Cheung, Stephen & Slonim, Robert, 2014. "Savings and prize-linked savings accounts," Journal of Economic Behavior & Organization, Elsevier, vol. 107(PA), pages 86-106.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0252604. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. Doing so allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.