Printed from https://ideas.repec.org/a/eee/jeborg/v90y2013icp123-133.html

Breaking monotony with meaning: Motivation in crowdsourcing markets

Authors

  • Chandler, Dana
  • Kapelner, Adam

Abstract

We conduct the first natural field experiment to explore the relationship between the “meaningfulness” of a task and worker effort. We employed about 2500 workers from Amazon's Mechanical Turk (MTurk), an online labor market, to label medical images. Although all workers were given an identical task, we experimentally manipulated how the task was framed: subjects in the meaningful treatment were told that they were labeling tumor cells in order to assist medical researchers; subjects in the zero-context condition (the control group) were not told the purpose of the task; and, in stark contrast, subjects in the shredded treatment were not given context and were additionally told that their work would be discarded. We found that when a task was framed more meaningfully, workers were more likely to participate. We also found that the meaningful treatment increased the quantity of output (with an insignificant change in quality), while the shredded treatment decreased the quality of output (with no change in quantity). We believe these results will generalize to other short-term labor markets. Our study also discusses MTurk as an exciting platform for running natural field experiments in economics.

Suggested Citation

  • Chandler, Dana & Kapelner, Adam, 2013. "Breaking monotony with meaning: Motivation in crowdsourcing markets," Journal of Economic Behavior & Organization, Elsevier, vol. 90(C), pages 123-133.
  • Handle: RePEc:eee:jeborg:v:90:y:2013:i:c:p:123-133
    DOI: 10.1016/j.jebo.2013.03.003

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S016726811300036X
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.jebo.2013.03.003?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
    ---><---

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. John Horton & David Rand & Richard Zeckhauser, 2011. "The online laboratory: conducting experiments in a real labor market," Experimental Economics, Springer;Economic Science Association, vol. 14(3), pages 399-425, September.
    2. Angrist, Joshua D, 2001. "Estimations of Limited Dependent Variable Models with Dummy Endogenous Regressors: Simple Strategies for Empirical Practice," Journal of Business & Economic Statistics, American Statistical Association, vol. 19(1), pages 2-16, January.
    3. Glenn W. Harrison & John A. List, 2004. "Field Experiments," Journal of Economic Literature, American Economic Association, vol. 42(4), pages 1009-1055, December.
    4. repec:cup:judgdm:v:5:y:2010:i:5:p:411-419 is not listed on IDEAS
    5. Preston, Anne E, 1989. "The Nonprofit Worker in a For-Profit World," Journal of Labor Economics, University of Chicago Press, vol. 7(4), pages 438-463, October.
    6. Joseph Henrich & Steve J. Heine & Ara Norenzayan, 2010. "The Weirdest People in the World?," RatSWD Working Papers 139, German Data Forum (RatSWD).
    7. Levitt, Steven D. & List, John A., 2009. "Field experiments in economics: The past, the present, and the future," European Economic Review, Elsevier, vol. 53(1), pages 1-18, January.
    8. Scott Stern, 2004. "Do Scientists Pay to Be Scientists?," Management Science, INFORMS, vol. 50(6), pages 835-853, June.
    9. Uri Gneezy & John A List, 2006. "Putting Behavioral Economics to Work: Testing for Gift Exchange in Labor Markets Using Field Experiments," Econometrica, Econometric Society, vol. 74(5), pages 1365-1384, September.
    10. Steven D. Levitt & John A. List, 2007. "What Do Laboratory Experiments Measuring Social Preferences Reveal About the Real World?," Journal of Economic Perspectives, American Economic Association, vol. 21(2), pages 153-174, Spring.
    11. Rachel Croson & Uri Gneezy, 2009. "Gender Differences in Preferences," Journal of Economic Literature, American Economic Association, vol. 47(2), pages 448-474, June.
    12. Ariely, Dan & Kamenica, Emir & Prelec, Drazen, 2008. "Man's search for meaning: The case of Legos," Journal of Economic Behavior & Organization, Elsevier, vol. 67(3-4), pages 671-677, September.
    13. repec:feb:artefa:0087 is not listed on IDEAS
    14. repec:cup:judgdm:v:5:y:2010:i:3:p:159-163 is not listed on IDEAS
    15. Berinsky, Adam J. & Huber, Gregory A. & Lenz, Gabriel S., 2012. "Evaluating Online Labor Markets for Experimental Research: Amazon.com's Mechanical Turk," Political Analysis, Cambridge University Press, vol. 20(3), pages 351-368, July.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    2. Vanessa C. Burbano, 2016. "Social Responsibility Messages and Worker Wage Requirements: Field Experimental Evidence from Online Labor Marketplaces," Organization Science, INFORMS, vol. 27(4), pages 1010-1028, August.
    3. Omar Al-Ubaydli & John A. List, 2019. "How natural field experiments have enhanced our understanding of unemployment," Nature Human Behaviour, Nature, vol. 3(1), pages 33-39, January.
    4. Omar Al-Ubaydli & John List, 2013. "On the Generalizability of Experimental Results in Economics: With A Response To Camerer," Artefactual Field Experiments j0001, The Field Experiments Website.
    5. Gandullia, Luca & Lezzi, Emanuela & Parciasepe, Paolo, 2020. "Replication with MTurk of the experimental design by Gangadharan, Grossman, Jones & Leister (2018): Charitable giving across donor types," Journal of Economic Psychology, Elsevier, vol. 78(C).
    6. Douglas Davis, 2016. "Experimental Methods for the General Economist: Five Lessons from the Lab," Southern Economic Journal, John Wiley & Sons, vol. 82(4), pages 1046-1058, April.
    7. Handberg, Øyvind Nystad & Angelsen, Arild, 2015. "Experimental tests of tropical forest conservation measures," Journal of Economic Behavior & Organization, Elsevier, vol. 118(C), pages 346-359.
    8. Omar Al-Ubaydli & John A. List, 2013. "On the Generalizability of Experimental Results in Economics: With a Response to Commentors," CESifo Working Paper Series 4543, CESifo.
    9. Nicolas Jacquemet & Olivier L’Haridon & Isabelle Vialle, 2014. "Marché du travail, évaluation et économie expérimentale," Revue française d'économie, Presses de Sciences-Po, vol. 0(1), pages 189-226.
    10. Handberg, Øyvind Nystad & Angelsen, Arild, 2019. "Pay little, get little; pay more, get a little more: A framed forest experiment in Tanzania," Ecological Economics, Elsevier, vol. 156(C), pages 454-467.
    11. Jonathan H.W. Tan & Zhao Zichen & Daniel John Zizzo, 2023. "Scientific Inference from Field and Laboratory Economic Experiments: Empirical Evidence," Discussion Papers Series 663, School of Economics, University of Queensland, Australia.
    12. Goeschl, Timo & Kettner, Sara Elisa & Lohse, Johannes & Schwieren, Christiane, 2015. "What do we learn from public good games about voluntary climate action? Evidence from an artefactual field experiment," Working Papers 0595, University of Heidelberg, Department of Economics.
    13. Goeschl, Timo & Kettner, Sara Elisa & Lohse, Johannes & Schwieren, Christiane, 2020. "How much can we learn about voluntary climate action from behavior in public goods games?," Ecological Economics, Elsevier, vol. 171(C).
    14. Gruener, Sven & Lehberger, Mira & Hirschauer, Norbert & Mußhoff, Oliver, 2021. "How (un-)informative are experiments with “standard subjects” for other social groups? – The case of agricultural students and farmers," SocArXiv psda5, Center for Open Science.
    15. Adrian Chadi & Sabrina Jeworrek & Vanessa Mertins, 2017. "When the Meaning of Work Has Disappeared: Experimental Evidence on Employees’ Performance and Emotions," Management Science, INFORMS, vol. 63(6), pages 1696-1707, June.
    16. John A. List, 2024. "Optimally generate policy-based evidence before scaling," Nature, Nature, vol. 626(7999), pages 491-499, February.
    17. Bouma, J.A. & Nguyen, Binh & van der Heijden, Eline & Dijk, J.J., 2018. "Analysing Group Contract Design Using a Lab and a Lab-in-the-Field Threshold Public Good Experiment," Discussion Paper 2018-049, Tilburg University, Center for Economic Research.
    18. Stefano DellaVigna, 2009. "Psychology and Economics: Evidence from the Field," Journal of Economic Literature, American Economic Association, vol. 47(2), pages 315-372, June.
    19. Chen, Daniel L. & Schonger, Martin & Wickens, Chris, 2016. "oTree—An open-source platform for laboratory, online, and field experiments," Journal of Behavioral and Experimental Finance, Elsevier, vol. 9(C), pages 88-97.
    20. Omar Al-Ubaydli & John List, 2016. "Field Experiments in Markets," Artefactual Field Experiments j0002, The Field Experiments Website.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:jeborg:v:90:y:2013:i:c:p:123-133. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. Doing so links your profile to this item and also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations awaiting confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/jebo.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.