
Breaking monotony with meaning: Motivation in crowdsourcing markets

Author

Listed:
  • Chandler, Dana
  • Kapelner, Adam

Abstract

We conduct the first natural field experiment to explore the relationship between the “meaningfulness” of a task and worker effort. We employed about 2500 workers from Amazon's Mechanical Turk (MTurk), an online labor market, to label medical images. Although given an identical task, we experimentally manipulated how the task was framed. Subjects in the meaningful treatment were told that they were labeling tumor cells in order to assist medical researchers; subjects in the zero-context condition (the control group) were not told the purpose of the task; and, in stark contrast, subjects in the shredded treatment were not given context and were additionally told that their work would be discarded. We found that when a task was framed more meaningfully, workers were more likely to participate. We also found that the meaningful treatment increased the quantity of output (with an insignificant change in quality), while the shredded treatment decreased the quality of output (with no change in quantity). We believe these results will generalize to other short-term labor markets. Our study also discusses MTurk as an exciting platform for running natural field experiments in economics.

Suggested Citation

  • Chandler, Dana & Kapelner, Adam, 2013. "Breaking monotony with meaning: Motivation in crowdsourcing markets," Journal of Economic Behavior & Organization, Elsevier, vol. 90(C), pages 123-133.
  • Handle: RePEc:eee:jeborg:v:90:y:2013:i:c:p:123-133
    DOI: 10.1016/j.jebo.2013.03.003

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S016726811300036X
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.jebo.2013.03.003?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
    ---><---

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. John Horton & David Rand & Richard Zeckhauser, 2011. "The online laboratory: conducting experiments in a real labor market," Experimental Economics, Springer;Economic Science Association, vol. 14(3), pages 399-425, September.
    2. Paolacci, Gabriele & Chandler, Jesse & Ipeirotis, Panagiotis G., 2010. "Running experiments on Amazon Mechanical Turk," Judgment and Decision Making, Cambridge University Press, vol. 5(5), pages 411-419, August.
    3. Ariely, Dan & Kamenica, Emir & Prelec, Drazen, 2008. "Man's search for meaning: The case of Legos," Journal of Economic Behavior & Organization, Elsevier, vol. 67(3-4), pages 671-677, September.
    4. Kimmo Eriksson & Brent Simpson, 2010. "Emotional reactions to losing explain gender differences in entering a risky lottery," Judgment and Decision Making, Society for Judgment and Decision Making, vol. 5(3), pages 159-163, June.
    5. Glenn W. Harrison & John A. List, 2004. "Field Experiments," Journal of Economic Literature, American Economic Association, vol. 42(4), pages 1009-1055, December.
    6. Joseph Henrich & Steve J. Heine & Ara Norenzayan, 2010. "The Weirdest People in the World?," RatSWD Working Papers 139, German Data Forum (RatSWD).
    7. Uri Gneezy & John A List, 2006. "Putting Behavioral Economics to Work: Testing for Gift Exchange in Labor Markets Using Field Experiments," Econometrica, Econometric Society, vol. 74(5), pages 1365-1384, September.
    8. Levitt, Steven D. & List, John A., 2009. "Field experiments in economics: The past, the present, and the future," European Economic Review, Elsevier, vol. 53(1), pages 1-18, January.
    9. Scott Stern, 2004. "Do Scientists Pay to Be Scientists?," Management Science, INFORMS, vol. 50(6), pages 835-853, June.
    10. Angrist, Joshua D, 2001. "Estimation of Limited Dependent Variable Models with Dummy Endogenous Regressors: Simple Strategies for Empirical Practice," Journal of Business & Economic Statistics, American Statistical Association, vol. 19(1), pages 2-16, January.
    11. Berinsky, Adam J. & Huber, Gregory A. & Lenz, Gabriel S., 2012. "Evaluating Online Labor Markets for Experimental Research: Amazon.com's Mechanical Turk," Political Analysis, Cambridge University Press, vol. 20(3), pages 351-368, July.
    12. Steven D. Levitt & John A. List, 2007. "What Do Laboratory Experiments Measuring Social Preferences Reveal About the Real World?," Journal of Economic Perspectives, American Economic Association, vol. 21(2), pages 153-174, Spring.
    13. Rachel Croson & Uri Gneezy, 2009. "Gender Differences in Preferences," Journal of Economic Literature, American Economic Association, vol. 47(2), pages 448-474, June.
    14. Preston, Anne E, 1989. "The Nonprofit Worker in a For-Profit World," Journal of Labor Economics, University of Chicago Press, vol. 7(4), pages 438-463, October.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    2. Vanessa C. Burbano, 2016. "Social Responsibility Messages and Worker Wage Requirements: Field Experimental Evidence from Online Labor Marketplaces," Organization Science, INFORMS, vol. 27(4), pages 1010-1028, August.
    3. Gandullia, Luca & Lezzi, Emanuela & Parciasepe, Paolo, 2020. "Replication with MTurk of the experimental design by Gangadharan, Grossman, Jones & Leister (2018): Charitable giving across donor types," Journal of Economic Psychology, Elsevier, vol. 78(C).
    4. Tim Straub & Henner Gimpel & Florian Teschner & Christof Weinhardt, 2015. "How (not) to Incent Crowd Workers," Business & Information Systems Engineering: The International Journal of WIRTSCHAFTSINFORMATIK, Springer;Gesellschaft für Informatik e.V. (GI), vol. 57(3), pages 167-179, June.
    5. Hiroki Ozono & Daisuke Nakama, 2022. "Effects of experimental situation on group cooperation and individual performance: Comparing laboratory and online experiments," PLOS ONE, Public Library of Science, vol. 17(4), pages 1-17, April.
    6. Eriksen, Kristoffer W. & Fest, Sebastian & Kvaløy, Ola & Dijk, Oege, 2022. "Fair advice," Journal of Banking & Finance, Elsevier, vol. 143(C).
    7. Atalay, Kadir & Bakhtiar, Fayzan & Cheung, Stephen & Slonim, Robert, 2014. "Savings and prize-linked savings accounts," Journal of Economic Behavior & Organization, Elsevier, vol. 107(PA), pages 86-106.
    8. Omar Al-Ubaydli & John A. List, 2019. "How natural field experiments have enhanced our understanding of unemployment," Nature Human Behaviour, Nature, vol. 3(1), pages 33-39, January.
    9. Haas, Nicholas & Hassan, Mazen & Mansour, Sarah & Morton, Rebecca B., 2021. "Polarizing information and support for reform," Journal of Economic Behavior & Organization, Elsevier, vol. 185(C), pages 883-901.
    10. Cantarella, Michele & Strozzi, Chiara, 2019. "Workers in the Crowd: The Labour Market Impact of the Online Platform Economy," IZA Discussion Papers 12327, IZA Network @ LISER.
    11. Azzam, Tarek & Harman, Elena, 2016. "Crowdsourcing for quantifying transcripts: An exploratory study," Evaluation and Program Planning, Elsevier, vol. 54(C), pages 63-73.
    12. Omar Al-Ubaydli & John List, 2013. "On the Generalizability of Experimental Results in Economics: With A Response To Camerer," Artefactual Field Experiments j0001, The Field Experiments Website.
    13. Wladislaw Mill & Cornelius Schneider, 2023. "The Bright Side of Tax Evasion," CESifo Working Paper Series 10615, CESifo.
    14. Prissé, Benjamin & Jorrat, Diego, 2022. "Lab vs online experiments: No differences," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 100(C).
    15. Valerio Capraro & Hélène Barcelo, 2021. "Punishing defectors and rewarding cooperators: Do people discriminate between genders?," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 7(1), pages 19-32, September.
    16. Gupta, Vishal K. & Goktan, A. Banu & Gunay, Gonca, 2014. "Gender differences in evaluation of new business opportunity: A stereotype threat perspective," Journal of Business Venturing, Elsevier, vol. 29(2), pages 273-288.
    17. Douglas Davis, 2016. "Experimental Methods for the General Economist: Five Lessons from the Lab," Southern Economic Journal, John Wiley & Sons, vol. 82(4), pages 1046-1058, April.
    18. Lefgren, Lars J. & Sims, David P. & Stoddard, Olga B., 2016. "Effort, luck, and voting for redistribution," Journal of Public Economics, Elsevier, vol. 143(C), pages 89-97.
    19. Bhatt, Vipul & Smith, Angela M., 2025. "Overconfidence and performance: Evidence from a simple real-effort task," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 114(C).
    20. Handberg, Øyvind Nystad & Angelsen, Arild, 2015. "Experimental tests of tropical forest conservation measures," Journal of Economic Behavior & Organization, Elsevier, vol. 118(C), pages 346-359.

    More about this item


    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:jeborg:v:90:y:2013:i:c:p:123-133. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/jebo.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.