
Amazon Mechanical Turk Workers Can Provide Consistent and Economically Meaningful Data

Author

Listed:
  • Johnson, David
  • Ryan, John

Abstract

We explore the consistency of the characteristics of individuals who participate in studies posted on Amazon Mechanical Turk (AMT). The primary individuals analyzed in this study are subjects who participated in at least two of eleven experiments run on AMT between September 2012 and January 2018. We demonstrate that subjects consistently report a series of demographic and personality characteristics. Further, subjective willingness to take risk is significantly correlated with decisions made in a simple lottery experiment with real stakes, even when the subjective risk measure was reported months, sometimes years, earlier. This suggests that the quality of data obtained via AMT is not significantly harmed by the lack of control over the conditions under which responses are recorded.
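
The core check described in the abstract, whether a stated risk measure predicts behavior in an incentivized lottery task among repeat participants, is straightforward to run on one's own data. The following is a minimal sketch in Python, not the authors' code; the file name amt_panel.csv and the columns worker_id, session_id, stated_risk, and lottery_risky_choices are hypothetical placeholders.

    # Minimal sketch (assumptions, not the paper's code): correlate a self-reported
    # willingness-to-take-risk scale with choices in a real-stakes lottery task,
    # restricting to subjects observed in at least two sessions.
    import pandas as pd
    from scipy.stats import spearmanr

    # Hypothetical panel: one row per subject per session.
    df = pd.read_csv("amt_panel.csv")

    # Keep only subjects who appear in two or more sessions, mirroring the
    # paper's restriction to repeat participants.
    repeaters = df.groupby("worker_id").filter(lambda g: g["session_id"].nunique() >= 2)

    # Rank correlation between the stated risk measure (e.g., a 0-10 scale)
    # and the number of risky options chosen in the lottery task.
    rho, pval = spearmanr(repeaters["stated_risk"], repeaters["lottery_risky_choices"])
    print(f"Spearman rho = {rho:.3f}, p = {pval:.3f}")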

Suggested Citation

  • Johnson, David & Ryan, John, 2018. "Amazon Mechanical Turk Workers Can Provide Consistent and Economically Meaningful Data," MPRA Paper 88450, University Library of Munich, Germany.
  • Handle: RePEc:pra:mprapa:88450

    Download full text from publisher

    File URL: https://mpra.ub.uni-muenchen.de/88450/1/MPRA_paper_88450.pdf
    File Function: original version
    Download Restriction: no


    References listed on IDEAS

    1. Mullinix, Kevin J. & Leeper, Thomas J. & Druckman, James N. & Freese, Jeremy, 2015. "The Generalizability of Survey Experiments," Journal of Experimental Political Science, Cambridge University Press, vol. 2(2), pages 109-138, January.
    2. John Gibson & David Johnson, 2019. "Are Online Samples Credible? Evidence from Risk Elicitation Tests," Atlantic Economic Journal, Springer;International Atlantic Economic Society, vol. 47(3), pages 377-379, September.
    3. Charness, Gary & Viceisza, Angelino, 2016. "Three Risk-elicitation Methods in the Field - Evidence from Rural Senegal," Review of Behavioral Economics, now publishers, vol. 3(2), pages 145-171, July.
    4. Paolo Crosetto & Antonio Filippin, 2016. "A theoretical and experimental appraisal of four risk elicitation methods," Experimental Economics, Springer;Economic Science Association, vol. 19(3), pages 613-641, September.
    5. Catherine C. Eckel, 2019. "Measuring individual risk preferences," IZA World of Labor, Institute of Labor Economics (IZA), pages 454-454, June.
    6. Benjamin Beranek & Robin Cubitt & Simon Gächter, 2015. "Stated and revealed inequality aversion in three subject pools," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 1(1), pages 43-58, July.
    7. Thomas Dohmen & Armin Falk & David Huffman & Uwe Sunde & Jürgen Schupp & Gert G. Wagner, 2011. "Individual Risk Attitudes: Measurement, Determinants, And Behavioral Consequences," Journal of the European Economic Association, European Economic Association, vol. 9(3), pages 522-550, June.
    8. Gabriele Paolacci & Jesse Chandler & Panagiotis G. Ipeirotis, 2010. "Running experiments on Amazon Mechanical Turk," Judgment and Decision Making, Society for Judgment and Decision Making, vol. 5(5), pages 411-419, August.
    9. John Horton & David Rand & Richard Zeckhauser, 2011. "The online laboratory: conducting experiments in a real labor market," Experimental Economics, Springer;Economic Science Association, vol. 14(3), pages 399-425, September.
    10. Giuseppe Attanasi & Nikolaos Georgantzís & Valentina Rotondi & Daria Vigani, 2018. "Lottery- and survey-based risk attitudes linked through a multichoice elicitation task," Theory and Decision, Springer, vol. 84(3), pages 341-372, May.
    11. Armin Falk & Anke Becker & Thomas Dohmen & David Huffman & Uwe Sunde, 2023. "The Preference Survey Module: A Validated Instrument for Measuring Risk, Time, and Social Preferences," Management Science, INFORMS, vol. 69(4), pages 1935-1950, April.
    12. Helena Szrek & Li-Wei Chao & Shandir Ramlagan & Karl Peltzer, 2012. "Predicting (un)healthy behavior: A comparison of risk-taking propensity measures," Judgment and Decision Making, Society for Judgment and Decision Making, vol. 7(6), pages 716-727, November.
    13. David B. Johnson & Matthew D. Webb, 2016. "Decision Making with Risky, Rival Outcomes: Theory and Evidence," Carleton Economic Papers 16-12, Carleton University, Department of Economics.
    14. Peter P J L Verkoeijen & Samantha Bouwmeester, 2014. "Does Intuition Cause Cooperation?," PLOS ONE, Public Library of Science, vol. 9(5), pages 1-8, May.
    15. Paolacci, Gabriele & Chandler, Jesse & Ipeirotis, Panagiotis G., 2010. "Running experiments on Amazon Mechanical Turk," Judgment and Decision Making, Cambridge University Press, vol. 5(5), pages 411-419, August.
    16. Lönnqvist, Jan-Erik & Verkasalo, Markku & Walkowitz, Gari & Wichardt, Philipp C., 2015. "Measuring individual risk attitudes in the lab: Task or ask? An empirical comparison," Journal of Economic Behavior & Organization, Elsevier, vol. 119(C), pages 254-266.
    17. Andreas Pedroni & Renato Frey & Adrian Bruhin & Gilles Dutilh & Ralph Hertwig & Jörg Rieskamp, 2017. "The risk elicitation puzzle," Nature Human Behaviour, Nature, vol. 1(11), pages 803-809, November.
    18. repec:nas:journl:v:115:y:2018:p:12441-12446 is not listed on IDEAS
    19. Kevin E. Levay & Jeremy Freese & James N. Druckman, 2016. "The Demographic and Political Composition of Mechanical Turk Samples," SAGE Open, vol. 6(1), March.
    20. Galizzi, Matteo M. & Machado, Sara R. & Miniaci, Raffaele, 2016. "Temporal stability, cross-validity, and external validity of risk preferences measures: experimental evidence from a UK representative sample," LSE Research Online Documents on Economics 67554, London School of Economics and Political Science, LSE Library.
    21. Antonio A. Arechar & Gordon T. Kraft-Todd & David G. Rand, 2017. "Turking overtime: how participant characteristics and behavior vary over time and day on Amazon Mechanical Turk," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 3(1), pages 1-11, July.
    22. Berinsky, Adam J. & Huber, Gregory A. & Lenz, Gabriel S., 2012. "Evaluating Online Labor Markets for Experimental Research: Amazon.com's Mechanical Turk," Political Analysis, Cambridge University Press, vol. 20(3), pages 351-368, July.
    23. Chetan Dave & Catherine Eckel & Cathleen Johnson & Christian Rojas, 2010. "Eliciting risk preferences: When is simple better?," Journal of Risk and Uncertainty, Springer, vol. 41(3), pages 219-243, December.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Evangelos Mourelatos & Jaakko Simonen & Simo Hosio & Daniil Likhobaba & Dmitry Ustalov, 2024. "How has the COVID-19 pandemic shaped behavior in crowdsourcing? The role of online labor market training," Journal of Business Economics, Springer, vol. 94(9), pages 1201-1244, November.
    2. repec:osf:metaar:a9vhr_v1 is not listed on IDEAS
    3. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell us about p-Hacking and Publication Bias in Online Experiments," GLO Discussion Paper Series 1157, Global Labor Organization (GLO).
    4. Rene Schwaiger & Laura Hueber, 2021. "Do MTurkers Exhibit Myopic Loss Aversion?," Working Papers 2021-12, Faculty of Economics and Statistics, Universität Innsbruck.
    5. Dominik J. Wettstein & Stefan Boes, 2020. "The impact of reimbursement negotiations on cost and availability of new pharmaceuticals: evidence from an online experiment," Health Economics Review, Springer, vol. 10(1), pages 1-15, December.
    6. Chambers, Catherine & Chambers, Paul & Johnson, David, 2025. "Charismatic species, matching, and demographics in conservation donations: An experimental investigation," Ecological Economics, Elsevier, vol. 230(C).
    7. David Chavanne & Zak Danz & Jitu Dribssa & Rachel Powell & Matthew Sambor, 2022. "Context and the Perceived Fairness of Price Increases Coming out of COVID‐19," Social Science Quarterly, Southwestern Social Science Association, vol. 103(1), pages 55-68, January.
    8. John Gibson & David Johnson, 2021. "Breaking Bad: When Being Disadvantaged Incentivizes (Seemingly) Risky Behavior," Eastern Economic Journal, Palgrave Macmillan;Eastern Economic Association, vol. 47(1), pages 107-134, January.
    9. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell Us about Publication Bias and p-Hacking in Online Experiments," IZA Discussion Papers 15478, Institute of Labor Economics (IZA).
    10. Charness, Gary & Dao, Lien & Shurchkov, Olga, 2022. "Competing now and then: The effects of delay on competitiveness across gender," Journal of Economic Behavior & Organization, Elsevier, vol. 198(C), pages 612-630.
    11. Scott Simon Boddery & Damon Cann & Laura Moyer & Jeff Yates, 2023. "The role of cable news hosts in public support for Supreme Court decisions," Journal of Empirical Legal Studies, John Wiley & Sons, vol. 20(4), pages 1045-1069, December.
    12. Haas, Nicholas & Hassan, Mazen & Mansour, Sarah & Morton, Rebecca B., 2021. "Polarizing information and support for reform," Journal of Economic Behavior & Organization, Elsevier, vol. 185(C), pages 883-901.
    13. Peilu Zhang & Marco A. Palma, 2021. "Compulsory Versus Voluntary Insurance: An Online Experiment," American Journal of Agricultural Economics, John Wiley & Sons, vol. 103(1), pages 106-125, January.
    14. Joanna Lahey & Roberto Mosquera, 2024. "Age and hiring for high school graduate Hispanics in the United States," Journal of Population Economics, Springer;European Society for Population Economics, vol. 37(1), pages 1-40, March.
    15. John Gibson & David Johnson, 0. "Breaking Bad: When Being Disadvantaged Incentivizes (Seemingly) Risky Behavior," Eastern Economic Journal, Palgrave Macmillan;Eastern Economic Association, vol. 0, pages 1-28.
    16. Karl van der Schyff & Greg Foster & Karen Renaud & Stephen Flowerday, 2023. "Online Privacy Fatigue: A Scoping Review and Research Agenda," Future Internet, MDPI, vol. 15(5), pages 1-31, April.
    17. Luke Fowler & Stephen Utych, 2021. "Are people better employees than machines? Dehumanizing language and employee performance appraisals," Social Science Quarterly, Southwestern Social Science Association, vol. 102(4), pages 2006-2019, July.
    18. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    19. Johannes G. Jaspersen & Marc A. Ragin & Justin R. Sydnor, 2022. "Insurance demand experiments: Comparing crowdworking to the lab," Journal of Risk & Insurance, The American Risk and Insurance Association, vol. 89(4), pages 1077-1107, December.
    20. Kaitlynn Sandstrom‐Mistry & Frank Lupi & Hyunjung Kim & Joseph A. Herriges, 2023. "Comparing water quality valuation across probability and non‐probability samples," Applied Economic Perspectives and Policy, John Wiley & Sons, vol. 45(2), pages 744-761, June.
    21. Abhari, Kaveh & McGuckin, Summer, 2023. "Limiting factors of open innovation organizations: A case of social product development and research agenda," Technovation, Elsevier, vol. 119(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Joshua Tasoff & Wenjie Zhang, 2022. "The Performance of Time-Preference and Risk-Preference Measures in Surveys," Management Science, INFORMS, vol. 68(2), pages 1149-1173, February.
    2. Binder, Carola Conces, 2022. "Time-of-day and day-of-week variations in Amazon Mechanical Turk survey responses," Journal of Macroeconomics, Elsevier, vol. 71(C).
    3. Ranganathan, Kavitha & Lejarraga, Tomás, 2021. "Elicitation of risk preferences through satisficing," Journal of Behavioral and Experimental Finance, Elsevier, vol. 32(C).
    4. Haas, Nicholas & Hassan, Mazen & Mansour, Sarah & Morton, Rebecca B., 2021. "Polarizing information and support for reform," Journal of Economic Behavior & Organization, Elsevier, vol. 185(C), pages 883-901.
    5. Wladislaw Mill & Cornelius Schneider, 2023. "The Bright Side of Tax Evasion," CESifo Working Paper Series 10615, CESifo.
    6. Tamás Csermely & Alexander Rabas, 2016. "How to reveal people’s preferences: Comparing time consistency and predictive power of multiple price list risk elicitation methods," Journal of Risk and Uncertainty, Springer, vol. 53(2), pages 107-136, December.
    7. Giuseppe Attanasi & Nikolaos Georgantzís & Valentina Rotondi & Daria Vigani, 2018. "Lottery- and survey-based risk attitudes linked through a multichoice elicitation task," Theory and Decision, Springer, vol. 84(3), pages 341-372, May.
    8. Schwaiger, Rene & Hueber, Laura, 2021. "Do MTurkers exhibit myopic loss aversion?," Economics Letters, Elsevier, vol. 209(C).
    9. Christine Gaertner & Petra Steinorth, 2023. "On the correlation of self‐reported and behavioral risk attitude measures: The case of the General Risk Question and the Investment Game following Gneezy and Potters (1997)," Risk Management and Insurance Review, American Risk and Insurance Association, vol. 26(3), pages 367-392, October.
    10. Cherry, Todd L. & James, Alexander G. & Murphy, James, 2021. "The impact of public health messaging and personal experience on the acceptance of mask wearing during the COVID-19 pandemic," Journal of Economic Behavior & Organization, Elsevier, vol. 187(C), pages 415-430.
    11. Abel François & Sophie Panel & Laurent Weill, 2023. "Dictators’ facial characteristics and foreign direct investment," Post-Print hal-03969697, HAL.
    12. Galizzi, Matteo M. & Machado, Sara R. & Miniaci, Raffaele, 2016. "Temporal stability, cross-validity, and external validity of risk preferences measures: experimental evidence from a UK representative sample," LSE Research Online Documents on Economics 67554, London School of Economics and Political Science, LSE Library.
    13. Blaine G. Robbins, 2017. "Status, identity, and ability in the formation of trust," Rationality and Society, vol. 29(4), pages 408-448, November.
    14. Gary Charness & Thomas Garcia & Theo Offerman & Marie Claire Villeval, 2020. "Do measures of risk attitude in the laboratory predict behavior under risk in and outside of the laboratory?," Journal of Risk and Uncertainty, Springer, vol. 60(2), pages 99-123, April.
    15. Daniel Montoya Herrera & Marc Willinger, 2025. "Are risk-tolerant individuals more trustful? A representative sample study," Post-Print hal-05234962, HAL.
    16. Florian Teschner & Henner Gimpel, 2018. "Crowd Labor Markets as Platform for Group Decision and Negotiation Research: A Comparison to Laboratory Experiments," Group Decision and Negotiation, Springer, vol. 27(2), pages 197-214, April.
    17. Goldzahl, Léontine, 2017. "Contributions of risk preference, time orientation and perceptions to breast cancer screening regularity," Social Science & Medicine, Elsevier, vol. 185(C), pages 147-157.
    18. Cantarella, Michele & Strozzi, Chiara, 2019. "Workers in the Crowd: The Labour Market Impact of the Online Platform Economy," IZA Discussion Papers 12327, Institute of Labor Economics (IZA).
    19. Azzam, Tarek & Harman, Elena, 2016. "Crowdsourcing for quantifying transcripts: An exploratory study," Evaluation and Program Planning, Elsevier, vol. 54(C), pages 63-73.
    20. Gandullia, Luca & Lezzi, Emanuela & Parciasepe, Paolo, 2020. "Replication with MTurk of the experimental design by Gangadharan, Grossman, Jones & Leister (2018): Charitable giving across donor types," Journal of Economic Psychology, Elsevier, vol. 78(C).

    More about this item


    JEL classification:

    • C81 - Mathematical and Quantitative Methods - - Data Collection and Data Estimation Methodology; Computer Programs - - - Methodology for Collecting, Estimating, and Organizing Microeconomic Data; Data Access
    • C89 - Mathematical and Quantitative Methods - - Data Collection and Data Estimation Methodology; Computer Programs - - - Other
    • C90 - Mathematical and Quantitative Methods - - Design of Experiments - - - General
    • C99 - Mathematical and Quantitative Methods - - Design of Experiments - - - Other


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:pra:mprapa:88450. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Joachim Winter (email available below). General contact details of provider: https://edirc.repec.org/data/vfmunde.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.