
Generalizing from Survey Experiments Conducted on Mechanical Turk: A Replication Approach

Author

Listed:
  • Coppock, Alexander

Abstract

To what extent do survey experimental treatment effect estimates generalize to other populations and contexts? Survey experiments conducted on convenience samples have often been criticized on the grounds that subjects are sufficiently different from the public at large to render the results of such experiments uninformative more broadly. In the presence of moderate treatment effect heterogeneity, however, such concerns may be allayed. I provide evidence from a series of 15 replication experiments that results derived from convenience samples like Amazon’s Mechanical Turk are similar to those obtained from national samples. Either the treatments deployed in these experiments cause similar responses for many subject types or convenience and national samples do not differ much with respect to treatment effect moderators. Using evidence of limited within-experiment heterogeneity, I show that the former is likely to be the case. Despite a wide diversity of background characteristics across samples, the effects uncovered in these experiments appear to be relatively homogeneous.
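The replication logic summarized above can be illustrated with a short sketch: estimate the average treatment effect (ATE) by difference in means in a convenience sample and in a national sample, then compare the two estimates. The code below is illustrative only and is not the paper's analysis code; the data are simulated and all variable and function names are hypothetical.

```python
# Illustrative sketch (not the paper's analysis code): compare difference-in-means
# ATE estimates from a hypothetical convenience sample and a national sample.
import numpy as np

rng = np.random.default_rng(0)

def simulate_sample(n, tau, sd=1.0):
    """Simulate a two-arm experiment with true effect tau (hypothetical data)."""
    treat = rng.integers(0, 2, size=n)                 # random assignment to treatment
    outcome = 0.5 + tau * treat + rng.normal(0, sd, size=n)
    return treat, outcome

def ate_and_se(treat, outcome):
    """Difference-in-means ATE estimate with a Neyman-style standard error."""
    y1, y0 = outcome[treat == 1], outcome[treat == 0]
    ate = y1.mean() - y0.mean()
    se = np.sqrt(y1.var(ddof=1) / len(y1) + y0.var(ddof=1) / len(y0))
    return ate, se

# Under limited treatment effect heterogeneity, both samples recover similar effects.
mturk = ate_and_se(*simulate_sample(n=800, tau=0.30))
national = ate_and_se(*simulate_sample(n=1000, tau=0.30))

for label, (ate, se) in [("MTurk", mturk), ("National", national)]:
    print(f"{label}: ATE = {ate:.3f} (SE = {se:.3f})")
```

If effects were strongly moderated by characteristics that differ across the two samples, the paired estimates would diverge; similarity across many such replications is the pattern the article reports.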

Suggested Citation

  • Coppock, Alexander, 2019. "Generalizing from Survey Experiments Conducted on Mechanical Turk: A Replication Approach," Political Science Research and Methods, Cambridge University Press, vol. 7(3), pages 613-628, July.
  • Handle: RePEc:cup:pscirm:v:7:y:2019:i:03:p:613-628_00

    Download full text from publisher

    File URL: https://www.cambridge.org/core/product/identifier/S2049847018000109/type/journal_article
    File Function: link to article abstract page
    Download Restriction: no

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Abel Brodeur & Nikolai M. Cook & Anthony Heyes, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell Us about Publication Bias and p-Hacking in Online Experiments," LCERPA Working Papers am0133, Laurier Centre for Economic Research and Policy Analysis.
    2. Lamberova, Natalia, 2021. "The puzzling politics of R&D: Signaling competence through risky projects," Journal of Comparative Economics, Elsevier, vol. 49(3), pages 801-818.
    3. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell us about p-Hacking and Publication Bias in Online Experiments," GLO Discussion Paper Series 1157, Global Labor Organization (GLO).
    4. Jerrod M. Penn & Daniel R. Petrolia & J. Matthew Fannin, 2023. "Hypothetical bias mitigation in representative and convenience samples," Applied Economic Perspectives and Policy, John Wiley & Sons, vol. 45(2), pages 721-743, June.
    5. Andreas Fügener & Jörn Grahl & Alok Gupta & Wolfgang Ketter, 2022. "Cognitive Challenges in Human–Artificial Intelligence Collaboration: Investigating the Path Toward Productive Delegation," Information Systems Research, INFORMS, vol. 33(2), pages 678-696, June.
    6. Accominotti, Fabien & Tadmon, Daniel, 2020. "How the reification of merit breeds inequality: theory and experimental evidence," LSE Research Online Documents on Economics 103865, London School of Economics and Political Science, LSE Library.
    7. Florian Pethig & Julia Kroenung, 2023. "Biased Humans, (Un)Biased Algorithms?," Journal of Business Ethics, Springer, vol. 183(3), pages 637-652, March.
    8. Kroll, Alexander & Vogel, Dominik, 2021. "Why Public Employees Manipulate Performance Data: Prosocial Impact, Job Stress, and Red Tape," SocArXiv eyjh3, Center for Open Science.
    9. Jesse Chandler & Jacob Hartog & Erin Lipman & Jonathan Gellar, "undated". "The Effect of School Report Card Design on Usability, Understanding, and Satisfaction," Mathematica Policy Research Reports 5cb96f706ee54791920e0a31e, Mathematica Policy Research.
    10. Ritwik Banerjee & Priyama Majumdar, 2023. "Exponential growth bias in the prediction of COVID‐19 spread and economic expectation," Economica, London School of Economics and Political Science, vol. 90(358), pages 653-689, April.
    11. Trisha R. Shrum, 2021. "The salience of future impacts and the willingness to pay for climate change mitigation: an experiment in intergenerational framing," Climatic Change, Springer, vol. 165(1), pages 1-20, March.
    12. Johannes G. Jaspersen & Marc A. Ragin & Justin R. Sydnor, 2022. "Insurance demand experiments: Comparing crowdworking to the lab," Journal of Risk & Insurance, The American Risk and Insurance Association, vol. 89(4), pages 1077-1107, December.
    13. Laura D. Scherer & Brian J. Zikmund-Fisher, 2020. "Eliciting Medical Maximizing-Minimizing Preferences with a Single Question: Development and Validation of the MM1," Medical Decision Making, vol. 40(4), pages 545-550, May.
    14. Matthew Amengual & Rita Mota & Alexander Rustler, 2023. "The ‘Court of Public Opinion:’ Public Perceptions of Business Involvement in Human Rights Violations," Journal of Business Ethics, Springer, vol. 185(1), pages 49-74, June.
    15. Sean F. Ellis & Olesya M. Savchenko & Kent D. Messer, 2022. "Mitigating stigma associated with recycled water," American Journal of Agricultural Economics, John Wiley & Sons, vol. 104(3), pages 1077-1099, May.
    16. Beata Woźniak-Jęchorek, 2023. "Experiments in Modern Economics – Expansion and Technological and Institutional Innovations in the U.S.," Ekonomista, Polskie Towarzystwo Ekonomiczne, issue 1, pages 78-101.
    17. Ulrich Thy Jensen, 2020. "Is self-reported social distancing susceptible to social desirability bias? Using the crosswise model to elicit sensitive behaviors," Journal of Behavioral Public Administration, Center for Experimental and Behavioral Public Administration, vol. 3(2).
    18. Zach Branson & Tirthankar Dasgupta, 2020. "Sampling‐based Randomised Designs for Causal Inference under the Potential Outcomes Framework," International Statistical Review, International Statistical Institute, vol. 88(1), pages 101-121, April.
    19. Wilkin, Carla & Ferreira, Aldónio & Rotaru, Kristian & Gaerlan, Luigi Red, 2020. "Big data prioritization in SCM decision-making: Its role and performance implications," International Journal of Accounting Information Systems, Elsevier, vol. 38(C).
    20. Farjam, Mike & Bravo, Giangiacomo, 2023. "Do you really believe that? The effect of economic incentives on the acceptance of real-world data in a polarized context," OSF Preprints sdmhw, Center for Open Science.
    21. Chinchanachokchai, Sydney & de Gregorio, Federico, 2020. "A consumer socialization approach to understanding advertising avoidance on social media," Journal of Business Research, Elsevier, vol. 110(C), pages 474-483.
    22. Fabian G. Neuner, 2020. "Public Opinion and the Legitimacy of Global Private Environmental Governance," Global Environmental Politics, MIT Press, vol. 20(1), pages 60-81, February.
    23. Daniel L. Carlson & Richard J. Petts, 2022. "US Parents’ Domestic Labor During the First Year of the COVID-19 Pandemic," Population Research and Policy Review, Springer; Southern Demographic Association (SDA), vol. 41(6), pages 2393-2418, December.
    24. Mariken van der Velden & Felicia Loecherbach, 2021. "Epistemic Overconfidence in Algorithmic News Selection," Media and Communication, Cogitatio Press, vol. 9(4), pages 182-197.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:cup:pscirm:v:7:y:2019:i:03:p:613-628_00. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Kirk Stebbing (email available below). General contact details of provider: https://www.cambridge.org/ram.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.