
Conducting interactive experiments online

Authors

Listed:
  • Antonio A. Arechar (Yale University)

  • Simon Gächter (University of Nottingham; CESifo; IZA)

  • Lucas Molleman (University of Nottingham; Max Planck Institute for Human Development)

Abstract

Online labor markets provide new opportunities for behavioral research, but conducting economic experiments online raises important methodological challenges. This particularly holds for interactive designs. In this paper, we provide a methodological discussion of the similarities and differences between interactive experiments conducted in the laboratory and online. To this end, we conduct a repeated public goods experiment with and without punishment using samples from the laboratory and the online platform Amazon Mechanical Turk. We chose to replicate this experiment because it is long and logistically complex. It therefore provides a good case study for discussing the methodological and practical challenges of online interactive experimentation. We find that basic behavioral patterns of cooperation and punishment in the laboratory are replicable online. The most important challenge of online interactive experiments is participant dropout. We discuss measures for reducing dropout and show that, for our case study, dropouts are exogenous to the experiment. We conclude that data quality for interactive experiments via the Internet is adequate and reliable, making online interactive experimentation a potentially valuable complement to laboratory studies.
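The experiment replicated here is the standard repeated linear public goods game with punishment in the tradition of Fehr and Gächter (2000), listed in the references below. As a minimal sketch of that round structure, the following computes payoffs for one round; all parameter values (endowment, marginal per-capita return, punishment cost and impact) are illustrative assumptions, not the paper's actual parameters.

```python
# Illustrative sketch of one round of a linear public goods game with
# punishment. Parameter values below are assumptions for illustration only.

ENDOWMENT = 20        # tokens each player starts the round with
MPCR = 0.4            # marginal per-capita return of the public good
PUNISH_COST = 1       # cost to the punisher per punishment point assigned
PUNISH_IMPACT = 3     # tokens deducted from the target per punishment point

def round_payoffs(contributions, punishment=None):
    """Payoffs for one round; punishment[i][j] = points player i assigns to j."""
    n = len(contributions)
    pot = sum(contributions)
    # First stage: what you kept plus your per-capita share of the public good.
    payoffs = [ENDOWMENT - c + MPCR * pot for c in contributions]
    # Second stage (optional): punishment is costly to assign and to receive.
    if punishment:
        for i in range(n):
            for j in range(n):
                payoffs[i] -= PUNISH_COST * punishment[i][j]
                payoffs[j] -= PUNISH_IMPACT * punishment[i][j]
    return payoffs

# Example: a free-rider (player 2) is punished with 2 points by player 0.
pay = round_payoffs([20, 20, 0, 10],
                    punishment=[[0, 0, 2, 0]] + [[0] * 4] * 3)
```

Since the marginal per-capita return is below 1 but the group return exceeds it, contributing is individually costly yet socially efficient, which is the tension the punishment stage is designed to resolve.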

Suggested Citation

  • Antonio A. Arechar & Simon Gächter & Lucas Molleman, 2018. "Conducting interactive experiments online," Experimental Economics, Springer;Economic Science Association, vol. 21(1), pages 99-131, March.
  • Handle: RePEc:kap:expeco:v:21:y:2018:i:1:d:10.1007_s10683-017-9527-2
    DOI: 10.1007/s10683-017-9527-2

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10683-017-9527-2
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    As access to this document is restricted, you may want to look for a different version below or search for one.

    References listed on IDEAS

    1. Jérôme Hergueux & Nicolas Jacquemet, 2015. "Social preferences in the online laboratory: a randomized experiment," Experimental Economics, Springer;Economic Science Association, vol. 18(2), pages 251-283, June.
    2. John Horton & David Rand & Richard Zeckhauser, 2011. "The online laboratory: conducting experiments in a real labor market," Experimental Economics, Springer;Economic Science Association, vol. 14(3), pages 399-425, September.
    3. Jan Stoop, 2014. "From the lab to the field: envelopes, dictators and manners," Experimental Economics, Springer;Economic Science Association, vol. 17(2), pages 304-313, June.
    4. Chesney, Thomas & Chuah, Swee-Hoon & Hoffmann, Robert, 2009. "Virtual world experimentation: An exploratory study," Journal of Economic Behavior & Organization, Elsevier, vol. 72(1), pages 618-635, October.
    5. Gächter, Simon & Herrmann, Benedikt & Thöni, Christian, 2004. "Trust, voluntary cooperation, and socio-economic background: survey and experimental evidence," Journal of Economic Behavior & Organization, Elsevier, vol. 55(4), pages 505-531, December.
    6. Mullinix, Kevin J. & Leeper, Thomas J. & Druckman, James N. & Freese, Jeremy, 2015. "The Generalizability of Survey Experiments," Journal of Experimental Political Science, Cambridge University Press, vol. 2(2), pages 109-138, January.
    7. Jeffrey Carpenter & Erika Seki, 2011. "Do Social Preferences Increase Productivity? Field Experimental Evidence From Fishermen In Toyama Bay," Economic Inquiry, Western Economic Association International, vol. 49(2), pages 612-630, April.
    8. Simon Gächter & Ernst Fehr, 2000. "Cooperation and Punishment in Public Goods Experiments," American Economic Review, American Economic Association, vol. 90(4), pages 980-994, September.
    9. Neil Stewart & Christoph Ungemach & Adam J. L. Harris & Daniel M. Bartels & Ben R. Newell & Gabriele Paolacci & Jesse Chandler, "undated". "The Average Laboratory Samples a Population of 7,300 Amazon Mechanical Turk Workers," Mathematica Policy Research Reports f97b669c7b3e4c2ab95c9f805, Mathematica Policy Research.
    10. Jenkins, Stephen P, 1995. "Easy Estimation Methods for Discrete-Time Duration Models," Oxford Bulletin of Economics and Statistics, Department of Economics, University of Oxford, vol. 57(1), pages 129-138, February.
    11. Anderhub, Vital & Muller, Rudolf & Schmidt, Carsten, 2001. "Design and evaluation of an economic experiment via the Internet," Journal of Economic Behavior & Organization, Elsevier, vol. 46(2), pages 227-247, October.
    12. Katrin Schmelz & Anthony Ziegelmeyer, 2015. "Social Distance and Control Aversion: Evidence from the Internet and the Laboratory," TWI Research Paper Series 100, Thurgauer Wirtschaftsinstitut, Universität Konstanz.
    13. John A. List, 2004. "Young, Selfish and Male: Field evidence of social preferences," Economic Journal, Royal Economic Society, vol. 114(492), pages 121-149, January.
    14. Michèle Belot & Raymond Duch & Luis Miller, 2010. "Who should be called to the lab? A comprehensive comparison of students and non-students in classic experimental games," Discussion Papers 2010001, University of Oxford, Nuffield College.
    15. Jon Anderson & Stephen Burks & Jeffrey Carpenter & Lorenz Götte & Karsten Maurer & Daniele Nosenzo & Ruth Potter & Kim Rocha & Aldo Rustichini, 2013. "Self-selection and variations in the laboratory measurement of other-regarding preferences across subject pools: evidence from one college student and two adult samples," Experimental Economics, Springer;Economic Science Association, vol. 16(2), pages 170-189, June.
    16. Gächter, Simon & Herrmann, Benedikt, 2011. "The limits of self-governance when cooperators get punished: Experimental evidence from urban and rural Russia," European Economic Review, Elsevier, vol. 55(2), pages 193-210, February.
    17. Johannes Abeler & Daniele Nosenzo, 2015. "Self-selection into laboratory experiments: pro-social motives versus monetary incentives," Experimental Economics, Springer;Economic Science Association, vol. 18(2), pages 195-214, June.
    18. Blair Cleave & Nikos Nikiforakis & Robert Slonim, 2013. "Is there selection bias in laboratory experiments? The case of social and risk preferences," Experimental Economics, Springer;Economic Science Association, vol. 16(3), pages 372-382, September.
    19. Neil Stewart & Christoph Ungemach & Adam J. L. Harris & Daniel M. Bartels & Ben R. Newell & Gabriele Paolacci & Jesse Chandler, 2015. "The average laboratory samples a population of 7,300 Amazon Mechanical Turk workers," Judgment and Decision Making, Society for Judgment and Decision Making, vol. 10(5), pages 479-491, September.
    20. Jan Stoop & Charles N. Noussair & Daan van Soest, 2012. "From the Lab to the Field: Cooperation among Fishermen," Journal of Political Economy, University of Chicago Press, vol. 120(6), pages 1027-1056.
    21. Gabriele Paolacci & Jesse Chandler & Panagiotis G. Ipeirotis, 2010. "Running experiments on Amazon Mechanical Turk," Judgment and Decision Making, Society for Judgment and Decision Making, vol. 5(5), pages 411-419, August.
    22. Bock, Olaf & Baetge, Ingmar & Nicklisch, Andreas, 2014. "hroot: Hamburg Registration and Organization Online Tool," European Economic Review, Elsevier, vol. 71(C), pages 117-120.
    23. Michal Krawczyk, 2011. "What brings your subjects to the lab? A field experiment," Experimental Economics, Springer;Economic Science Association, vol. 14(4), pages 482-489, November.
    24. Guillén, Pablo & Veszteg, Róbert F., 2012. "On “lab rats”," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 41(5), pages 714-720.
    25. Krupnikov, Yanna & Levine, Adam Seth, 2014. "Cross-Sample Comparisons and External Validity," Journal of Experimental Political Science, Cambridge University Press, vol. 1(1), pages 59-80, April.
    26. Berinsky, Adam J. & Huber, Gregory A. & Lenz, Gabriel S., 2012. "Evaluating Online Labor Markets for Experimental Research: Amazon.com's Mechanical Turk," Political Analysis, Cambridge University Press, vol. 20(3), pages 351-368, July.
    27. Jesse Chandler & Gabriele Paolacci & Eyal Peer & Pam Mueller & Kate A. Ratliff, 2015. "Using Nonnaive Participants Can Reduce Effect Sizes," Mathematica Policy Research Reports bffac982a56e4cfba3659e74a, Mathematica Policy Research.
    28. Urs Fischbacher, 2007. "z-Tree: Zurich toolbox for ready-made economic experiments," Experimental Economics, Springer;Economic Science Association, vol. 10(2), pages 171-178, June.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Matteo M. Galizzi & Daniel Navarro-Martinez, 2019. "On the External Validity of Social Preference Games: A Systematic Lab-Field Study," Management Science, INFORMS, vol. 65(3), pages 976-1002, March.
    2. Matteo M. Galizzi & Daniel Navarro Martinez, 2015. "On the external validity of social-preference games: A systematic lab-field study," Economics Working Papers 1462, Department of Economics and Business, Universitat Pompeu Fabra.
    3. Schmidt, Robert & Schwieren, Christiane & Sproten, Alec N., 2020. "Norms in the lab: Inexperienced versus experienced participants," Journal of Economic Behavior & Organization, Elsevier, vol. 173(C), pages 239-255.
    4. Jon Anderson & Stephen Burks & Jeffrey Carpenter & Lorenz Götte & Karsten Maurer & Daniele Nosenzo & Ruth Potter & Kim Rocha & Aldo Rustichini, 2013. "Self-selection and variations in the laboratory measurement of other-regarding preferences across subject pools: evidence from one college student and two adult samples," Experimental Economics, Springer;Economic Science Association, vol. 16(2), pages 170-189, June.
    5. Johannes Abeler & Daniele Nosenzo, 2015. "Self-selection into laboratory experiments: pro-social motives versus monetary incentives," Experimental Economics, Springer;Economic Science Association, vol. 18(2), pages 195-214, June.
    6. Stephen V. Burks & Daniele Nosenzo & Jon Anderson & Matthew Bombyk & Derek Ganzhorn & Lorenz Goette & Aldo Rustichini, 2015. "Lab Measures of Other-Regarding Preferences Can Predict Some Related on-the-Job Behavior: Evidence from a Large Scale Field Experiment," Discussion Papers 2015-21, The Centre for Decision Research and Experimental Economics, School of Economics, University of Nottingham.
    7. Abeler, Johannes & Nosenzo, Daniele, 2013. "Self-Selection into Economics Experiments Is Driven by Monetary Rewards," IZA Discussion Papers 7374, Institute of Labor Economics (IZA).
    8. Hans-Theo Normann & Till Requate & Israel Waichman, 2014. "Do short-term laboratory experiments provide valid descriptions of long-term economic interactions? A study of Cournot markets," Experimental Economics, Springer;Economic Science Association, vol. 17(3), pages 371-390, September.
    9. Dickinson, David L. & Masclet, David & Villeval, Marie Claire, 2015. "Norm enforcement in social dilemmas: An experiment with police commissioners," Journal of Public Economics, Elsevier, vol. 126(C), pages 74-85.
    10. Capraro, Valerio & Schulz, Jonathan & Rand, David G., 2019. "Time pressure and honesty in a deception game," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 79(C), pages 93-99.
    11. Frigau, Luca & Medda, Tiziana & Pelligra, Vittorio, 2019. "From the field to the lab. An experiment on the representativeness of standard laboratory subjects," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 78(C), pages 160-169.
    12. Crawford, Ian & Harris, Donna, 2018. "Social interactions and the influence of “extremists”," Journal of Economic Behavior & Organization, Elsevier, vol. 153(C), pages 238-266.
    13. David Ronayne & Daniel Sgroi, 2018. "On the motivations for the dual-use of electronic and traditional cigarettes," Applied Economics Letters, Taylor & Francis Journals, vol. 25(12), pages 830-834, July.
    14. Englmaier, Florian & Gebhardt, Georg, 2016. "Social dilemmas in the laboratory and in the field," Journal of Economic Behavior & Organization, Elsevier, vol. 128(C), pages 85-96.
    15. Christ, Margaret H. & Vance, Thomas W., 2018. "Cascading controls: The effects of managers’ incentives on subordinate effort to help or harm," Accounting, Organizations and Society, Elsevier, vol. 65(C), pages 20-32.
    16. Marcus Giamattei & Kyanoush Seyed Yahosseini & Simon Gächter & Lucas Molleman, 2020. "LIONESS Lab: a free web-based platform for conducting interactive experiments online," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 6(1), pages 95-111, June.
    17. Schmidt, Robert J. & Schwieren, Christiane & Sproten, Alec N., 2018. "Social Norm Perception in Economic Laboratory Experiments: Inexperienced versus Experienced Participants," Working Papers 0656, University of Heidelberg, Department of Economics.
    18. Hoffman, Mitchell & Morgan, John, 2015. "Who's naughty? Who's nice? Experiments on whether pro-social workers are selected out of cutthroat business environments," Journal of Economic Behavior & Organization, Elsevier, vol. 109(C), pages 173-187.
    19. Logan S. Casey & Jesse Chandler & Adam Seth Levine & Andrew Proctor & Dara Z. Strolovitch, 2017. "Intertemporal Differences Among MTurk Workers: Time-Based Sample Variations and Implications for Online Data Collection," SAGE Open, , vol. 7(2), pages 21582440177, June.
    20. Frijters, Paul & Kong, Tao Sherry & Liu, Elaine M., 2015. "Who is coming to the artefactual field experiment? Participation bias among Chinese rural migrants," Journal of Economic Behavior & Organization, Elsevier, vol. 114(C), pages 62-74.

    More about this item

    Keywords

    Experimental methodology; Behavioral research; Internet experiments; Amazon Mechanical Turk; Public goods game; Punishment

    JEL classification:

    • C71 - Mathematical and Quantitative Methods - - Game Theory and Bargaining Theory - - - Cooperative Games
    • C88 - Mathematical and Quantitative Methods - - Data Collection and Data Estimation Methodology; Computer Programs - - - Other Computer Software
    • C90 - Mathematical and Quantitative Methods - - Design of Experiments - - - General
    • D71 - Microeconomics - - Analysis of Collective Decision-Making - - - Social Choice; Clubs; Committees; Associations



    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis . RePEc uses bibliographic data supplied by the respective publishers.