
Conducting Interactive Experiments Online

Author

Listed:
  • Arechar, Antonio A. (Yale University)
  • Gächter, Simon (University of Nottingham)
  • Molleman, Lucas (Max Planck Institute for Human Development)

Abstract

Online labor markets provide new opportunities for behavioral research, but conducting economic experiments online raises important methodological challenges. This particularly holds for interactive designs. In this paper, we provide a methodological discussion of the similarities and differences between interactive experiments conducted in the laboratory and online. To this end, we conduct a repeated public goods experiment with and without punishment using samples from the laboratory and the online platform Amazon Mechanical Turk. We chose to replicate this experiment because it is long and logistically complex. It therefore provides a good case study for discussing the methodological and practical challenges of online interactive experimentation. We find that basic behavioral patterns of cooperation and punishment in the laboratory are replicable online. The most important challenge of online interactive experiments is participant dropout. We discuss measures for reducing dropout and show that, for our case study, dropouts are exogenous to the experiment. We conclude that data quality for interactive experiments via the Internet is adequate and reliable, making online interactive experimentation a valuable complement to laboratory studies.
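The experiment replicated here is the standard repeated linear public goods game, with and without a punishment stage. As a rough illustration of that design's payoff structure — with endowment, marginal per-capita return (MPCR), and punishment cost/impact chosen as example values, not taken from the paper — per-round payoffs can be sketched as:

```python
# Illustrative sketch of a linear public goods game with punishment,
# the workhorse design the paper replicates online. All parameter
# values below are example choices, not the paper's actual settings.

def contribution_payoffs(contributions, endowment=20, mpcr=0.4):
    """First stage: keep whatever you don't contribute, plus an
    equal share (mpcr * pot) of the group account."""
    pot = sum(contributions)
    return [endowment - c + mpcr * pot for c in contributions]

def apply_punishment(payoffs, points, cost=1, impact=3):
    """Second stage: assigning a punishment point costs the punisher
    `cost` and reduces the target's payoff by `impact`.
    points[i][j] = points player i assigns to player j."""
    result = list(payoffs)
    n = len(payoffs)
    for i in range(n):
        for j in range(n):
            if i != j:
                result[i] -= cost * points[i][j]
                result[j] -= impact * points[i][j]
    return result

# A four-player round: two full contributors, a free rider, a partial one.
stage1 = contribution_payoffs([20, 20, 0, 10])   # [20.0, 20.0, 40.0, 30.0]
# Player 0 spends 3 points punishing the free rider (player 2).
points = [[0, 0, 3, 0], [0] * 4, [0] * 4, [0] * 4]
stage2 = apply_punishment(stage1, points)        # [17.0, 20.0, 31.0, 30.0]
```

Note how the free rider earns the most before punishment and loses that advantage after it, which is the behavioral pattern the paper tests for online.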

Suggested Citation

  • Arechar, Antonio A. & Gächter, Simon & Molleman, Lucas, 2017. "Conducting Interactive Experiments Online," IZA Discussion Papers 10517, Institute for the Study of Labor (IZA).
  • Handle: RePEc:iza:izadps:dp10517

    Download full text from publisher

    File URL: http://ftp.iza.org/dp10517.pdf
    Download Restriction: no

    References listed on IDEAS

    1. Jérôme Hergueux & Nicolas Jacquemet, 2015. "Social preferences in the online laboratory: a randomized experiment," Experimental Economics, Springer;Economic Science Association, vol. 18(2), pages 251-283, June.
    2. Neil Stewart & Christoph Ungemach & Adam J. L. Harris & Daniel M. Bartels & Ben R. Newell & Gabriele Paolacci & Jesse Chandler, 2015. "The Average Laboratory Samples a Population of 7,300 Amazon Mechanical Turk Workers," Mathematica Policy Research Reports f97b669c7b3e4c2ab95c9f805, Mathematica Policy Research.
    3. John Horton & David Rand & Richard Zeckhauser, 2011. "The online laboratory: conducting experiments in a real labor market," Experimental Economics, Springer;Economic Science Association, vol. 14(3), pages 399-425, September.
    4. Jenkins, Stephen P, 1995. "Easy Estimation Methods for Discrete-Time Duration Models," Oxford Bulletin of Economics and Statistics, Department of Economics, University of Oxford, vol. 57(1), pages 129-138, February.
    5. Katrin Schmelz & Anthony Ziegelmeyer, 2015. "Social Distance and Control Aversion: Evidence from the Internet and the Laboratory," TWI Research Paper Series 100, Thurgauer Wirtschaftsinstitut, Universität Konstanz.
    6. Anderhub, Vital & Müller, Rudolf & Schmidt, Carsten, 2001. "Design and evaluation of an economic experiment via the Internet," Journal of Economic Behavior & Organization, Elsevier, vol. 46(2), pages 227-247, October.
    7. Chesney, Thomas & Chuah, Swee-Hoon & Hoffmann, Robert, 2009. "Virtual world experimentation: An exploratory study," Journal of Economic Behavior & Organization, Elsevier, vol. 72(1), pages 618-635, October.
    8. Jon Anderson & Stephen Burks & Jeffrey Carpenter & Lorenz Götte & Karsten Maurer & Daniele Nosenzo & Ruth Potter & Kim Rocha & Aldo Rustichini, 2013. "Self-selection and variations in the laboratory measurement of other-regarding preferences across subject pools: evidence from one college student and two adult samples," Experimental Economics, Springer;Economic Science Association, vol. 16(2), pages 170-189, June.
    9. Johannes Abeler & Daniele Nosenzo, 2015. "Self-selection into laboratory experiments: pro-social motives versus monetary incentives," Experimental Economics, Springer;Economic Science Association, vol. 18(2), pages 195-214, June.
    10. Gächter, Simon & Herrmann, Benedikt, 2011. "The limits of self-governance when cooperators get punished: Experimental evidence from urban and rural Russia," European Economic Review, Elsevier, vol. 55(2), pages 193-210, February.
    11. Neil Stewart & Christoph Ungemach & Adam J. L. Harris & Daniel M. Bartels & Ben R. Newell & Gabriele Paolacci & Jesse Chandler, 2015. "The average laboratory samples a population of 7,300 Amazon Mechanical Turk workers," Judgment and Decision Making, Society for Judgment and Decision Making, vol. 10(5), pages 479-491, September.
    12. Jan Stoop & Charles N. Noussair & Daan van Soest, 2012. "From the Lab to the Field: Cooperation among Fishermen," Journal of Political Economy, University of Chicago Press, vol. 120(6), pages 1027-1056.
    13. Krupnikov, Yanna & Levine, Adam Seth, 2014. "Cross-Sample Comparisons and External Validity," Journal of Experimental Political Science, Cambridge University Press, vol. 1(01), pages 59-80, March.
    14. Blair Cleave & Nikos Nikiforakis & Robert Slonim, 2013. "Is there selection bias in laboratory experiments? The case of social and risk preferences," Experimental Economics, Springer;Economic Science Association, vol. 16(3), pages 372-382, September.
    15. Jesse Chandler & Gabriele Paolacci & Eyal Peer & Pam Mueller & Kate A. Ratliff, 2015. "Using Nonnaive Participants Can Reduce Effect Sizes," Mathematica Policy Research Reports bffac982a56e4cfba3659e74a, Mathematica Policy Research.
    16. Jeffrey Carpenter & Erika Seki, 2011. "Do Social Preferences Increase Productivity? Field Experimental Evidence From Fishermen In Toyama Bay," Economic Inquiry, Western Economic Association International, vol. 49(2), pages 612-630, April.
    17. Urs Fischbacher, 2007. "z-Tree: Zurich toolbox for ready-made economic experiments," Experimental Economics, Springer;Economic Science Association, vol. 10(2), pages 171-178, June.
    18. Gächter, Simon & Herrmann, Benedikt & Thöni, Christian, 2004. "Trust, voluntary cooperation, and socio-economic background: survey and experimental evidence," Journal of Economic Behavior & Organization, Elsevier, vol. 55(4), pages 505-531, December.
    19. Simon Gächter & Ernst Fehr, 2000. "Cooperation and Punishment in Public Goods Experiments," American Economic Review, American Economic Association, vol. 90(4), pages 980-994, September.
    20. John A. List, 2004. "Young, Selfish and Male: Field evidence of social preferences," Economic Journal, Royal Economic Society, vol. 114(492), pages 121-149, January.
    21. Michèle Belot & Raymond Duch & Luis Miller, 2010. "Who should be called to the lab? A comprehensive comparison of students and non-students in classic experimental games," Discussion Papers 2010001, University of Oxford, Nuffield College.
    22. Bock, Olaf & Baetge, Ingmar & Nicklisch, Andreas, 2014. "hroot: Hamburg Registration and Organization Online Tool," European Economic Review, Elsevier, vol. 71(C), pages 117-120.
    23. Michal Krawczyk, 2011. "What brings your subjects to the lab? A field experiment," Experimental Economics, Springer;Economic Science Association, vol. 14(4), pages 482-489, November.
    24. Guillén, Pablo & Veszteg, Róbert F., 2012. "On “lab rats”," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 41(5), pages 714-720.
    25. Berinsky, Adam J. & Huber, Gregory A. & Lenz, Gabriel S., 2012. "Evaluating Online Labor Markets for Experimental Research: Amazon.com's Mechanical Turk," Political Analysis, Cambridge University Press, vol. 20(03), pages 351-368, June.

    Citations

    Cited by:

    1. repec:spr:sochwe:v:50:y:2018:i:2:d:10.1007_s00355-017-1081-5 is not listed on IDEAS
    2. repec:spr:jesaex:v:3:y:2017:i:1:d:10.1007_s40881-017-0035-0 is not listed on IDEAS

    More about this item

    Keywords

    experimental methodology; behavioral research; punishment; internet experiments; Amazon Mechanical Turk; public goods game

    JEL classification:

    • C71 - Mathematical and Quantitative Methods - - Game Theory and Bargaining Theory - - - Cooperative Games
    • C88 - Mathematical and Quantitative Methods - - Data Collection and Data Estimation Methodology; Computer Programs - - - Other Computer Software
    • C90 - Mathematical and Quantitative Methods - - Design of Experiments - - - General
    • D71 - Microeconomics - - Analysis of Collective Decision-Making - - - Social Choice; Clubs; Committees; Associations

