
Conducting Interactive Experiments Online

Listed author(s):
  • Arechar, Antonio A. (Yale University)
  • Gächter, Simon (University of Nottingham)
  • Molleman, Lucas (Max Planck Institute for Human Development)

Online labor markets provide new opportunities for behavioral research, but conducting economic experiments online raises important methodological challenges. This particularly holds for interactive designs. In this paper, we provide a methodological discussion of the similarities and differences between interactive experiments conducted in the laboratory and online. To this end, we conduct a repeated public goods experiment with and without punishment using samples from the laboratory and the online platform Amazon Mechanical Turk. We chose to replicate this experiment because it is long and logistically complex. It therefore provides a good case study for discussing the methodological and practical challenges of online interactive experimentation. We find that basic behavioral patterns of cooperation and punishment in the laboratory are replicable online. The most important challenge of online interactive experiments is participant dropout. We discuss measures for reducing dropout and show that, for our case study, dropouts are exogenous to the experiment. We conclude that data quality for interactive experiments via the Internet is adequate and reliable, making online interactive experimentation a valuable complement to laboratory studies.
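To make the design concrete for readers unfamiliar with the underlying game, the sketch below works through the payoff logic of one round of a canonical linear public goods game with a punishment stage, the class of experiment replicated here. All parameters (group size, endowment, multiplier, punishment cost and impact) are illustrative assumptions chosen for exposition, not the values used in the paper.

# Minimal sketch of one round of a linear public goods game with punishment.
# Group size, endowment, multiplier, and punishment parameters are illustrative
# assumptions, not the parameters used in the paper.

ENDOWMENT = 20        # tokens each player holds at the start of the round
MULTIPLIER = 1.6      # total contributions are scaled by this factor
GROUP_SIZE = 4
PUNISH_EFFECT = 3     # tokens deducted from the target per punishment point received
                      # (assumed 1 token of cost to the punisher per point assigned)

def round_payoffs(contributions, punishment_matrix):
    """contributions[i]: tokens player i puts into the public good.
    punishment_matrix[i][j]: punishment points player i assigns to player j."""
    public_good = MULTIPLIER * sum(contributions) / GROUP_SIZE
    payoffs = []
    for i in range(GROUP_SIZE):
        earned = ENDOWMENT - contributions[i] + public_good
        assigned = sum(punishment_matrix[i])                  # cost of punishing others
        received = sum(row[i] for row in punishment_matrix)   # points received from others
        payoffs.append(earned - assigned - PUNISH_EFFECT * received)
    return payoffs

# Example: one free-rider in a group of four, punished by the other members.
contribs = [20, 20, 20, 0]
punish = [[0, 0, 0, 2],
          [0, 0, 0, 2],
          [0, 0, 0, 2],
          [0, 0, 0, 0]]
print(round_payoffs(contribs, punish))   # [22.0, 22.0, 22.0, 26.0]

In this example the lone free-rider keeps the withheld tokens and still earns the most, but the punishment points assigned by the other three group members erode that advantage; that tension is what the experiment examines both with and without the punishment stage, in the laboratory and online.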


File URL: http://ftp.iza.org/dp10517.pdf
Download Restriction: no

Paper provided by Institute for the Study of Labor (IZA) in its series IZA Discussion Papers with number 10517.

Length: 34 pages
Date of creation: Jan 2017
Handle: RePEc:iza:izadps:dp10517
Contact details of provider: IZA, P.O. Box 7240, D-53072 Bonn, Germany
Phone: +49 228 3894 223
Fax: +49 228 3894 180
Web page: http://www.iza.org
Order Information: IZA, Margard Ody, P.O. Box 7240, D-53072 Bonn, Germany

