Do Natural Field Experiments Afford Researchers More or Less Control than Laboratory Experiments? A Simple Model

Author

Listed:
  • Omar Al-Ubaydli
  • John A. List

Abstract

A commonly held view is that laboratory experiments provide researchers with more “control” than natural field experiments, and that this advantage is to be balanced against the disadvantage that laboratory experiments are less generalizable. This paper presents a simple model that explores circumstances under which natural field experiments provide researchers with more control than laboratory experiments afford. This stems from the covertness of natural field experiments: laboratory experiments provide researchers with a high degree of control only in the environment in which participants agree to be experimental subjects. When participants systematically opt out of laboratory experiments, the researcher’s ability to manipulate certain variables is limited. In contrast, natural field experiments bypass the participation decision altogether and allow for a potentially more diverse participant pool within the market of interest. We show one particular case where such selection is invaluable: when treatment effects interact with participant characteristics.
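The selection mechanism described in the abstract can be made concrete with a small simulation. This is a minimal sketch, not code or parameters from the paper: the data-generating process, the effect sizes, the logistic opt-in rule, and all names (x, tau, p_optin, diff_in_means) are illustrative assumptions. It shows how, when the treatment effect varies with a participant characteristic that also drives self-selection into the lab, a lab sample's difference-in-means estimate recovers the effect among volunteers rather than the market-level average, while a covert field design that samples the whole market recovers the latter.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # Participant characteristic (standardized); assumed to drive both the
    # treatment effect and the decision to volunteer for the lab.
    x = rng.normal(size=n)

    # Heterogeneous treatment effect: interacts with x (illustrative form).
    tau = 1.0 + 0.8 * x

    # Potential outcomes without and with treatment.
    y0 = 2.0 + 0.5 * x + rng.normal(size=n)
    y1 = y0 + tau

    # Lab experiment: participation requires opting in, and the opt-in
    # probability rises with x (systematic self-selection).
    p_optin = 1.0 / (1.0 + np.exp(-(x - 1.0)))
    in_lab = rng.random(n) < p_optin

    def diff_in_means(mask):
        # Randomly assign treatment within the sample and compare means.
        treated = rng.random(mask.sum()) < 0.5
        y = np.where(treated, y1[mask], y0[mask])
        return y[treated].mean() - y[~treated].mean()

    everyone = np.ones(n, dtype=bool)  # covert field design: no opt-in stage
    print("Population-average treatment effect:", round(tau.mean(), 3))
    print("Lab estimate (volunteers only):     ", round(diff_in_means(in_lab), 3))
    print("Field estimate (whole market):      ", round(diff_in_means(everyone), 3))

Under these assumptions the lab estimate lands near the average effect among volunteers (roughly 1 + 0.8 times the mean of x among those who opt in, which exceeds 1), while the field estimate sits near the population average of about 1, which is the abstract's point about treatment effects interacting with participant characteristics.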

Suggested Citation

  • Omar Al-Ubaydli & John A. List, 2015. "Do Natural Field Experiments Afford Researchers More or Less Control than Laboratory Experiments? A Simple Model," NBER Working Papers 20877, National Bureau of Economic Research, Inc.
  • Handle: RePEc:nbr:nberwo:20877
    Note: EEE PE

    Download full text from publisher

    File URL: http://www.nber.org/papers/w20877.pdf
    Download Restriction: no

    References listed on IDEAS

    1. Colin F. Camerer, 1998. "Can Asset Markets Be Manipulated? A Field Experiment with Racetrack Betting," Journal of Political Economy, University of Chicago Press, vol. 106(3), pages 457-482, June.
    2. Smith, Vernon L & Walker, James M, 1993. "Monetary Rewards and Decision Cost in Experimental Economics," Economic Inquiry, Western Economic Association International, vol. 31(2), pages 245-261, April.
    3. Armin Falk & James J. Heckman, 2009. "Lab Experiments are a Major Source of Knowledge in the Social Sciences," Working Papers 200935, Geary Institute, University College Dublin.
    4. repec:feb:framed:0081 is not listed on IDEAS
    5. Glenn W. Harrison & John A. List, 2004. "Field Experiments," Journal of Economic Literature, American Economic Association, vol. 42(4), pages 1009-1055, December.
    6. Castillo, Marco & Petrie, Ragan & Torero, Maximo & Vesterlund, Lise, 2013. "Gender differences in bargaining outcomes: A field experiment on discrimination," Journal of Public Economics, Elsevier, vol. 99(C), pages 35-48.
    7. John A. List, 2004. "Testing Neoclassical Competitive Theory in Multilateral Decentralized Markets," Journal of Political Economy, University of Chicago Press, vol. 112(5), pages 1131-1156, October.
    8. Uri Gneezy & John List & Michael K. Price, 2012. "Toward an Understanding of Why People Discriminate: Evidence from a Series of Natural Field Experiments," NBER Working Papers 17855, National Bureau of Economic Research, Inc.
    9. Glenn W. Harrison & Morten I. Lau & E. Elisabet Rutström, 2007. "Estimating Risk Attitudes in Denmark: A Field Experiment," Scandinavian Journal of Economics, Wiley Blackwell, vol. 109(2), pages 341-368, June.
    10. Armin Falk & James J. Heckman, 2009. "Lab Experiments are a Major Source of Knowledge in the Social Sciences," CESifo Working Paper Series 2894, CESifo.
    11. Omar Al-Ubaydli & John List, 2012. "On the Generalizability of Experimental Results in Economics," Artefactual Field Experiments 00467, The Field Experiments Website.
    12. Alan S. Gerber & Dean Karlan & Daniel Bergan, 2009. "Does the Media Matter? A Field Experiment Measuring the Effect of Newspapers on Voting Behavior and Political Opinions," American Economic Journal: Applied Economics, American Economic Association, vol. 1(2), pages 35-52, April.
    13. John A. List, 2003. "Does Market Experience Eliminate Market Anomalies?," The Quarterly Journal of Economics, Oxford University Press, vol. 118(1), pages 41-71.
    14. Blair Cleave & Nikos Nikiforakis & Robert Slonim, 2013. "Is there selection bias in laboratory experiments? The case of social and risk preferences," Experimental Economics, Springer;Economic Science Association, vol. 16(3), pages 372-382, September.
    15. Marianne Bertrand & Sendhil Mullainathan, 2004. "Are Emily and Greg More Employable Than Lakisha and Jamal? A Field Experiment on Labor Market Discrimination," American Economic Review, American Economic Association, vol. 94(4), pages 991-1013, September.
    16. Harrison, Glenn W. & Lau, Morten I. & Elisabet Rutström, E., 2009. "Risk attitudes, randomization to treatment, and self-selection into experiments," Journal of Economic Behavior & Organization, Elsevier, vol. 70(3), pages 498-507, June.
    17. Armin Falk, 2007. "Gift Exchange in the Field," Econometrica, Econometric Society, vol. 75(5), pages 1501-1511, September.
    18. Steven D. Levitt & John A. List, 2007. "What Do Laboratory Experiments Measuring Social Preferences Reveal About the Real World?," Journal of Economic Perspectives, American Economic Association, vol. 21(2), pages 153-174, Spring.
    19. Colin Camerer, 1998. "Can asset markets be manipulated? A field experiment with racetrack betting," Natural Field Experiments 00222, The Field Experiments Website.

    Citations

Citations are extracted by the CitEc Project.


    Cited by:

    1. Omar Al-Ubaydli & John A. List, 2016. "Field Experiments in Markets," NBER Working Papers 22113, National Bureau of Economic Research, Inc.
    2. Martin G. Kocher & David Schindler & Stefan T. Trautmann & Yilong Xu, 2019. "Risk, time pressure, and selection effects," Experimental Economics, Springer;Economic Science Association, vol. 22(1), pages 216-246, March.
    3. Omar Al-Ubaydli & John A. List & Dana Suskind, 2019. "The Science of Using Science: Towards an Understanding of the Threats to Scaling Experiments," NBER Working Papers 25848, National Bureau of Economic Research, Inc.
    4. Gosnell, Greer & Metcalfe, Robert & List, John A, 2016. "A new approach to an age-old problem: solving externalities by incenting workers directly," LSE Research Online Documents on Economics 84331, London School of Economics and Political Science, LSE Library.
    5. Omar Al-Ubaydli & John List, 2019. "How natural field experiments have enhanced our understanding of unemployment," Natural Field Experiments 00649, The Field Experiments Website.
    6. Ghazala Azmat & Manuel Bagues & Antonio Cabrales & Nagore Iriberri, 2018. "What you don't know...Can't hurt you?: A natural field experiment on relative performance feedback in higher education," Sciences Po publications info:hdl:2441/5fhe3c1k6b8, Sciences Po.
    7. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    8. Omar Al-Ubaydli & John A. List & Dana L. Suskind, 2017. "What Can We Learn from Experiments? Understanding the Threats to the Scalability of Experimental Results," American Economic Review, American Economic Association, vol. 107(5), pages 282-286, May.
    9. Timothy N. Cason & Steven Y. Wu, 2019. "Subject Pools and Deception in Agricultural and Resource Economics Experiments," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 73(3), pages 743-758, July.
    10. Czibor, Eszter & Claussen, Jörg & van Praag, Mirjam, 2019. "Women in a men’s world: Risk taking in an online card game community," Journal of Economic Behavior & Organization, Elsevier, vol. 158(C), pages 62-89.
    11. Omar Al-Ubaydli & John A. List & Danielle LoRe & Dana Suskind, 2017. "Scaling for Economists: Lessons from the Non-Adherence Problem in the Medical Literature," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 125-144, Fall.
    12. Haghani, Milad & Sarvi, Majid, 2019. "Laboratory experimentation and simulation of discrete direction choices: Investigating hypothetical bias, decision-rule effect and external validity based on aggregate prediction measures," Transportation Research Part A: Policy and Practice, Elsevier, vol. 130(C), pages 134-157.
    13. Ghazala Azmat & Manuel Bagues & Antonio Cabrales & Nagore Iriberri, 2019. "What You Don’t Know…Can’t Hurt You? A Natural Field Experiment on Relative Performance Feedback in Higher Education," Management Science, INFORMS, vol. 65(8), pages 3714-3736, August.
    14. Omar Al-Ubaydli & John A. List, 2015. "Do Natural Field Experiments Afford Researchers More or Less Control Than Laboratory Experiments?," American Economic Review, American Economic Association, vol. 105(5), pages 462-466, May.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Omar Al-Ubaydli & John List, 2016. "Field Experiments in Markets," Artefactual Field Experiments j0002, The Field Experiments Website.
    2. Omar Al-Ubaydli & John List, 2013. "On the Generalizability of Experimental Results in Economics: With A Response To Camerer," Artefactual Field Experiments j0001, The Field Experiments Website.
    3. Omar Al-Ubaydli & John List, 2015. "Control in Experiments: A Simple Model," Artefactual Field Experiments 00397, The Field Experiments Website.
    4. Matteo M. Galizzi & Daniel Navarro Martinez, 2015. "On the external validity of social-preference games: A systematic lab-field study," Economics Working Papers 1462, Department of Economics and Business, Universitat Pompeu Fabra.
    5. Omar Al-Ubaydli & John A. List, 2013. "On the Generalizability of Experimental Results in Economics: With a Response to Commentors," CESifo Working Paper Series 4543, CESifo.
    6. Levitt, Steven D. & List, John A., 2009. "Field experiments in economics: The past, the present, and the future," European Economic Review, Elsevier, vol. 53(1), pages 1-18, January.
    7. Jan Stoop, 2014. "From the lab to the field: envelopes, dictators and manners," Experimental Economics, Springer;Economic Science Association, vol. 17(2), pages 304-313, June.
    8. James Alm & Kim M. Bloomquist & Michael McKee, 2015. "On The External Validity Of Laboratory Tax Compliance Experiments," Economic Inquiry, Western Economic Association International, vol. 53(2), pages 1170-1186, April.
    9. Michel André Maréchal & Christian Thöni, 2019. "Hidden Persuaders: Do Small Gifts Lubricate Business Negotiations?," Management Science, INFORMS, vol. 65(8), pages 3877-3888, August.
    10. John A. List, 2014. "Using Field Experiments to Change the Template of How We Teach Economics," The Journal of Economic Education, Taylor & Francis Journals, vol. 45(2), pages 81-89, June.
    11. Omar Al-Ubaydli & John List, 2012. "On the Generalizability of Experimental Results in Economics," Artefactual Field Experiments 00467, The Field Experiments Website.
    12. Slonim, Robert & Wang, Carmen & Garbarino, Ellen & Merrett, Danielle, 2013. "Opting-in: Participation bias in economic experiments," Journal of Economic Behavior & Organization, Elsevier, vol. 90(C), pages 43-70.
    13. Goeschl, Timo & Kettner, Sara Elisa & Lohse, Johannes & Schwieren, Christiane, 2015. "What do we learn from public good games about voluntary climate action? Evidence from an artefactual field experiment," Working Papers 0595, University of Heidelberg, Department of Economics.
    14. Omar Al-Ubaydli & John A. List, 2015. "Do Natural Field Experiments Afford Researchers More or Less Control Than Laboratory Experiments?," American Economic Review, American Economic Association, vol. 105(5), pages 462-466, May.
    15. Olivier Armantier & Amadou Boly, 2013. "Comparing Corruption in the Laboratory and in the Field in Burkina Faso and in Canada," Economic Journal, Royal Economic Society, vol. 123(12), pages 1168-1187, December.
    16. Johannes Abeler & Daniele Nosenzo, 2015. "Self-selection into laboratory experiments: pro-social motives versus monetary incentives," Experimental Economics, Springer;Economic Science Association, vol. 18(2), pages 195-214, June.
    17. List John A., 2007. "Field Experiments: A Bridge between Lab and Naturally Occurring Data," The B.E. Journal of Economic Analysis & Policy, De Gruyter, vol. 5(2), pages 1-47, April.
    18. Handberg, Øyvind Nystad & Angelsen, Arild, 2015. "Experimental tests of tropical forest conservation measures," Journal of Economic Behavior & Organization, Elsevier, vol. 118(C), pages 346-359.
    19. Al-Ubaydli, Omar & Boettke, Peter, 2010. "Markets as economizers of information: Field experimental examination of the “Hayek Hypothesis”," MPRA Paper 27660, University Library of Munich, Germany.
    20. Thomas S. Dee, 2014. "Stereotype Threat And The Student-Athlete," Economic Inquiry, Western Economic Association International, vol. 52(1), pages 173-182, January.

    More about this item

    JEL classification:

    • C9 - Mathematical and Quantitative Methods - - Design of Experiments
    • C91 - Mathematical and Quantitative Methods - - Design of Experiments - - - Laboratory, Individual Behavior
    • C92 - Mathematical and Quantitative Methods - - Design of Experiments - - - Laboratory, Group Behavior
    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nbr:nberwo:20877. See general information about how to correct material in RePEc.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, please contact the provider; general contact details are available at https://edirc.repec.org/data/nberrus.html.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis. RePEc uses bibliographic data supplied by the respective publishers.