Printed from https://ideas.repec.org/a/aea/aecrev/v105y2015i5p462-66.html

Do Natural Field Experiments Afford Researchers More or Less Control Than Laboratory Experiments?

Author

Listed:
  • Omar Al-Ubaydli
  • John A. List

Abstract

A commonly held view is that laboratory experiments provide researchers with more "control" than natural field experiments. This paper explores how natural field experiments can in fact provide researchers with more control than laboratory experiments. While laboratory experiments give researchers a high degree of control over the environment in which participants agree to be experimental subjects, when participants systematically opt out of laboratory experiments, the researcher's ability to manipulate certain variables is limited. In contrast, natural field experiments bypass the participation decision altogether because of their covertness, and they allow for a potentially more diverse participant pool within the market of interest.

Suggested Citation

  • Omar Al-Ubaydli & John A. List, 2015. "Do Natural Field Experiments Afford Researchers More or Less Control Than Laboratory Experiments?," American Economic Review, American Economic Association, vol. 105(5), pages 462-466, May.
  • Handle: RePEc:aea:aecrev:v:105:y:2015:i:5:p:462-66
    Note: DOI: 10.1257/aer.p20151013

    Download full text from publisher

    File URL: http://www.aeaweb.org/articles.php?doi=10.1257/aer.p20151013
    Download Restriction: no

    File URL: https://www.aeaweb.org/aer/ds/10505/P2015_1013_ds.zip
    Download Restriction: Access to full text is restricted to AEA members and institutional subscribers.
    ---><---

    References listed on IDEAS

    1. Alan S. Gerber & Dean Karlan & Daniel Bergan, 2009. "Does the Media Matter? A Field Experiment Measuring the Effect of Newspapers on Voting Behavior and Political Opinions," American Economic Journal: Applied Economics, American Economic Association, vol. 1(2), pages 35-52, April.
    2. Vernon L. Smith & James M. Walker, 1993. "Monetary Rewards and Decision Cost in Experimental Economics," Economic Inquiry, Western Economic Association International, vol. 31(2), pages 245-261, April.
    3. Armin Falk & James J. Heckman, 2009. "Lab Experiments are a Major Source of Knowledge in the Social Sciences," Working Papers 200935, Geary Institute, University College Dublin.
    4. Marianne Bertrand & Sendhil Mullainathan, 2004. "Are Emily and Greg More Employable Than Lakisha and Jamal? A Field Experiment on Labor Market Discrimination," American Economic Review, American Economic Association, vol. 94(4), pages 991-1013, September.
    5. Guillaume R. Frechette & Andrew Schotter (ed.), 2015. "Handbook of Experimental Economic Methodology," OUP Catalogue, Oxford University Press, number 9780195328325, November.
    6. Omar Al-Ubaydli & John A. List, 2015. "Do Natural Field Experiments Afford Researchers More or Less Control than Laboratory Experiments? A Simple Model," NBER Working Papers 20877, National Bureau of Economic Research, Inc.
    7. repec:feb:framed:0081 is not listed on IDEAS
    8. John A. List, 2004. "Testing Neoclassical Competitive Theory in Multilateral Decentralized Markets," Journal of Political Economy, University of Chicago Press, vol. 112(5), pages 1131-1156, October.
    9. Glenn W. Harrison & Morten I. Lau & E. Elisabet Rutström, 2009. "Risk attitudes, randomization to treatment, and self-selection into experiments," Journal of Economic Behavior & Organization, Elsevier, vol. 70(3), pages 498-507, June.
    10. Steven D. Levitt & John A. List, 2007. "What Do Laboratory Experiments Measuring Social Preferences Reveal About the Real World?," Journal of Economic Perspectives, American Economic Association, vol. 21(2), pages 153-174, Spring.
    11. Uri Gneezy & John List & Michael Price, 2012. "Toward an Understanding of Why People Discriminate: Evidence from a Series of Natural Field Experiments," Natural Field Experiments 00592, The Field Experiments Website.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Gosnell, Greer & Metcalfe, Robert & List, John A, 2016. "A new approach to an age-old problem: solving externalities by incenting workers directly," LSE Research Online Documents on Economics 84331, London School of Economics and Political Science, LSE Library.
    2. Omar Al-Ubaydli & John List & Dana Suskind, 2019. "The science of using science: Towards an understanding of the threats to scaling experiments," Artefactual Field Experiments 00670, The Field Experiments Website.
    3. Omar Al-Ubaydli & John A. List, 2019. "How natural field experiments have enhanced our understanding of unemployment," Nature Human Behaviour, Nature, vol. 3(1), pages 33-39, January.
    4. Omar Al-Ubaydli & John A. List, 2016. "Field Experiments in Markets," NBER Working Papers 22113, National Bureau of Economic Research, Inc.
    5. Eric Floyd & John A. List, 2016. "Using Field Experiments in Accounting and Finance," Journal of Accounting Research, Wiley Blackwell, vol. 54(2), pages 437-475, May.
    6. Martin G. Kocher & David Schindler & Stefan T. Trautmann & Yilong Xu, 2019. "Risk, time pressure, and selection effects," Experimental Economics, Springer;Economic Science Association, vol. 22(1), pages 216-246, March.
    7. Jindrich Matousek & Tomas Havranek & Zuzana Irsova, 2022. "Individual discount rates: a meta-analysis of experimental evidence," Experimental Economics, Springer;Economic Science Association, vol. 25(1), pages 318-358, February.
    8. Ghazala Azmat & Manuel Bagues & Antonio Cabrales & Nagore Iriberri, 2018. "What you don’t know... Can’t hurt you? A natural field experiment on relative performance feedback in higher education," Sciences Po publications info:hdl:2441/5r0qo9lp3v9, Sciences Po.
    9. Czibor, Eszter & Claussen, Jörg & van Praag, Mirjam, 2019. "Women in a men’s world: Risk taking in an online card game community," Journal of Economic Behavior & Organization, Elsevier, vol. 158(C), pages 62-89.
    10. John List, 2021. "2021 Summary Data of Artefactual Field Experiments Published on Fieldexperiments.com," Artefactual Field Experiments 00749, The Field Experiments Website.
    11. John List, 2022. "2021 Summary Data of Natural Field Experiments Published on Fieldexperiments.com," Natural Field Experiments 00747, The Field Experiments Website.
    12. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    13. Omar Al-Ubaydli & John A. List & Dana L. Suskind, 2017. "What Can We Learn from Experiments? Understanding the Threats to the Scalability of Experimental Results," American Economic Review, American Economic Association, vol. 107(5), pages 282-286, May.
    14. Haghani, Milad & Sarvi, Majid, 2019. "Laboratory experimentation and simulation of discrete direction choices: Investigating hypothetical bias, decision-rule effect and external validity based on aggregate prediction measures," Transportation Research Part A: Policy and Practice, Elsevier, vol. 130(C), pages 134-157.
    15. Tova Levin & Steven Levitt & John List, 2015. "A Glimpse into the World of High Capacity Givers: Experimental Evidence from a University Capital Campaign," Natural Field Experiments 00409, The Field Experiments Website.
    16. Ghazala Azmat & Manuel Bagues & Antonio Cabrales & Nagore Iriberri, 2019. "What You Don’t Know…Can’t Hurt You? A Natural Field Experiment on Relative Performance Feedback in Higher Education," Management Science, INFORMS, vol. 65(8), pages 3714-3736, August.
    17. Omar Al‐Ubaydli & John A. List & Dana Suskind, 2020. "2017 Klein Lecture: The Science Of Using Science: Toward An Understanding Of The Threats To Scalability," International Economic Review, Department of Economics, University of Pennsylvania and Osaka University Institute of Social and Economic Research Association, vol. 61(4), pages 1387-1409, November.
    18. Timothy N. Cason & Steven Y. Wu, 2019. "Subject Pools and Deception in Agricultural and Resource Economics Experiments," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 73(3), pages 743-758, July.
    19. John List, 2022. "Framed Field Experiments: 2021 Summary on Fieldexperiments.com," Framed Field Experiments 00752, The Field Experiments Website.
    20. Stefan Bechtold & Christoph Engel, 2017. "The Valuation of Moral Rights: A Field Experiment," Discussion Paper Series of the Max Planck Institute for Research on Collective Goods 2017_04, Max Planck Institute for Research on Collective Goods.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Omar Al-Ubaydli & John A. List, 2015. "Do Natural Field Experiments Afford Researchers More or Less Control than Laboratory Experiments? A Simple Model," NBER Working Papers 20877, National Bureau of Economic Research, Inc.
    2. Omar Al-Ubaydli & John List, 2016. "Field Experiments in Markets," Artefactual Field Experiments j0002, The Field Experiments Website.
    3. Omar Al-Ubaydli & John A. List, 2019. "How natural field experiments have enhanced our understanding of unemployment," Nature Human Behaviour, Nature, vol. 3(1), pages 33-39, January.
    4. Omar Al-Ubaydli & John List, 2015. "Control in Experiments: A Simple Model," Artefactual Field Experiments 00397, The Field Experiments Website.
    5. Erik Snowberg & Leeat Yariv, 2018. "Testing the Waters: Behavior across Participant Pools," CESifo Working Paper Series 7136, CESifo.
    6. Omar Al-Ubaydli & John List, 2013. "On the Generalizability of Experimental Results in Economics: With A Response To Camerer," Artefactual Field Experiments j0001, The Field Experiments Website.
    7. Brian Albrecht & Omar Al-Ubaydli & Peter Boettke, 2022. "Testing the Hayek hypothesis: Recent theoretical and experimental evidence," Artefactual Field Experiments 00759, The Field Experiments Website.
    8. Johannes Abeler & Daniele Nosenzo, 2015. "Self-selection into laboratory experiments: pro-social motives versus monetary incentives," Experimental Economics, Springer;Economic Science Association, vol. 18(2), pages 195-214, June.
    9. Jeannette Brosig‐Koch & Burkhard Hehenkamp & Johanna Kokot, 2017. "The effects of competition on medical service provision," Health Economics, John Wiley & Sons, Ltd., vol. 26(S3), pages 6-20, December.
    10. Katherine Farrow & Gilles Grolleau & Naoufel Mzoughi, 2018. "What in the Word! The Scope for the Effect of Word Choice on Economic Behavior," Kyklos, Wiley Blackwell, vol. 71(4), pages 557-580, November.
    11. Brit Grosskopf & Graeme Pearce, 2020. "Do You Mind Me Paying Less? Measuring Other-Regarding Preferences in the Market for Taxis," Management Science, INFORMS, vol. 66(11), pages 5059-5074, November.
    12. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    13. List John A., 2007. "Field Experiments: A Bridge between Lab and Naturally Occurring Data," The B.E. Journal of Economic Analysis & Policy, De Gruyter, vol. 5(2), pages 1-47, April.
    14. Al-Ubaydli, Omar & Boettke, Peter, 2010. "Markets as economizers of information: Field experimental examination of the “Hayek Hypothesis”," MPRA Paper 27660, University Library of Munich, Germany.
    15. Stephan Muehlbacher & Andre Hartmann & Erich Kirchler & James Alm, 2022. "Declaring income versus declaring taxes in tax compliance experiments: Does the design of laboratory experiments affect the results?," Working Papers 2210, Tulane University, Department of Economics.
    16. Sven Grüner & Mira Lehberger & Norbert Hirschauer & Oliver Mußhoff, 2022. "How (un)informative are experiments with students for other social groups? A study of agricultural students and farmers," Australian Journal of Agricultural and Resource Economics, Australian Agricultural and Resource Economics Society, vol. 66(3), pages 471-504, July.
    17. John List, 2021. "2021 Summary Data of Artefactual Field Experiments Published on Fieldexperiments.com," Artefactual Field Experiments 00749, The Field Experiments Website.
    18. Francisco B. Galarza, 2017. "Trust and Trustworthiness in College: An Experimental Analysis," Working Papers 17-03, Centro de Investigación, Universidad del Pacífico.
    19. Andrea Albertazzi, 2022. "Individual cheating in the lab: a new measure and external validity," Theory and Decision, Springer, vol. 93(1), pages 37-67, July.
    20. Omar Al-Ubaydli & John A. List, 2013. "On the Generalizability of Experimental Results in Economics: With a Response to Commentors," CESifo Working Paper Series 4543, CESifo.

    More about this item

    JEL classification:

    • C91 - Mathematical and Quantitative Methods - - Design of Experiments - - - Laboratory, Individual Behavior
    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:aea:aecrev:v:105:y:2015:i:5:p:462-66. See general information about how to correct material in RePEc.


If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link it to an item in RePEc, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations awaiting confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Michael P. Albert (email available below). General contact details of provider: https://edirc.repec.org/data/aeaaaea.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis. RePEc uses bibliographic data supplied by the respective publishers.