Printed from https://ideas.repec.org/p/qld/uq2004/663.html

Scientific Inference from Field and Laboratory Economic Experiments: Empirical Evidence

Author

Listed:
  • Jonathan H.W. Tan

    (Department of Economics, School of Social Sciences, Nanyang Technological University)

  • Zhao Zichen

    (Department of Economics, School of Social Sciences, Nanyang Technological University)

  • Daniel John Zizzo

    (School of Economics, University of Queensland, Brisbane, Australia)

Abstract

Field experiments can help improve scientific inference by providing access to diverse samples that are representative in terms of demographic backgrounds, and by enabling the use of assets that relate directly to the economic problem of interest. We present a study comparing claims based on laboratory and field experiments in 520 publications in 2018 and 2019 at leading general and field journals in economics. Each paper is surveyed for its key claims and for matches along five dimensions: the profession, age, and gender of experimental subjects; the country of the experiment; and the experimental asset in relation to which a claim is made. We find that, particularly in the realm of policy testing, field experiments are more likely to match the key claims than laboratory experiments. However, depending on the dimension, between less than 20% and only around 65% of field experiments, including natural field experiments, achieve a match. Around four out of five field experiments fail to match in at least three out of the five dimensions. We conclude that the methodological challenge of generalizing results beyond what is within the domain of the experiments themselves also applies to many papers based on field experiments, given the claims being made. In addition, we find that publications by authors at top 20 institutions, or with experiments conducted in Caucasian-majority countries, have a substantially higher likelihood of wide generalizations.

Suggested Citation

  • Jonathan H.W. Tan & Zhao Zichen & Daniel John Zizzo, 2023. "Scientific Inference from Field and Laboratory Economic Experiments: Empirical Evidence," Discussion Papers Series 663, School of Economics, University of Queensland, Australia.
  • Handle: RePEc:qld:uq2004:663

    Download full text from publisher

    File URL: https://economics.uq.edu.au/files/45122/663.pdf
    Download Restriction: no

    References listed on IDEAS

    1. Charles F. Manski, 2013. "Response to the Review of ‘Public Policy in an Uncertain World’," Economic Journal, Royal Economic Society, vol. 0, pages 412-415, August.
    2. Jordi Brandts & Gary Charness, 2011. "The strategy versus the direct-response method: a first survey of experimental comparisons," Experimental Economics, Springer;Economic Science Association, vol. 14(3), pages 375-398, September.
    3. Carrera, Mariana & Royer, Heather & Stehr, Mark & Sydnor, Justin, 2018. "Can financial incentives help people trying to establish new habits? Experimental evidence with new gym members," Journal of Health Economics, Elsevier, vol. 58(C), pages 202-214.
    4. Crockett, Erin & Crockett, Sean, 2019. "Endowments and risky choice," Journal of Economic Behavior & Organization, Elsevier, vol. 159(C), pages 344-354.
    5. Oriana Bandiera & Iwan Barankay & Imran Rasul, 2005. "Social Preferences and the Response to Incentives: Evidence from Personnel Data," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 120(3), pages 917-962.
    6. John A. List, 2007. "On the Interpretation of Giving in Dictator Games," Journal of Political Economy, University of Chicago Press, vol. 115, pages 482-493.
    7. Armin Falk & James J. Heckman, 2009. "Lab Experiments are a Major Source of Knowledge in the Social Sciences," Working Papers 200935, Geary Institute, University College Dublin.
    8. Cilliers, Jacobus & Kasirye, Ibrahim & Leaver, Clare & Serneels, Pieter & Zeitlin, Andrew, 2018. "Pay for locally monitored performance? A welfare analysis for teacher attendance in Ugandan primary schools," Journal of Public Economics, Elsevier, vol. 167(C), pages 69-90.
    9. Davis, Douglas D. & Holt, Charles A., 1993. "Experimental economics: Methods, problems and promise," Estudios Económicos, El Colegio de México, Centro de Estudios Económicos, vol. 8(2), pages 179-212.
    10. Frey, Bruno S & Oberholzer-Gee, Felix, 1997. "The Cost of Price Incentives: An Empirical Analysis of Motivation Crowding-Out," American Economic Review, American Economic Association, vol. 87(4), pages 746-755, September.
    11. Omar Al-Ubaydli & John A. List, 2015. "Do Natural Field Experiments Afford Researchers More or Less Control than Laboratory Experiments? A Simple Model," NBER Working Papers 20877, National Bureau of Economic Research, Inc.
    12. Aimone, Jason A. & North, Charles & Rentschler, Lucas, 2019. "Priming the jury by asking for Donations: An empirical and experimental study," Journal of Economic Behavior & Organization, Elsevier, vol. 160(C), pages 158-167.
    13. Natalia Candelo & Rachel T. A. Croson & Catherine Eckel, 2018. "Transmission of information within transnational social networks: a field experiment," Experimental Economics, Springer;Economic Science Association, vol. 21(4), pages 905-923, December.
    14. Büyükboyacı, Mürüvvet & Gürdal, Mehmet Y. & Kıbrıs, Arzu & Kıbrıs, Özgür, 2019. "An experimental study of the investment implications of bankruptcy laws," Journal of Economic Behavior & Organization, Elsevier, vol. 158(C), pages 607-629.
    15. Zizzo, Daniel John, 2013. "Claims and confounds in economic experiments," Journal of Economic Behavior & Organization, Elsevier, vol. 93(C), pages 186-195.
    16. John A. List, 2006. "The Behavioralist Meets the Market: Measuring Social Preferences and Reputation Effects in Actual Transactions," Journal of Political Economy, University of Chicago Press, vol. 114(1), pages 1-37, February.
    17. Karthik Muralidharan & Abhijeet Singh & Alejandro J. Ganimian, 2019. "Disrupting Education? Experimental Evidence on Technology-Aided Instruction in India," American Economic Review, American Economic Association, vol. 109(4), pages 1426-1460, April.
    18. List John A., 2007. "Field Experiments: A Bridge between Lab and Naturally Occurring Data," The B.E. Journal of Economic Analysis & Policy, De Gruyter, vol. 5(2), pages 1-47, April.
    19. Laury, Susan K. & Taylor, Laura O., 2008. "Altruism spillovers: Are behaviors in context-free experiments predictive of altruism toward a naturally occurring public good?," Journal of Economic Behavior & Organization, Elsevier, vol. 65(1), pages 9-29, January.
    20. Agnès Festré & Pierre Garrouste, 2015. "Theory And Evidence In Psychology And Economics About Motivation Crowding Out: A Possible Convergence?," Journal of Economic Surveys, Wiley Blackwell, vol. 29(2), pages 339-356, April.
    21. Yan Chen & Ming Jiang & Erin L. Krupka, 2019. "Hunger and the gender gap," Experimental Economics, Springer;Economic Science Association, vol. 22(4), pages 885-917, December.
    22. John A. List, 2003. "Does Market Experience Eliminate Market Anomalies?," The Quarterly Journal of Economics, Oxford University Press, vol. 118(1), pages 41-71.
    23. Brian E. Roe & David R. Just, 2009. "Internal and External Validity in Economics Research: Tradeoffs between Experiments, Field Experiments, Natural Experiments, and Field Data," American Journal of Agricultural Economics, Agricultural and Applied Economics Association, vol. 91(5), pages 1266-1271.
    24. Adnan Q. Khan & Asim Ijaz Khwaja & Benjamin A. Olken, 2019. "Making Moves Matter: Experimental Evidence on Incentivizing Bureaucrats through Performance-Based Postings," American Economic Review, American Economic Association, vol. 109(1), pages 237-270, January.
    25. Ernesto Reuben & Sherry Xin Li & Sigrid Suetens & Andrej Svorenčík & Theodore Turocy & Vasileios Kotsidis, 2022. "Trends in the publication of experimental economics articles," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 8(1), pages 1-15, December.
    26. Alexander W. Cappelen & Ulrik H. Nielsen & Bertil Tungodden & Jean-Robert Tyran & Erik Wengström, 2016. "Fairness is intuitive," Experimental Economics, Springer;Economic Science Association, vol. 19(4), pages 727-740, December.
    27. Jonathan de Quidt & Johannes Haushofer & Christopher Roth, 2018. "Measuring and Bounding Experimenter Demand," American Economic Review, American Economic Association, vol. 108(11), pages 3266-3302, November.
    28. Glenn W. Harrison, 2013. "Field experiments and methodological intolerance," Journal of Economic Methodology, Taylor & Francis Journals, vol. 20(2), pages 103-117, June.
    29. Kareem Haggag & Devin G Pope & Kinsey B Bryant-Lees & Maarten W Bos, 2019. "Attribution Bias in Consumer Choice," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 86(5), pages 2136-2183.
    30. Belot, Michele & Duch, Raymond & Miller, Luis, 2015. "A comprehensive comparison of students and non-students in classic experimental games," Journal of Economic Behavior & Organization, Elsevier, vol. 113(C), pages 26-33.
    31. Arthur Schram, 2005. "Artificiality: The tension between internal and external validity in economic experiments," Journal of Economic Methodology, Taylor & Francis Journals, vol. 12(2), pages 225-237.
    32. Joseph Henrich & Steve J. Heine & Ara Norenzayan, 2010. "The Weirdest People in the World?," RatSWD Working Papers 139, German Data Forum (RatSWD).
    33. Manski, Charles F., 2013. "Public Policy in an Uncertain World: Analysis and Decisions," Economics Books, Harvard University Press, number 9780674066892, Spring.
    35. John A. List, 2011. "Why Economists Should Conduct Field Experiments and 14 Tips for Pulling One Off," Journal of Economic Perspectives, American Economic Association, vol. 25(3), pages 3-16, Summer.
    36. Levitt, Steven D. & List, John A., 2009. "Field experiments in economics: The past, the present, and the future," European Economic Review, Elsevier, vol. 53(1), pages 1-18, January.
    37. Uri Gneezy & Ernan Haruvy & Hadas Yafe, 2004. "The inefficiency of splitting the bill," Economic Journal, Royal Economic Society, vol. 114(495), pages 265-280, April.
    38. Thomas Bossuroy & Clara Delavallade, 2016. "Experiments, policy, and theory in development economics: a response to Glenn Harrison’s ‘field experiments and methodological intolerance’," Journal of Economic Methodology, Taylor & Francis Journals, vol. 23(2), pages 147-156, June.
    40. Omar Al-Ubaydli & John A. List, 2015. "Do Natural Field Experiments Afford Researchers More or Less Control Than Laboratory Experiments?," American Economic Review, American Economic Association, vol. 105(5), pages 462-466, May.
    41. Steven D. Levitt & John A. List, 2007. "What Do Laboratory Experiments Measuring Social Preferences Reveal About the Real World?," Journal of Economic Perspectives, American Economic Association, vol. 21(2), pages 153-174, Spring.
    42. Joseph Henrich & Steven J. Heine & Ara Norenzayan, 2010. "Most people are not WEIRD," Nature, Nature, vol. 466(7302), pages 29-29, July.
    43. Timothy N. Cason & Charles R. Plott, 2014. "Misconceptions and Game Form Recognition: Challenges to Theories of Revealed Preference and Framing," Journal of Political Economy, University of Chicago Press, vol. 122(6), pages 1235-1270.
    44. Rachel Croson & Uri Gneezy, 2009. "Gender Differences in Preferences," Journal of Economic Literature, American Economic Association, vol. 47(2), pages 448-474, June.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Matteo M. Galizzi & Daniel Navarro-Martinez, 2019. "On the External Validity of Social Preference Games: A Systematic Lab-Field Study," Management Science, INFORMS, vol. 65(3), pages 976-1002, March.
    2. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    3. Omar Al-Ubaydli & John List, 2016. "Field Experiments in Markets," Artefactual Field Experiments j0002, The Field Experiments Website.
    4. Omar Al-Ubaydli & John A. List, 2013. "On the Generalizability of Experimental Results in Economics: With a Response to Commentors," CESifo Working Paper Series 4543, CESifo.
    5. John A. List, 2024. "Optimally generate policy-based evidence before scaling," Nature, Nature, vol. 626(7999), pages 491-499, February.
    6. Omar Al-Ubaydli & John A. List, 2019. "How natural field experiments have enhanced our understanding of unemployment," Nature Human Behaviour, Nature, vol. 3(1), pages 33-39, January.
    7. Omar Al-Ubaydli & John List, 2013. "On the Generalizability of Experimental Results in Economics: With A Response To Camerer," Artefactual Field Experiments j0001, The Field Experiments Website.
    8. John List, 2021. "2021 Summary Data of Artefactual Field Experiments Published on Fieldexperiments.com," Artefactual Field Experiments 00749, The Field Experiments Website.
    9. John List, 2022. "2021 Summary Data of Natural Field Experiments Published on Fieldexperiments.com," Natural Field Experiments 00747, The Field Experiments Website.
    10. John List, 2022. "Framed Field Experiments: 2021 Summary on Fieldexperiments.com," Framed Field Experiments 00752, The Field Experiments Website.
    11. Goeschl, Timo & Kettner, Sara Elisa & Lohse, Johannes & Schwieren, Christiane, 2020. "How much can we learn about voluntary climate action from behavior in public goods games?," Ecological Economics, Elsevier, vol. 171(C).
    12. Bouma, J.A. & Nguyen, Binh & van der Heijden, Eline & Dijk, J.J., 2018. "Analysing Group Contract Design Using a Lab and a Lab-in-the-Field Threshold Public Good Experiment," Discussion Paper 2018-049, Tilburg University, Center for Economic Research.
    13. Hallsworth, Michael & List, John A. & Metcalfe, Robert D. & Vlaev, Ivo, 2017. "The behavioralist as tax collector: Using natural field experiments to enhance tax compliance," Journal of Public Economics, Elsevier, vol. 148(C), pages 14-31.
    14. Timothy N. Cason & Steven Y. Wu, 2019. "Subject Pools and Deception in Agricultural and Resource Economics Experiments," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 73(3), pages 743-758, July.
    15. Nicolas Jacquemet & Olivier L’Haridon & Isabelle Vialle, 2014. "Marché du travail, évaluation et économie expérimentale," Revue française d'économie, Presses de Sciences-Po, vol. 0(1), pages 189-226.
    16. John A. List, 2014. "Using Field Experiments to Change the Template of How We Teach Economics," The Journal of Economic Education, Taylor & Francis Journals, vol. 45(2), pages 81-89, June.
    17. Omar Al-Ubaydli & John List, 2012. "On the Generalizability of Experimental Results in Economics," Artefactual Field Experiments 00467, The Field Experiments Website.
    18. Crawford, Ian & Harris, Donna, 2018. "Social interactions and the influence of “extremists”," Journal of Economic Behavior & Organization, Elsevier, vol. 153(C), pages 238-266.
    20. Gruener, Sven & Lehberger, Mira & Hirschauer, Norbert & Mußhoff, Oliver, 2021. "How (un-)informative are experiments with “standard subjects” for other social groups? – The case of agricultural students and farmers," SocArXiv psda5, Center for Open Science.

    More about this item

    Keywords

    experimental economics; lab experiments; field experiments; validity

    JEL classification:

    • C18 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Methodological Issues: General
    • C90 - Mathematical and Quantitative Methods - - Design of Experiments - - - General



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.