What Can We Learn From Experiments? Understanding the Threats to the Scalability of Experimental Results

Author

Listed:
  • Omar Al-Ubaydli
  • John List
  • Dana Suskind

Abstract

Policymakers often consider interventions at the scale of the population, or some other large scale. One source of information about the potential effects of such interventions is experimental studies conducted at a much smaller scale. Treatment effects detected in these small-scale studies commonly diminish substantially when the intervention is implemented at the larger scale of interest to policymakers. This paper provides an overview of the main reasons for this breakdown in scalability. Understanding the principal mechanisms is a first step toward formulating countermeasures that promote scalability.
(This abstract was borrowed from another version of this item.)

Suggested Citation

  • Omar Al-Ubaydli & John List & Dana Suskind, 2017. "What Can We Learn From Experiments? Understanding the Threats to the Scalability of Experimental Results," Natural Field Experiments 00607, The Field Experiments Website.
  • Handle: RePEc:feb:natura:00607

    Download full text from publisher

    File URL: http://s3.amazonaws.com/fieldexperiments-papers2/papers/00607.pdf
    Download Restriction: no

    References listed on IDEAS

    1. Linda Babcock & George Loewenstein, 1997. "Explaining Bargaining Impasse: The Role of Self-Serving Biases," Journal of Economic Perspectives, American Economic Association, vol. 11(1), pages 109-126, Winter.
    2. Omar Al-Ubaydli & John A. List, 2015. "Do Natural Field Experiments Afford Researchers More or Less Control than Laboratory Experiments? A Simple Model," NBER Working Papers 20877, National Bureau of Economic Research, Inc.
    3. Roland G. Fryer & Steven D. Levitt & John A. List, 2008. "Exploring the Impact of Financial Incentives on Stereotype Threat: Evidence from a Pilot Study," American Economic Review, American Economic Association, vol. 98(2), pages 370-375, May.
    4. Vernon L. Smith, 1962. "An Experimental Study of Competitive Market Behavior," Journal of Political Economy, University of Chicago Press, vol. 70(2), pages 111-137.
    5. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    6. Diane Paulsell & Sarah Avellar & Emily Sama Martin & Patricia Del Grosso, "undated". "Home Visiting Evidence of Effectiveness Review: Executive Summary," Mathematica Policy Research Reports 5254a2ab30e146ce900220dbc, Mathematica Policy Research.
    7. Omar Al-Ubaydli & John A. List, 2015. "Do Natural Field Experiments Afford Researchers More or Less Control Than Laboratory Experiments?," American Economic Review, American Economic Association, vol. 105(5), pages 462-466, May.
    8. Neal S Young, 2008. "Why Current Publication May Distort Science," Working Papers id:1757, eSocialSciences.
    9. Neal S Young & John P A Ioannidis & Omar Al-Ubaydli, 2008. "Why Current Publication Practices May Distort Science," PLOS Medicine, Public Library of Science, vol. 5(10), pages 1-5, October.
    10. Steven D. Levitt & John A. List, 2007. "What Do Laboratory Experiments Measuring Social Preferences Reveal About the Real World?," Journal of Economic Perspectives, American Economic Association, vol. 21(2), pages 153-174, Spring.
    11. Zacharias Maniadis & Fabio Tufano & John A. List, 2014. "One Swallow Doesn't Make a Summer: New Evidence on Anchoring Effects," American Economic Review, American Economic Association, vol. 104(1), pages 277-290, January.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    2. Omar Al-Ubaydli & John List & Dana Suskind, 2019. "The science of using science: Towards an understanding of the threats to scaling experiments," Artefactual Field Experiments 00670, The Field Experiments Website.
    3. Omar Al-Ubaydli & John A. List, 2019. "How natural field experiments have enhanced our understanding of unemployment," Nature Human Behaviour, Nature, vol. 3(1), pages 33-39, January.
    4. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    5. Omar Al-Ubaydli & John List, 2016. "Field Experiments in Markets," Artefactual Field Experiments j0002, The Field Experiments Website.
    6. Eric Floyd & John A. List, 2016. "Using Field Experiments in Accounting and Finance," Journal of Accounting Research, Wiley Blackwell, vol. 54(2), pages 437-475, May.
    7. Omar Al-Ubaydli & John A. List & Dana Suskind, 2020. "2017 Klein Lecture: The Science Of Using Science: Toward An Understanding Of The Threats To Scalability," International Economic Review, Department of Economics, University of Pennsylvania and Osaka University Institute of Social and Economic Research Association, vol. 61(4), pages 1387-1409, November.
    8. Timothy N. Cason & Steven Y. Wu, 2019. "Subject Pools and Deception in Agricultural and Resource Economics Experiments," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 73(3), pages 743-758, July.
    9. Omar Al-Ubaydli & John A. List & Danielle LoRe & Dana Suskind, 2017. "Scaling for Economists: Lessons from the Non-Adherence Problem in the Medical Literature," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 125-144, Fall.
    10. Omar Al-Ubaydli & John A. List, 2015. "Do Natural Field Experiments Afford Researchers More or Less Control than Laboratory Experiments? A Simple Model," NBER Working Papers 20877, National Bureau of Economic Research, Inc.
    11. Omar Al-Ubaydli & John List, 2013. "On the Generalizability of Experimental Results in Economics: With A Response To Camerer," Artefactual Field Experiments j0001, The Field Experiments Website.
    12. Levin, Tova & Levitt, Steven D. & List, John A., 2023. "A Glimpse into the world of high capacity givers: Experimental evidence from a university capital campaign," Journal of Economic Behavior & Organization, Elsevier, vol. 212(C), pages 644-658.
    13. John List, 2021. "2021 Summary Data of Artefactual Field Experiments Published on Fieldexperiments.com," Artefactual Field Experiments 00749, The Field Experiments Website.
    14. Omar Al-Ubaydli & John A. List, 2013. "On the Generalizability of Experimental Results in Economics: With a Response to Commentors," CESifo Working Paper Series 4543, CESifo.
    15. Ghazala Azmat & Manuel Bagues & Antonio Cabrales & Nagore Iriberri, 2018. "What you don't know...Can't hurt you?: A natural field experiment on relative performance feedback in higher education," Sciences Po publications info:hdl:2441/5fhe3c1k6b8, Sciences Po.
    16. Ghazala Azmat & Manuel Bagues & Antonio Cabrales & Nagore Iriberri, 2018. "What you don’t know... Can’t hurt you? A natural field experiment on relative performance feedback in higher education," Sciences Po publications info:hdl:2441/5r0qo9lp3v9, Sciences Po.
    17. Christian A. Vossler, 2016. "Chamberlin Meets Ciriacy-Wantrup: Using Insights from Experimental Economics to Inform Stated Preference Research," Canadian Journal of Agricultural Economics/Revue canadienne d'agroeconomie, Canadian Agricultural Economics Society/Societe canadienne d'agroeconomie, vol. 64(1), pages 33-48, March.
    18. Ghazala Azmat & Manuel Bagues & Antonio Cabrales & Nagore Iriberri, 2019. "What You Don’t Know…Can’t Hurt You? A Natural Field Experiment on Relative Performance Feedback in Higher Education," Management Science, INFORMS, vol. 65(8), pages 3714-3736, August.
    19. Boomsma, Mirthe, 2021. "On the transition to a sustainable economy: Field experimental evidence on behavioral interventions," Other publications TiSEM a0a27602-10ed-4ab1-87a5-5, Tilburg University, School of Economics and Management.
    20. John List, 2022. "2021 Summary Data of Natural Field Experiments Published on Fieldexperiments.com," Natural Field Experiments 00747, The Field Experiments Website.

    More about this item

    JEL classification:

    • C21 - Mathematical and Quantitative Methods - - Single Equation Models; Single Variables - - - Cross-Sectional Models; Spatial Models; Treatment Effect Models
    • C90 - Mathematical and Quantitative Methods - - Design of Experiments - - - General
    • D82 - Microeconomics - - Information, Knowledge, and Uncertainty - - - Asymmetric and Private Information; Mechanism Design

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:feb:natura:00607. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: David Franks. General contact details of provider: http://www.fieldexperiments.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.