
The science of using science: Towards an understanding of the threats to scaling experiments

Author

Listed:
  • Omar Al-Ubaydli
  • John List
  • Dana Suskind

Abstract

Policymakers are increasingly turning to insights gained from the experimental method as a means of informing public policies. Whether, and to what extent, insights from a research study scale to the level of the broader public is, in many situations, a matter of blind faith. This scale-up problem can lead to a vast waste of resources, a missed opportunity to improve people's lives, and a diminution in the public's trust in the scientific method's ability to contribute to policymaking. This study provides a theoretical lens to deepen our understanding of the science of how to use science. Through a simple model, we highlight three elements of the scale-up problem: (1) when evidence becomes actionable (appropriate statistical inference); (2) properties of the population; and (3) properties of the situation. We argue that until these three areas are fully understood and recognized by researchers and policymakers, the threats to scalability will leave any scaling exercise particularly vulnerable. In this way, our work represents a challenge to empiricists: estimate the nature and practical importance of the various threats to scalability, and address them in their original research designs.
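The first of these elements, when evidence becomes actionable, can be made concrete with the post-study probability logic of Maniadis, Tufano and List (2014), cited in the references below: the probability that a statistically significant finding reflects a true effect depends on the prior plausibility of the hypothesis, the test's power, and the significance level. The sketch below is a minimal Python illustration of that arithmetic, not the paper's own model; the function name and the example numbers are our assumptions.

    # Post-study probability: P(effect is real | significant result), in the
    # spirit of Maniadis, Tufano and List (2014). Illustrative sketch only.
    def post_study_probability(prior, power, alpha=0.05):
        # prior: prior probability that the tested effect is real
        # power: chance the test detects a real effect (1 - beta)
        # alpha: significance level (false-positive rate)
        true_positives = power * prior            # real effects correctly flagged
        false_positives = alpha * (1.0 - prior)   # null effects flagged by chance
        return true_positives / (true_positives + false_positives)

    # A novel hypothesis (prior 0.10) tested with conventional power and alpha:
    print(post_study_probability(0.10, 0.80))  # ~0.64
    # After replications raise the prior to 0.50, credibility improves sharply:
    print(post_study_probability(0.50, 0.80))  # ~0.94

On this arithmetic, a single significant result for a surprising hypothesis still leaves roughly a one-in-three chance of a false positive, which is why the first threat above concerns the inferential standard evidence must meet before a program is scaled.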

Suggested Citation

  • Omar Al-Ubaydli & John List & Dana Suskind, 2019. "The science of using science: Towards an understanding of the threats to scaling experiments," Artefactual Field Experiments 00670, The Field Experiments Website.
  • Handle: RePEc:feb:artefa:00670

    Download full text from publisher

    File URL: http://s3.amazonaws.com/fieldexperiments-papers2/papers/00670.pdf
    Download Restriction: no


    References listed on IDEAS

    1. Abhijit Banerjee & Rukmini Banerji & James Berry & Esther Duflo & Harini Kannan & Shobhini Mukerji & Marc Shotland & Michael Walton, 2017. "From Proof of Concept to Scalable Policies: Challenges and Solutions, with an Application," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 73-102, Fall.
    2. Omar Al-Ubaydli & John A. List, 2015. "Do Natural Field Experiments Afford Researchers More or Less Control than Laboratory Experiments? A Simple Model," NBER Working Papers 20877, National Bureau of Economic Research, Inc.
    3. Roland Fryer & Steven Levitt & John List & Sally Sadoff, 2012. "Enhancing the Efficacy of Teacher Incentives through Loss Aversion: A Field Experiment," Framed Field Experiments 00591, The Field Experiments Website.
    4. Andreas Lange & John A. List & Michael K. Price, 2007. "Using Lotteries To Finance Public Goods: Theory And Experimental Evidence," International Economic Review, Department of Economics, University of Pennsylvania and Osaka University Institute of Social and Economic Research Association, vol. 48(3), pages 901-927, August.
    5. Levitt, Steven D. & List, John A., 2009. "Field experiments in economics: The past, the present, and the future," European Economic Review, Elsevier, vol. 53(1), pages 1-18, January.
    6. Omar Al-Ubaydli & John A. List & Dana L. Suskind, 2017. "What Can We Learn from Experiments? Understanding the Threats to the Scalability of Experimental Results," American Economic Review, American Economic Association, vol. 107(5), pages 282-286, May.
    7. James Heckman & Hidehiko Ichimura & Jeffrey Smith & Petra Todd, 1998. "Characterizing Selection Bias Using Experimental Data," Econometrica, Econometric Society, vol. 66(5), pages 1017-1098, September.
    8. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    9. Fuhai Hong & Tanjim Hossain & John A. List & Migiwa Tanaka, 2018. "Testing The Theory Of Multitasking: Evidence From A Natural Field Experiment In Chinese Factories," International Economic Review, Department of Economics, University of Pennsylvania and Osaka University Institute of Social and Economic Research Association, vol. 59(2), pages 511-536, May.
    10. Michael J. Weiss & Howard S. Bloom & Thomas Brock, 2014. "A Conceptual Framework For Studying The Sources Of Variation In Program Effects," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 33(3), pages 778-808, June.
    11. Zacharias Maniadis & Fabio Tufano & John A. List, 2014. "One Swallow Doesn't Make a Summer: New Evidence on Anchoring Effects," American Economic Review, American Economic Association, vol. 104(1), pages 277-290, January.
    12. John A. List, 2011. "The Market for Charitable Giving," Journal of Economic Perspectives, American Economic Association, vol. 25(2), pages 157-180, Spring.
    13. Christopher Jepsen & Steven Rivkin, 2009. "Class Size Reduction and Student Achievement: The Potential Tradeoff between Teacher Quality and Class Size," Journal of Human Resources, University of Wisconsin Press, vol. 44(1).
    14. Karthik Muralidharan & Paul Niehaus, 2017. "Experimentation at Scale," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 103-124, Fall.
    15. Omar Al-Ubaydli & John A. List & Danielle LoRe & Dana Suskind, 2017. "Scaling for Economists: Lessons from the Non-Adherence Problem in the Medical Literature," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 125-144, Fall.
    16. Omar Al-Ubaydli & John A. List, 2015. "Do Natural Field Experiments Afford Researchers More or Less Control Than Laboratory Experiments?," American Economic Review, American Economic Association, vol. 105(5), pages 462-466, May.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Albert Bravo-Biosca, 2019. "Experimental Innovation Policy," SPRU Working Paper Series 2019-19, SPRU - Science Policy Research Unit, University of Sussex Business School.

    More about this item

    JEL classification:

    • C9 - Mathematical and Quantitative Methods - - Design of Experiments
    • C90 - Mathematical and Quantitative Methods - - Design of Experiments - - - General
    • C91 - Mathematical and Quantitative Methods - - Design of Experiments - - - Laboratory, Individual Behavior
    • C92 - Mathematical and Quantitative Methods - - Design of Experiments - - - Laboratory, Group Behavior
    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments
    • D03 - Microeconomics - - General - - - Behavioral Microeconomics: Underlying Principles


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:feb:artefa:00670. See general information about how to correct material in RePEc.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact Joe Seidel. General contact details of provider: http://www.fieldexperiments.com.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register. This allows you to link your profile to this item, and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis. RePEc uses bibliographic data supplied by the respective publishers.