Printed from https://ideas.repec.org/p/nbr/nberwo/25848.html

The Science of Using Science: Towards an Understanding of the Threats to Scaling Experiments

Authors
  • Omar Al-Ubaydli
  • John A. List
  • Dana Suskind

Abstract

Policymakers are increasingly turning to insights gained from the experimental method as a means of informing public policies. Whether, and to what extent, insights from a research study scale to the level of the broader public is, in many situations, based on blind faith. This scale-up problem can lead to a vast waste of resources, a missed opportunity to improve people’s lives, and a diminution in the public’s trust in the scientific method’s ability to contribute to policymaking. This study provides a theoretical lens to deepen our understanding of the science of how to use science. Through a simple model, we highlight three elements of the scale-up problem: (1) when evidence becomes actionable (appropriate statistical inference); (2) the properties of the population; and (3) the properties of the situation. We argue that until these three areas are fully understood and recognized by researchers and policymakers, the threats to scalability will render any scaling exercise particularly vulnerable. In this way, our work represents a challenge to empiricists to estimate the nature and practical importance of the various threats to scalability, and to account for them in their original research.
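The first of these elements, when evidence becomes actionable, is commonly framed in terms of the post-study probability that a statistically significant finding reflects a true effect, following Maniadis, Tufano and List (2014), listed in the references below. The sketch below is an illustrative calculation of that quantity under assumed values for the prior, power, and significance level; it is not the paper's own model or code.

```python
def post_study_probability(prior, power, alpha):
    """Probability that a statistically significant finding reflects a true effect,
    given the prior probability of a true association, the test's power, and its size alpha."""
    true_positives = prior * power          # true effects correctly declared significant
    false_positives = (1 - prior) * alpha   # null effects declared significant by chance
    return true_positives / (true_positives + false_positives)

# Assumed illustrative values: a 1-in-10 prior that the tested effect is real,
# 80% power, and a 5% significance level.
print(round(post_study_probability(prior=0.10, power=0.80, alpha=0.05), 2))  # -> 0.64
```

Under these assumed values, roughly a third of significant findings would not reflect true effects, which is why the actionability question matters before any attempt to scale.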

Suggested Citation

  • Omar Al-Ubaydli & John A. List & Dana Suskind, 2019. "The Science of Using Science: Towards an Understanding of the Threats to Scaling Experiments," NBER Working Papers 25848, National Bureau of Economic Research, Inc.
  • Handle: RePEc:nbr:nberwo:25848
    Note: DEV ED PE TWP

    Download full text from publisher

    File URL: http://www.nber.org/papers/w25848.pdf
    Download Restriction: no

    References listed on IDEAS

    1. Abhijit Banerjee & Rukmini Banerji & James Berry & Esther Duflo & Harini Kannan & Shobhini Mukerji & Marc Shotland & Michael Walton, 2017. "From Proof of Concept to Scalable Policies: Challenges and Solutions, with an Application," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 73-102, Fall.
    2. Omar Al-Ubaydli & John A. List, 2015. "Do Natural Field Experiments Afford Researchers More or Less Control than Laboratory Experiments? A Simple Model," NBER Working Papers 20877, National Bureau of Economic Research, Inc.
    3. Roland Fryer & Steven Levitt & John List & Sally Sadoff, 2012. "Enhancing the Efficacy of Teacher Incentives through Loss Aversion: A Field Experiment," Framed Field Experiments 00591, The Field Experiments Website.
    4. John A. List, 2011. "The Market for Charitable Giving," Journal of Economic Perspectives, American Economic Association, vol. 25(2), pages 157-180, Spring.
    5. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    6. Andreas Lange & John A. List & Michael K. Price, 2007. "Using Lotteries To Finance Public Goods: Theory And Experimental Evidence," International Economic Review, Department of Economics, University of Pennsylvania and Osaka University Institute of Social and Economic Research Association, vol. 48(3), pages 901-927, August.
    7. Karthik Muralidharan & Paul Niehaus, 2017. "Experimentation at Scale," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 103-124, Fall.
    8. Fuhai Hong & Tanjim Hossain & John A. List & Migiwa Tanaka, 2018. "Testing The Theory Of Multitasking: Evidence From A Natural Field Experiment In Chinese Factories," International Economic Review, Department of Economics, University of Pennsylvania and Osaka University Institute of Social and Economic Research Association, vol. 59(2), pages 511-536, May.
    9. Christopher Jepsen & Steven Rivkin, 2009. "Class Size Reduction and Student Achievement: The Potential Tradeoff between Teacher Quality and Class Size," Journal of Human Resources, University of Wisconsin Press, vol. 44(1).
    10. repec:feb:artefa:0087 is not listed on IDEAS
    11. Levitt, Steven D. & List, John A., 2009. "Field experiments in economics: The past, the present, and the future," European Economic Review, Elsevier, vol. 53(1), pages 1-18, January.
    12. Omar Al-Ubaydli & John A. List & Danielle LoRe & Dana Suskind, 2017. "Scaling for Economists: Lessons from the Non-Adherence Problem in the Medical Literature," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 125-144, Fall.
    13. Omar Al-Ubaydli & John A. List & Dana L. Suskind, 2017. "What Can We Learn from Experiments? Understanding the Threats to the Scalability of Experimental Results," American Economic Review, American Economic Association, vol. 107(5), pages 282-286, May.
    14. James Heckman & Hidehiko Ichimura & Jeffrey Smith & Petra Todd, 1998. "Characterizing Selection Bias Using Experimental Data," Econometrica, Econometric Society, vol. 66(5), pages 1017-1098, September.
    15. repec:mpr:mprres:5039 is not listed on IDEAS
    16. Omar Al-Ubaydli & John A. List, 2015. "Do Natural Field Experiments Afford Researchers More or Less Control Than Laboratory Experiments?," American Economic Review, American Economic Association, vol. 105(5), pages 462-466, May.
    17. Michael J. Weiss & Howard S. Bloom & Thomas Brock, 2014. "A Conceptual Framework For Studying The Sources Of Variation In Program Effects," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 33(3), pages 778-808, June.
    18. Zacharias Maniadis & Fabio Tufano & John A. List, 2014. "One Swallow Doesn't Make a Summer: New Evidence on Anchoring Effects," American Economic Review, American Economic Association, vol. 104(1), pages 277-290, January.
    19. repec:mpr:mprres:6762 is not listed on IDEAS

    Citations

    Blog mentions

    As found by EconAcademics.org, the blog aggregator for Economics research:
    1. Policymaking Is Not a Science (Yet) (Ep. 405)
      by Stephen J. Dubner in Freakonomics on 2020-02-13 04:00:34
    2. Policymaking Is Not a Science (Yet) (Ep. 405 Rebroadcast)
      by Stephen J. Dubner in Freakonomics on 2021-03-25 03:00:27

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Agostinelli, Francesco & Avitabile, Ciro & Bobba, Matteo, 2021. "Enhancing Human Capital at Scale," TSE Working Papers 21-1196, Toulouse School of Economics (TSE).
    2. Andreas Löschel & Matthias Rodemeier & Madeline Werthschulte, 2020. "When Nudges Fail to Scale: Field Experimental Evidence from Goal Setting on Mobile Phones," CESifo Working Paper Series 8485, CESifo.
    3. Matilde Giaccherini & David Herberich & David Jimenez-Gomez & John List & Giovanni Ponti & Michael Price, 2020. "Are Economics and Psychology Complements in Household Technology Diffusion? Evidence from a Natural Field Experiment," Natural Field Experiments 00713, The Field Experiments Website.
    4. Albert Bravo-Biosca, 2020. "Experimental Innovation Policy," Innovation Policy and the Economy, University of Chicago Press, vol. 20(1), pages 191-232.
    5. Eliot Abrams & Jonathan Libgober & John A. List, 2020. "Research Registries: Facts, Myths, and Possible Improvements," NBER Working Papers 27250, National Bureau of Economic Research, Inc.
    6. Difang Huang & Zhengyang Bao, 2020. "Gender Differences in Reaction to Enforcement Mechanisms: A Large-Scale Natural Field Experiment," Monash Economics Working Papers 08-20, Monash University, Department of Economics.
    7. Matilde Giaccherini & David H. Herberich & David Jimenez-Gomez & John A. List & Giovanni Ponti & Michael K. Price, 2019. "The Behavioralist Goes Door-To-Door: Understanding Household Technological Diffusion Using a Theory-Driven Natural Field Experiment," NBER Working Papers 26173, National Bureau of Economic Research, Inc.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    2. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    3. Omar Al-Ubaydli & John A. List, 2019. "How natural field experiments have enhanced our understanding of unemployment," Nature Human Behaviour, Nature, vol. 3(1), pages 33-39, January.
    4. Justman, Moshe, 2018. "Randomized controlled trials informing public policy: Lessons from project STAR and class size reduction," European Journal of Political Economy, Elsevier, vol. 54(C), pages 167-174.
    5. Omar Al‐Ubaydli & John A. List & Dana Suskind, 2020. "2017 Klein Lecture: The Science Of Using Science: Toward An Understanding Of The Threats To Scalability," International Economic Review, Department of Economics, University of Pennsylvania and Osaka University Institute of Social and Economic Research Association, vol. 61(4), pages 1387-1409, November.
    6. Andor, Mark A. & Fels, Katja M. & Renz, Jan & Rzepka, Sylvi, 2018. "Do planning prompts increase educational success? Evidence from randomized controlled trials in MOOCs," Ruhr Economic Papers 790, RWI - Leibniz-Institut für Wirtschaftsforschung, Ruhr-University Bochum, TU Dortmund University, University of Duisburg-Essen.
    7. Omar Al-Ubaydli & John A. List & Dana L. Suskind, 2017. "What Can We Learn from Experiments? Understanding the Threats to the Scalability of Experimental Results," American Economic Review, American Economic Association, vol. 107(5), pages 282-286, May.
    8. Omar Al-Ubaydli & John A. List & Danielle LoRe & Dana Suskind, 2017. "Scaling for Economists: Lessons from the Non-Adherence Problem in the Medical Literature," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 125-144, Fall.
    9. Andor, Mark A. & Gerster, Andreas & Peters, Jörg & Schmidt, Christoph M., 2020. "Social Norms and Energy Conservation Beyond the US," Journal of Environmental Economics and Management, Elsevier, vol. 103(C).
    10. Jonathan M.V. Davis & Jonathan Guryan & Kelly Hallberg & Jens Ludwig, 2017. "The Economics of Scale-Up," NBER Working Papers 23925, National Bureau of Economic Research, Inc.
    11. Omar Al-Ubaydli & John List, 2016. "Field Experiments in Markets," Artefactual Field Experiments j0002, The Field Experiments Website.
    12. Omar Al-Ubaydli & John List, 2013. "On the Generalizability of Experimental Results in Economics: With A Response To Camerer," Artefactual Field Experiments j0001, The Field Experiments Website.
    13. Matteo M. Galizzi & Daniel Navarro Martinez, 2015. "On the external validity of social-preference games: A systematic lab-field study," Economics Working Papers 1462, Department of Economics and Business, Universitat Pompeu Fabra.
    14. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," World Bank Research Observer, World Bank Group, vol. 33(1), pages 34-64.
    15. Mariella Gonzales & Gianmarco León-Ciliotta & Luis R. Martinez, 2018. "How effective are monetary incentives to vote? Evidence from a nationwide policy," Economics Working Papers 1667, Department of Economics and Business, Universitat Pompeu Fabra, revised Jul 2019.
    16. Jeremy Bowles & Horacio Larreguy, 2019. "Who Debates, Who Wins? At-Scale Experimental Evidence on Debate Participation in a Liberian Election," CID Working Papers 375, Center for International Development at Harvard University.
    17. John List, 2021. "The Voltage Effect in Behavioral Economics," Artefactual Field Experiments 00733, The Field Experiments Website.
    18. Omar Al-Ubaydli & John A. List, 2013. "On the Generalizability of Experimental Results in Economics: With a Response to Commentors," CESifo Working Paper Series 4543, CESifo.
    19. Ghazala Azmat & Manuel Bagues & Antonio Cabrales & Nagore Iriberri, 2018. "What you don't know...Can't hurt you?: A natural field experiment on relative performance feedback in higher education," Sciences Po publications info:hdl:2441/5fhe3c1k6b8, Sciences Po.
    20. Laura Abramovsky & Britta Augsburg & Melanie Lührmann & Francisco Oteiza & Juan Pablo Rud, 2018. "Community matters: heterogenous impacts of a sanitation intervention," IFS Working Papers W18/28, Institute for Fiscal Studies.

    More about this item

    JEL classification:

    • C9 - Mathematical and Quantitative Methods - - Design of Experiments
    • C90 - Mathematical and Quantitative Methods - - Design of Experiments - - - General
    • C91 - Mathematical and Quantitative Methods - - Design of Experiments - - - Laboratory, Individual Behavior
    • C92 - Mathematical and Quantitative Methods - - Design of Experiments - - - Laboratory, Group Behavior
    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments
    • D03 - Microeconomics - - General - - - Behavioral Microeconomics: Underlying Principles



    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nbr:nberwo:25848. See general information about how to correct material in RePEc.


    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: (email available below). General contact details of provider: https://edirc.repec.org/data/nberrus.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis. RePEc uses bibliographic data supplied by the respective publishers.