Printed from https://ideas.repec.org/p/nbr/nberwo/25848.html

The Science of Using Science: Towards an Understanding of the Threats to Scaling Experiments

Author

Listed:
  • Omar Al-Ubaydli
  • John A. List
  • Dana Suskind

Abstract

Policymakers are increasingly turning to insights gained from the experimental method to inform public policies. Whether, and to what extent, insights from a research study scale to the broader public is, in many situations, a matter of blind faith. This scale-up problem can lead to a vast waste of resources, missed opportunities to improve people’s lives, and a diminution of the public’s trust in the scientific method’s ability to contribute to policymaking. This study provides a theoretical lens to deepen our understanding of the science of how to use science. Through a simple model, we highlight three elements of the scale-up problem: (1) when evidence becomes actionable (appropriate statistical inference); (2) properties of the population; and (3) properties of the situation. We argue that until these three areas are fully understood and recognized by researchers and policymakers, the threats to scalability will leave any scaling exercise particularly vulnerable. In this way, our work challenges empiricists to estimate the nature and extent of the various threats to scalability in practice, and to account for them in their original research.

Suggested Citation

  • Omar Al-Ubaydli & John A. List & Dana Suskind, 2019. "The Science of Using Science: Towards an Understanding of the Threats to Scaling Experiments," NBER Working Papers 25848, National Bureau of Economic Research, Inc.
  • Handle: RePEc:nbr:nberwo:25848
    Note: DEV ED PE TWP

    Download full text from publisher

    File URL: http://www.nber.org/papers/w25848.pdf
    Download Restriction: no


    References listed on IDEAS

    1. Abhijit Banerjee & Rukmini Banerji & James Berry & Esther Duflo & Harini Kannan & Shobhini Mukerji & Marc Shotland & Michael Walton, 2017. "From Proof of Concept to Scalable Policies: Challenges and Solutions, with an Application," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 73-102, Fall.
    2. Omar Al-Ubaydli & John A. List, 2015. "Do Natural Field Experiments Afford Researchers More or Less Control than Laboratory Experiments? A Simple Model," NBER Working Papers 20877, National Bureau of Economic Research, Inc.
    3. Roland Fryer & Steven Levitt & John List & Sally Sadoff, 2012. "Enhancing the Efficacy of Teacher Incentives through Loss Aversion: A Field Experiment," Framed Field Experiments 00591, The Field Experiments Website.
    4. Andreas Lange & John A. List & Michael K. Price, 2007. "Using Lotteries To Finance Public Goods: Theory And Experimental Evidence," International Economic Review, Department of Economics, University of Pennsylvania and Osaka University Institute of Social and Economic Research Association, vol. 48(3), pages 901-927, August.
    5. Diane Paulsell & Toni Porter & Gretchen Kirby & Kimberly Boller & Emily Sama Martin & Andrew Burwick & Christine Ross & Carol Begnoche, "undated". "Supporting Quality in Home-Based Child Care Initiative: Design and Evaluation Options," Mathematica Policy Research Reports 3887af819cdc4b2e9f0e830c0, Mathematica Policy Research.
    6. Fuhai Hong & Tanjim Hossain & John A. List & Migiwa Tanaka, 2018. "Testing The Theory Of Multitasking: Evidence From A Natural Field Experiment In Chinese Factories," International Economic Review, Department of Economics, University of Pennsylvania and Osaka University Institute of Social and Economic Research Association, vol. 59(2), pages 511-536, May.
    7. Levitt, Steven D. & List, John A., 2009. "Field experiments in economics: The past, the present, and the future," European Economic Review, Elsevier, vol. 53(1), pages 1-18, January.
    8. Omar Al-Ubaydli & John A. List & Dana L. Suskind, 2017. "What Can We Learn from Experiments? Understanding the Threats to the Scalability of Experimental Results," American Economic Review, American Economic Association, vol. 107(5), pages 282-286, May.
    9. Diane Paulsell & Toni Porter & Gretchen Kirby, 2010. "Supporting Quality in Home-Based Child Care," Mathematica Policy Research Reports eb5ff211bc8f467b9dcaad75b, Mathematica Policy Research.
    10. James Heckman & Hidehiko Ichimura & Jeffrey Smith & Petra Todd, 1998. "Characterizing Selection Bias Using Experimental Data," Econometrica, Econometric Society, vol. 66(5), pages 1017-1098, September.
    11. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    12. repec:mpr:mprres:5039 is not listed on IDEAS
    13. Neal S Young, 2008. "Why Current Publication May Distort Science," Working Papers id:1757, eSocialSciences.
    14. Michael J. Weiss & Howard S. Bloom & Thomas Brock, 2014. "A Conceptual Framework For Studying The Sources Of Variation In Program Effects," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 33(3), pages 778-808, June.
    15. Neal S Young & John P A Ioannidis & Omar Al-Ubaydli, 2008. "Why Current Publication Practices May Distort Science," PLOS Medicine, Public Library of Science, vol. 5(10), pages 1-5, October.
    16. Zacharias Maniadis & Fabio Tufano & John A. List, 2014. "One Swallow Doesn't Make a Summer: New Evidence on Anchoring Effects," American Economic Review, American Economic Association, vol. 104(1), pages 277-290, January.
    17. repec:mpr:mprres:6762 is not listed on IDEAS
    18. John A. List, 2011. "The Market for Charitable Giving," Journal of Economic Perspectives, American Economic Association, vol. 25(2), pages 157-180, Spring.
    19. repec:mpr:mprres:6759 is not listed on IDEAS
    20. Karthik Muralidharan & Paul Niehaus, 2017. "Experimentation at Scale," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 103-124, Fall.
    21. Christopher Jepsen & Steven Rivkin, 2009. "Class Size Reduction and Student Achievement: The Potential Tradeoff between Teacher Quality and Class Size," Journal of Human Resources, University of Wisconsin Press, vol. 44(1).
    22. repec:feb:artefa:0087 is not listed on IDEAS
    23. Omar Al-Ubaydli & John A. List & Danielle LoRe & Dana Suskind, 2017. "Scaling for Economists: Lessons from the Non-Adherence Problem in the Medical Literature," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 125-144, Fall.
    24. Omar Al-Ubaydli & John A. List, 2015. "Do Natural Field Experiments Afford Researchers More or Less Control Than Laboratory Experiments?," American Economic Review, American Economic Association, vol. 105(5), pages 462-466, May.
    25. Jonathan M.V. Davis & Jonathan Guryan & Kelly Hallberg & Jens Ludwig, 2017. "The Economics of Scale-Up," NBER Working Papers 23925, National Bureau of Economic Research, Inc.
    Full references (including those not matched with items on IDEAS)

    Citations

    Blog mentions

    As found by EconAcademics.org, the blog aggregator for Economics research:
    1. Policymaking Is Not a Science (Yet) (Ep. 405 Rebroadcast)
      by Stephen J. Dubner in Freakonomics on 2021-03-25 03:00:27
    2. Policymaking Is Not a Science (Yet) (Ep. 405)
      by Stephen J. Dubner in Freakonomics on 2020-02-13 04:00:34

    Citations are extracted by the CitEc Project.

    Cited by:

    1. Agostinelli, Francesco & Avitabile, Ciro & Bobba, Matteo, 2021. "Enhancing Human Capital in Children: A Case Study on Scaling," TSE Working Papers 21-1196, Toulouse School of Economics (TSE), revised Oct 2023.
    2. Andreas Löschel & Matthias Rodemeier & Madeline Werthschulte, 2020. "When Nudges Fail to Scale: Field Experimental Evidence from Goal Setting on Mobile Phones," CESifo Working Paper Series 8485, CESifo.
    3. Farfan Betran, Maria Gabriela & Genoni, Maria Eugenia & Rubalcava, Luis & Teruel, Graciela M. & Thomas, Duncan, 2022. "Scaling Up Oportunidades and Its Impact on Child Nutrition," Policy Research Working Paper Series 10088, The World Bank.
    4. Gerhard Riener & Sebastian Schneider & Valentin Wagner, 2020. "Addressing Validity and Generalizability Concerns in Field Experiments," Discussion Paper Series of the Max Planck Institute for Research on Collective Goods 2020_16, Max Planck Institute for Research on Collective Goods.
    5. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    6. Agostinelli, Francesco & Avitabile, Ciro & Bobba, Matteo, 2021. "Enhancing Human Capital at Scale," IZA Discussion Papers 14192, Institute of Labor Economics (IZA).
    7. Matilde Giaccherini & David Herberich & David Jimenez-Gomez & John List & Giovanni Ponti & Michael Price, 2020. "Are Economics and Psychology Complements in Household Technology Diffusion? Evidence from a Natural Field Experiment," Natural Field Experiments 00713, The Field Experiments Website.
    8. John A. List, 2020. "Non est Disputandum de Generalizability? A Glimpse into The External Validity Trial," NBER Working Papers 27535, National Bureau of Economic Research, Inc.
    9. Altmann, Steffen & Glenny, Anita Marie & Mahlstedt, Robert & Sebald, Alexander, 2022. "The Direct and Indirect Effects of Online Job Search Advice," IZA Discussion Papers 15830, Institute of Labor Economics (IZA).
    10. Albert Bravo-Biosca, 2020. "Experimental Innovation Policy," Innovation Policy and the Economy, University of Chicago Press, vol. 20(1), pages 191-232.
    11. Eliot Abrams & Jonathan Libgober & John A. List, 2020. "Research Registries: Facts, Myths, and Possible Improvements," NBER Working Papers 27250, National Bureau of Economic Research, Inc.
    12. Difang Huang & Zhengyang Bao, 2020. "Gender Differences in Reaction to Enforcement Mechanisms: A Large-Scale Natural Field Experiment," Monash Economics Working Papers 08-20, Monash University, Department of Economics.
    13. Matilde Giaccherini & David H. Herberich & David Jimenez-Gomez & John A. List & Giovanni Ponti & Michael K. Price, 2019. "The Behavioralist Goes Door-To-Door: Understanding Household Technological Diffusion Using a Theory-Driven Natural Field Experiment," NBER Working Papers 26173, National Bureau of Economic Research, Inc.
    14. Meier, Johanna & Andor, Mark A. & Doebbe, Friederike C. & Haddaway, Neal R. & Reisch, Lucia A., 2022. "Review: Do green defaults reduce meat consumption?," Food Policy, Elsevier, vol. 110(C).
    15. Simone Busetti, 2023. "Causality is good for practice: policy design and reverse engineering," Policy Sciences, Springer;Society of Policy Sciences, vol. 56(2), pages 419-438, June.
    16. Stutzer, Alois, 2020. "Happiness and public policy: a procedural perspective," Behavioural Public Policy, Cambridge University Press, vol. 4(2), pages 210-225, July.
    17. Luca A. Panzone & Natasha Auch & Daniel John Zizzo, 2024. "Nudging the Food Basket Green: The Effects of Commitment and Badges on the Carbon Footprint of Food Shopping," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 87(1), pages 89-133, January.
    18. Hutchinson-Quillian, Jessan & Reiley, David & Samek, Anya, 2021. "Hassle costs and workplace charitable giving: Field experiments with Google employees," Journal of Economic Behavior & Organization, Elsevier, vol. 191(C), pages 679-685.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    2. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    3. Omar Al‐Ubaydli & John A. List & Dana Suskind, 2020. "2017 Klein Lecture: The Science Of Using Science: Toward An Understanding Of The Threats To Scalability," International Economic Review, Department of Economics, University of Pennsylvania and Osaka University Institute of Social and Economic Research Association, vol. 61(4), pages 1387-1409, November.
    4. Omar Al-Ubaydli & John A. List & Dana L. Suskind, 2017. "What Can We Learn from Experiments? Understanding the Threats to the Scalability of Experimental Results," American Economic Review, American Economic Association, vol. 107(5), pages 282-286, May.
    5. John A. List, 2024. "Optimally generate policy-based evidence before scaling," Nature, Nature, vol. 626(7999), pages 491-499, February.
    6. Omar Al-Ubaydli & John A. List, 2019. "How natural field experiments have enhanced our understanding of unemployment," Nature Human Behaviour, Nature, vol. 3(1), pages 33-39, January.
    7. John List, 2021. "2021 Summary Data of Artefactual Field Experiments Published on Fieldexperiments.com," Artefactual Field Experiments 00749, The Field Experiments Website.
    8. John List, 2022. "2021 Summary Data of Natural Field Experiments Published on Fieldexperiments.com," Natural Field Experiments 00747, The Field Experiments Website.
    9. Justman, Moshe, 2018. "Randomized controlled trials informing public policy: Lessons from project STAR and class size reduction," European Journal of Political Economy, Elsevier, vol. 54(C), pages 167-174.
    10. Andor, Mark A. & Gerster, Andreas & Peters, Jörg & Schmidt, Christoph M., 2020. "Social Norms and Energy Conservation Beyond the US," Journal of Environmental Economics and Management, Elsevier, vol. 103(C).
    11. Omar Al-Ubaydli & Chien-Yu Lai & John A. List, 2023. "A Simple Rational Expectations Model of the Voltage Effect," NBER Working Papers 30850, National Bureau of Economic Research, Inc.
    12. John List, 2021. "The Voltage Effect in Behavioral Economics," Artefactual Field Experiments 00733, The Field Experiments Website.
    13. Robert Ammerman & Anne Duggan & John List & Lauren Supplee & Dana Suskind, 2021. "The role of open science practices in scaling evidence-based prevention programs," Natural Field Experiments 00741, The Field Experiments Website.
    14. John List, 2022. "Framed Field Experiments: 2021 Summary on Fieldexperiments.com," Framed Field Experiments 00752, The Field Experiments Website.
    15. Levin, Tova & Levitt, Steven D. & List, John A., 2023. "A Glimpse into the world of high capacity givers: Experimental evidence from a university capital campaign," Journal of Economic Behavior & Organization, Elsevier, vol. 212(C), pages 644-658.
    16. Andrew Dustan & Stanislao Maldonado & Juan Manuel Hernandez-Agramonte, 2018. "Motivating bureaucrats with non-monetary incentives when state capacity is weak: Evidence from large-scale field experiments in Peru," Working Papers 136, Peruvian Economic Association.
    17. Mariella Gonzales & Gianmarco León-Ciliotta & Luis R. Martínez, 2022. "How Effective Are Monetary Incentives to Vote? Evidence from a Nationwide Policy," American Economic Journal: Applied Economics, American Economic Association, vol. 14(1), pages 293-326, January.
    18. Dustan, Andrew & Hernandez-Agramonte, Juan Manuel & Maldonado, Stanislao, 2023. "Motivating bureaucrats with behavioral insights when state capacity is weak: Evidence from large-scale field experiments in Peru," Journal of Development Economics, Elsevier, vol. 160(C).
    19. Omar Al-Ubaydli & John A. List & Danielle LoRe & Dana Suskind, 2017. "Scaling for Economists: Lessons from the Non-Adherence Problem in the Medical Literature," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 125-144, Fall.
    20. Monica P. Bhatt & Jonathan Guryan & Jens Ludwig & Anuj K. Shah, 2021. "Scope Challenges to Social Impact," NBER Working Papers 28406, National Bureau of Economic Research, Inc.

    More about this item

    JEL classification:

    • C9 - Mathematical and Quantitative Methods - - Design of Experiments
    • C90 - Mathematical and Quantitative Methods - - Design of Experiments - - - General
    • C91 - Mathematical and Quantitative Methods - - Design of Experiments - - - Laboratory, Individual Behavior
    • C92 - Mathematical and Quantitative Methods - - Design of Experiments - - - Laboratory, Group Behavior
    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments
    • D03 - Microeconomics - - General - - - Behavioral Microeconomics: Underlying Principles


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.