RCTs to Scale: Comprehensive Evidence from Two Nudge Units

Authors

  • Stefano DellaVigna
  • Elizabeth Linos

Abstract

Nudge interventions have quickly expanded from academic studies to larger implementation in so-called Nudge Units in governments. This provides an opportunity to compare interventions in research studies, versus at scale. We assemble a unique data set of 126 RCTs covering over 23 million individuals, including all trials run by two of the largest Nudge Units in the United States. We compare these trials to a sample of nudge trials published in academic journals from two recent meta-analyses. In papers published in academic journals, the average impact of a nudge is very large – an 8.7 percentage point take-up effect, a 33.5% increase over the average control. In the Nudge Unit trials, the average impact is still sizable and highly statistically significant, but smaller at 1.4 percentage points, an 8.1% increase. We consider five potential channels for this gap: statistical power, selective publication, academic involvement, differences in trial features and in nudge features. Publication bias in the academic journals, exacerbated by low statistical power, can account for the full difference in effect sizes. Academic involvement does not account for the difference. Different features of the nudges, such as in-person versus letter-based communication, likely reflecting institutional constraints, can partially explain the different effect sizes. We conjecture that larger sample sizes and institutional constraints, which play an important role in our setting, are relevant in other at-scale implementations. Finally, we compare these results to the predictions of academics and practitioners. Most forecasters overestimate the impact for the Nudge Unit interventions, though nudge practitioners are almost perfectly calibrated.
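
As a back-of-the-envelope reading of the abstract, the headline numbers imply control-group take-up of roughly 8.7/0.335 ≈ 26 percentage points in the academic trials and 1.4/0.081 ≈ 17 percentage points in the Nudge Unit trials, assuming each percent increase is measured relative to the average control. The short simulation below is only an illustrative sketch, with made-up inputs, of the selective-publication channel the abstract highlights: when mainly statistically significant estimates from low-powered trials get published, the average published effect can sit far above the true effect, while large, well-powered trials report estimates close to the truth. It is not the paper's method; for formal treatments of publication bias, see, for example, Andrews and Kasy (2019) in the reference list below.

    # Illustrative sketch only: publication bias plus low statistical power
    # inflating the average *published* effect. All inputs are hypothetical.
    import numpy as np

    rng = np.random.default_rng(0)
    true_effect = 1.4    # assumed true lift, in percentage points
    n_trials = 10_000    # simulated trials per scenario

    def mean_published_effect(se):
        """Average estimate among trials clearing |t| > 1.96 (the 'published' set)."""
        estimates = rng.normal(true_effect, se, n_trials)
        published = np.abs(estimates / se) > 1.96
        return estimates[published].mean(), published.mean()

    # se = 4.0 mimics a small, low-powered academic trial;
    # se = 0.5 mimics a large, well-powered Nudge Unit trial.
    for label, se in [("low power", 4.0), ("high power", 0.5)]:
        mean_pub, share_sig = mean_published_effect(se)
        print(f"{label}: {share_sig:.0%} significant, mean published effect "
              f"{mean_pub:.1f} pp vs. true {true_effect} pp")

Under these hypothetical inputs the low-power scenario publishes only a small share of trials, and their average estimate is several times the true 1.4 pp effect, whereas the high-power scenario publishes most trials and their average stays close to the truth.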

Suggested Citation

  • Stefano DellaVigna & Elizabeth Linos, 2020. "RCTs to Scale: Comprehensive Evidence from Two Nudge Units," NBER Working Papers 27594, National Bureau of Economic Research, Inc.
  • Handle: RePEc:nbr:nberwo:27594
    Note: DEV ED EH LE LS PE POL

    Download full text from publisher

    File URL: http://www.nber.org/papers/w27594.pdf
    Download Restriction: no

    References listed on IDEAS

    1. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    2. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    3. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    4. Hummel, Dennis & Maedche, Alexander, 2019. "How effective is nudging? A quantitative review on the effect sizes and limits of empirical nudging studies," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 80(C), pages 47-58.
    5. David Card & Jochen Kluve & Andrea Weber, 2018. "What Works? A Meta Analysis of Recent Active Labor Market Program Evaluations," Journal of the European Economic Association, European Economic Association, vol. 16(3), pages 894-931.
    6. Rajeev Dehejia & Cristian Pop-Eleches & Cyrus Samii, 2021. "From Local to Global: External Validity in a Fertility Natural Experiment," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 39(1), pages 217-243, January.
    7. Erin Todd Bronchetti & Thomas S. Dee & David B. Huffman & Ellen Magenheim, 2013. "When a Nudge Isn’t Enough: Defaults and Saving Among Low-Income Tax Filers," National Tax Journal, National Tax Association;National Tax Journal, vol. 66(3), pages 609-634, September.
    8. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
    9. Rachael Meager, 2019. "Understanding the Average Impact of Microcredit Expansions: A Bayesian Hierarchical Analysis of Seven Randomized Experiments," American Economic Journal: Applied Economics, American Economic Association, vol. 11(1), pages 57-91, January.
    10. Saurabh Bhargava & Dayanand Manoli, 2015. "Psychological Frictions and the Incomplete Take-Up of Social Benefits: Evidence from an IRS Field Experiment," American Economic Review, American Economic Association, vol. 105(11), pages 3489-3529, November.
    11. Bold, Tessa & Kimenyi, Mwangi & Mwabu, Germano & Ng’ang’a, Alice & Sandefur, Justin, 2018. "Experimental evidence on scaling up education reforms in Kenya," Journal of Public Economics, Elsevier, vol. 168(C), pages 1-20.
    12. Abhijit V. Banerjee & Esther Duflo, 2009. "The Experimental Approach to Development Economics," Annual Review of Economics, Annual Reviews, vol. 1(1), pages 151-178, May.
    13. Stefano DellaVigna & Devin Pope, 2018. "What Motivates Effort? Evidence and Expert Forecasts," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 85(2), pages 1029-1069.
    14. Jachimowicz, Jon M. & Duncan, Shannon & Weber, Elke U. & Johnson, Eric J., 2019. "When and why defaults influence decisions: a meta-analysis of default effects," Behavioural Public Policy, Cambridge University Press, vol. 3(2), pages 159-186, November.
    15. Hunt Allcott, 2015. "Site Selection Bias in Program Evaluation," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 130(3), pages 1117-1165.
    16. Karthik Muralidharan & Paul Niehaus, 2017. "Experimentation at Scale," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 103-124, Fall.
    17. Hallsworth, Michael & List, John A. & Metcalfe, Robert D. & Vlaev, Ivo, 2017. "The behavioralist as tax collector: Using natural field experiments to enhance tax compliance," Journal of Public Economics, Elsevier, vol. 148(C), pages 14-31.
    18. Meager, Rachael, 2019. "Understanding the average impact of microcredit expansions: a Bayesian hierarchical analysis of seven randomized experiments," LSE Research Online Documents on Economics 88190, London School of Economics and Political Science, LSE Library.
    19. Eva Vivalt, 2020. "How Much Can We Generalize From Impact Evaluations?," Journal of the European Economic Association, European Economic Association, vol. 18(6), pages 3045-3089.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Andor, Mark A. & Gerster, Andreas & Peters, Jörg, 2022. "Information campaigns for residential energy conservation," European Economic Review, Elsevier, vol. 144(C).
    2. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    3. Takuya Ishihara & Toru Kitagawa, 2021. "Evidence Aggregation for Treatment Choice," Papers 2108.06473, arXiv.org.
    4. Denis Fougère & Nicolas Jacquemet, 2020. "Policy Evaluation Using Causal Inference Methods," SciencePo Working papers Main hal-03455978, HAL.
    5. Kaiser, Tim & Lusardi, Annamaria & Menkhoff, Lukas & Urban, Carly, 2022. "Financial education affects financial knowledge and downstream behaviors," Journal of Financial Economics, Elsevier, vol. 145(2), pages 255-272.
    6. Dominika Ehrenbergerova & Josef Bajzik & Tomas Havranek, 2023. "When Does Monetary Policy Sway House Prices? A Meta-Analysis," IMF Economic Review, Palgrave Macmillan;International Monetary Fund, vol. 71(2), pages 538-573, June.
    7. Ankel-Peters, Jörg & Fiala, Nathan & Neubauer, Florian, 2023. "Do economists replicate?," Journal of Economic Behavior & Organization, Elsevier, vol. 212(C), pages 219-232.
    8. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    9. Ankel-Peters, Jörg & Schmidt, Christoph M., 2023. "Rural electrification, the credibility revolution, and the limits of evidence-based policy," Ruhr Economic Papers 1051, RWI - Leibniz-Institut für Wirtschaftsforschung, Ruhr-University Bochum, TU Dortmund University, University of Duisburg-Essen.
    10. Andrews, Isaiah & Oster, Emily, 2019. "A simple approximation for evaluating external validity bias," Economics Letters, Elsevier, vol. 178(C), pages 58-62.
    11. Holla, Alaka & Bendini, Maria Magdalena & Dinarte Diaz, Lelys Ileana & Trako, Iva, 2021. "Is Investment in Preprimary Education Too Low? Lessons from (Quasi) Experimental Evidence across Countries," Policy Research Working Paper Series 9723, The World Bank.
    12. Andrew Dustan & Stanislao Maldonado & Juan Manuel Hernandez-Agramonte, 2018. "Motivating bureaucrats with non-monetary incentives when state capacity is weak: Evidence from large-scale field experiments in Peru," Working Papers 136, Peruvian Economic Association.
    13. Graham Elliott & Nikolay Kudrin & Kaspar Wüthrich, 2022. "Detecting p‐Hacking," Econometrica, Econometric Society, vol. 90(2), pages 887-906, March.
    14. Tomas Havranek & Zuzana Irsova & Lubica Laslopova & Olesia Zeynalova, 2020. "Skilled and Unskilled Labor Are Less Substitutable than Commonly Thought," Working Papers IES 2020/29, Charles University Prague, Faculty of Social Sciences, Institute of Economic Studies, revised Sep 2020.
    15. Faraz Usmani & Marc Jeuland & Subhrendu K. Pattanayak, 2018. "NGOs and the effectiveness of interventions," WIDER Working Paper Series wp-2018-59, World Institute for Development Economic Research (UNU-WIDER).
    16. Graham Elliott & Nikolay Kudrin & Kaspar Wuthrich, 2022. "The Power of Tests for Detecting p-Hacking," Papers 2205.07950, arXiv.org, revised Apr 2024.
    17. Mariella Gonzales & Gianmarco León-Ciliotta & Luis R. Martínez, 2022. "How Effective Are Monetary Incentives to Vote? Evidence from a Nationwide Policy," American Economic Journal: Applied Economics, American Economic Association, vol. 14(1), pages 293-326, January.
    18. Jules Gazeaud & Claire Ricard, 2021. "Conditional cash transfers and the learning crisis: evidence from Tayssir scale-up in Morocco," NOVAFRICA Working Paper Series wp2102, Universidade Nova de Lisboa, Nova School of Business and Economics, NOVAFRICA.
    19. Brodeur, Abel & Cook, Nikolai & Hartley, Jonathan & Heyes, Anthony, 2022. "Do Pre-Registration and Pre-analysis Plans Reduce p-Hacking and Publication Bias?," MetaArXiv uxf39, Center for Open Science.
    20. Stefano DellaVigna & Devin Pope, 2022. "Stability of Experimental Results: Forecasts and Evidence," American Economic Journal: Microeconomics, American Economic Association, vol. 14(3), pages 889-925, August.

    More about this item

    JEL classification:

    • D9 - Microeconomics - - Micro-Based Behavioral Economics
    • H53 - Public Economics - - National Government Expenditures and Related Policies - - - Government Expenditures and Welfare Programs
    • I38 - Health, Education, and Welfare - - Welfare, Well-Being, and Poverty - - - Government Programs; Provision and Effects of Welfare Programs
