
RCTs to Scale: Comprehensive Evidence From Two Nudge Units

Author

Listed:
  • Stefano DellaVigna
  • Elizabeth Linos

Abstract

Nudge interventions have quickly expanded from academic studies to larger implementation in so‐called Nudge Units in governments. This provides an opportunity to compare interventions in research studies, versus at scale. We assemble a unique data set of 126 RCTs covering 23 million individuals, including all trials run by two of the largest Nudge Units in the United States. We compare these trials to a sample of nudge trials in academic journals from two recent meta‐analyses. In the Academic Journals papers, the average impact of a nudge is very large—an 8.7 percentage point take‐up effect, which is a 33.4% increase over the average control. In the Nudge Units sample, the average impact is still sizable and highly statistically significant, but smaller at 1.4 percentage points, an 8.0% increase. We document three dimensions which can account for the difference between these two estimates: (i) statistical power of the trials; (ii) characteristics of the interventions, such as topic area and behavioral channel; and (iii) selective publication. A meta‐analysis model incorporating these dimensions indicates that selective publication in the Academic Journals sample, exacerbated by low statistical power, explains about 70 percent of the difference in effect sizes between the two samples. Different nudge characteristics account for most of the residual difference.
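To make the selective-publication mechanism in the abstract concrete, the following is a minimal simulation sketch in Python. It is not the authors' meta-analysis model; the true effect, the standard errors, and the publish-only-if-significant rule are all illustrative assumptions. It shows that when only statistically significant estimates appear in journals, the average published effect exceeds the true effect, and the gap widens as statistical power falls.

    import numpy as np

    rng = np.random.default_rng(0)
    true_effect = 1.4    # assumed true effect, in percentage points (illustrative)
    n_trials = 100_000   # simulated trials per power regime

    # Each trial estimates the effect with sampling noise; only estimates
    # with z > 1.96 are "published". Compare a high-powered regime
    # (small standard error) with a low-powered one (large standard error).
    for se in (0.5, 2.0):
        estimates = rng.normal(true_effect, se, n_trials)
        published = estimates[estimates / se > 1.96]
        print(f"SE = {se}: mean published effect = {published.mean():.2f} pp, "
              f"share published = {len(published) / n_trials:.0%}")

Under these assumed numbers, the high-powered regime publishes about four trials in five, with a published mean close to the 1.4 pp truth; the low-powered regime publishes roughly one trial in ten, with a published mean near 5 pp. This is qualitatively consistent with the direction of the gap the paper documents between the Academic Journals and Nudge Units samples.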

Suggested Citation

  • Stefano DellaVigna & Elizabeth Linos, 2022. "RCTs to Scale: Comprehensive Evidence From Two Nudge Units," Econometrica, Econometric Society, vol. 90(1), pages 81-116, January.
  • Handle: RePEc:wly:emetrp:v:90:y:2022:i:1:p:81-116
    DOI: 10.3982/ECTA18709

    Download full text from publisher

    File URL: https://doi.org/10.3982/ECTA18709
    Download Restriction: no

    File URL: https://libkey.io/10.3982/ECTA18709?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item


    References listed on IDEAS

    1. Rajeev Dehejia & Cristian Pop-Eleches & Cyrus Samii, 2021. "From Local to Global: External Validity in a Fertility Natural Experiment," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 39(1), pages 217-243, January.
    2. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    3. Hunt Allcott, 2015. "Site Selection Bias in Program Evaluation," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 130(3), pages 1117-1165.
    4. Eva Vivalt, 2020. "How Much Can We Generalize From Impact Evaluations?," Journal of the European Economic Association, European Economic Association, vol. 18(6), pages 3045-3089.
    5. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    6. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    7. Hummel, Dennis & Maedche, Alexander, 2019. "How effective is nudging? A quantitative review on the effect sizes and limits of empirical nudging studies," Journal of Behavioral and Experimental Economics, Elsevier, vol. 80(C), pages 47-58.
    8. Karthik Muralidharan & Paul Niehaus, 2017. "Experimentation at Scale," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 103-124, Fall.
    9. David Card & Jochen Kluve & Andrea Weber, 2018. "What Works? A Meta Analysis of Recent Active Labor Market Program Evaluations," Journal of the European Economic Association, European Economic Association, vol. 16(3), pages 894-931.
    10. Hallsworth, Michael & List, John A. & Metcalfe, Robert D. & Vlaev, Ivo, 2017. "The behavioralist as tax collector: Using natural field experiments to enhance tax compliance," Journal of Public Economics, Elsevier, vol. 148(C), pages 14-31.
    11. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
    12. Erin Todd Bronchetti & Thomas S. Dee & David B. Huffman & Ellen Magenheim, 2013. "When a Nudge Isn’t Enough: Defaults and Saving Among Low-Income Tax Filers," National Tax Journal, National Tax Association; National Tax Journal, vol. 66(3), pages 609-634, September.
    13. Abhijit V. Banerjee & Esther Duflo, 2009. "The Experimental Approach to Development Economics," Annual Review of Economics, Annual Reviews, vol. 1(1), pages 151-178, May.
    14. Rachael Meager, 2019. "Understanding the Average Impact of Microcredit Expansions: A Bayesian Hierarchical Analysis of Seven Randomized Experiments," American Economic Journal: Applied Economics, American Economic Association, vol. 11(1), pages 57-91, January.
    15. Saurabh Bhargava & Dayanand Manoli, 2015. "Psychological Frictions and the Incomplete Take-Up of Social Benefits: Evidence from an IRS Field Experiment," American Economic Review, American Economic Association, vol. 105(11), pages 3489-3529, November.
    16. Stefano DellaVigna & Devin Pope, 2018. "What Motivates Effort? Evidence and Expert Forecasts," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 85(2), pages 1029-1069.
    17. Bold, Tessa & Kimenyi, Mwangi & Mwabu, Germano & Ng’ang’a, Alice & Sandefur, Justin, 2018. "Experimental evidence on scaling up education reforms in Kenya," Journal of Public Economics, Elsevier, vol. 168(C), pages 1-20.
    18. Jachimowicz, Jon M. & Duncan, Shannon & Weber, Elke U. & Johnson, Eric J., 2019. "When and why defaults influence decisions: a meta-analysis of default effects," Behavioural Public Policy, Cambridge University Press, vol. 3(2), pages 159-186, November.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Andor, Mark A. & Gerster, Andreas & Peters, Jörg, 2022. "Information campaigns for residential energy conservation," European Economic Review, Elsevier, vol. 144(C).
    2. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    3. Takuya Ishihara & Toru Kitagawa, 2021. "Evidence Aggregation for Treatment Choice," Papers 2108.06473, arXiv.org.
    4. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    5. Denis Fougère & Nicolas Jacquemet, 2020. "Policy Evaluation Using Causal Inference Methods," SciencePo Working papers Main hal-03455978, HAL.
    6. Kaiser, Tim & Lusardi, Annamaria & Menkhoff, Lukas & Urban, Carly, 2022. "Financial education affects financial knowledge and downstream behaviors," Journal of Financial Economics, Elsevier, vol. 145(2), pages 255-272.
    7. Dominika Ehrenbergerova & Josef Bajzik & Tomas Havranek, 2023. "When Does Monetary Policy Sway House Prices? A Meta-Analysis," IMF Economic Review, Palgrave Macmillan; International Monetary Fund, vol. 71(2), pages 538-573, June.
    8. Ankel-Peters, Jörg & Fiala, Nathan & Neubauer, Florian, 2023. "Do economists replicate?," Journal of Economic Behavior & Organization, Elsevier, vol. 212(C), pages 219-232.
    9. Ankel-Peters, Jörg & Schmidt, Christoph M., 2023. "Rural electrification, the credibility revolution, and the limits of evidence-based policy," Ruhr Economic Papers 1051, RWI - Leibniz-Institut für Wirtschaftsforschung, Ruhr-University Bochum, TU Dortmund University, University of Duisburg-Essen.
    10. Andrews, Isaiah & Oster, Emily, 2019. "A simple approximation for evaluating external validity bias," Economics Letters, Elsevier, vol. 178(C), pages 58-62.
    11. Holla, Alaka & Bendini, Maria Magdalena & Dinarte Diaz, Lelys Ileana & Trako, Iva, 2021. "Is Investment in Preprimary Education Too Low? Lessons from (Quasi) Experimental Evidence across Countries," Policy Research Working Paper Series 9723, The World Bank.
    12. Stanley, T. D. & Doucouliagos, Chris, 2019. "Practical Significance, Meta-Analysis and the Credibility of Economics," IZA Discussion Papers 12458, Institute of Labor Economics (IZA).
    13. Andrew Dustan & Stanislao Maldonado & Juan Manuel Hernandez-Agramonte, 2018. "Motivating bureaucrats with non-monetary incentives when state capacity is weak: Evidence from large-scale field experiments in Peru," Working Papers 136, Peruvian Economic Association.
    14. Cristina Corduneanu-Huci & Michael T. Dorsch & Paul Maarek, 2017. "Learning to constrain: Political competition and randomized controlled trials in development," THEMA Working Papers 2017-24, THEMA (THéorie Economique, Modélisation et Applications), Université de Cergy-Pontoise.
    15. Mariella Gonzales & Gianmarco León-Ciliotta & Luis R. Martínez, 2022. "How Effective Are Monetary Incentives to Vote? Evidence from a Nationwide Policy," American Economic Journal: Applied Economics, American Economic Association, vol. 14(1), pages 293-326, January.
    16. Graham Elliott & Nikolay Kudrin & Kaspar Wüthrich, 2022. "Detecting p‐Hacking," Econometrica, Econometric Society, vol. 90(2), pages 887-906, March.
    17. Tomas Havranek & Zuzana Irsova & Lubica Laslopova & Olesia Zeynalova, 2020. "Skilled and Unskilled Labor Are Less Substitutable than Commonly Thought," Working Papers IES 2020/29, Charles University Prague, Faculty of Social Sciences, Institute of Economic Studies, revised Sep 2020.
    18. Jason T. Kerwin & Rebecca L. Thornton, 2021. "Making the Grade: The Sensitivity of Education Program Effectiveness to Input Choices and Outcome Measures," The Review of Economics and Statistics, MIT Press, vol. 103(2), pages 251-264, May.
    19. Faraz Usmani & Marc Jeuland & Subhrendu K. Pattanayak, 2018. "NGOs and the effectiveness of interventions," WIDER Working Paper Series wp-2018-59, World Institute for Development Economic Research (UNU-WIDER).
    20. Corduneanu-Huci, Cristina & Dorsch, Michael T. & Maarek, Paul, 2021. "The politics of experimentation: Political competition and randomized controlled trials," Journal of Comparative Economics, Elsevier, vol. 49(1), pages 1-21.

    More about this item

    JEL classification:

    • D9 - Microeconomics - - Micro-Based Behavioral Economics
    • H53 - Public Economics - - National Government Expenditures and Related Policies - - - Government Expenditures and Welfare Programs
    • I38 - Health, Education, and Welfare - - Welfare, Well-Being, and Poverty - - - Government Programs; Provision and Effects of Welfare Programs

    Lists

    This item is featured on the following reading lists, Wikipedia, or ReplicationWiki pages:
    1. Meta-Research in Economics

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:wly:emetrp:v:90:y:2022:i:1:p:81-116. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Wiley Content Delivery (email available below). General contact details of provider: https://edirc.repec.org/data/essssea.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.