
The Effectiveness of Digital Interventions on COVID-19 Attitudes and Beliefs

Author

Listed:
  • Susan Athey
  • Kristen Grabarz
  • Michael Luca
  • Nils Wernerfelt

Abstract

During the COVID-19 pandemic, a common strategy for public health organizations around the world has been to launch interventions via advertising campaigns on social media. Despite this ubiquity, little has been known about their average effectiveness. We conduct a large-scale program evaluation of campaigns from 174 public health organizations on Facebook and Instagram that collectively reached 2.1 billion individuals and cost around $40 million. We report the results of 819 randomized experiments that measured the impact of these campaigns across standardized, survey-based outcomes. We find that, on average, these campaigns are effective at influencing self-reported beliefs, shifting opinions by close to 1% relative to baseline at a cost per influenced person of about $3.41. There is further evidence that campaigns are especially effective at influencing users' knowledge of how to get vaccines. Our results represent, to the best of our knowledge, the largest set of online public health interventions analyzed to date.
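
A minimal sketch (in Python, not from the paper) of the cost-per-influenced-person arithmetic described above: spend divided by the estimated number of people whose survey responses shifted. The spend, reach, and response rates below are hypothetical illustrations, not figures from the study.

    def cost_per_influenced_person(spend, reach, control_rate, treatment_rate):
        """Spend divided by the estimated number of people whose responses shifted."""
        lift = treatment_rate - control_rate   # absolute shift in the survey outcome
        influenced = reach * lift              # estimated number of people influenced
        return spend / influenced

    # Hypothetical campaign: $50,000 spent, 1,000,000 people reached,
    # outcome rate moves from 60.0% to 60.6% (about a 1% shift relative to baseline).
    print(cost_per_influenced_person(50_000, 1_000_000, 0.600, 0.606))  # ~ $8.33 per person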

Suggested Citation

  • Susan Athey & Kristen Grabarz & Michael Luca & Nils Wernerfelt, 2022. "The Effectiveness of Digital Interventions on COVID-19 Attitudes and Beliefs," Papers 2206.10214, arXiv.org.
  • Handle: RePEc:arx:papers:2206.10214

    Download full text from publisher

    File URL: http://arxiv.org/pdf/2206.10214
    File Function: Latest version
    Download Restriction: no
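
    A minimal sketch of fetching the full text from the publisher URL listed above (there is no download restriction). This uses the third-party Python requests library; the output filename is an arbitrary choice for illustration.

        import requests

        url = "http://arxiv.org/pdf/2206.10214"
        response = requests.get(url, timeout=30)
        response.raise_for_status()            # fail loudly on HTTP errors
        with open("athey-et-al-2206.10214.pdf", "wb") as f:
            f.write(response.content)          # save the PDF to disk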

    References listed on IDEAS

    1. Shan Huang & Sinan Aral & Yu Jeffrey Hu & Erik Brynjolfsson, 2020. "Social Advertising Effectiveness Across Products: A Large-Scale Field Experiment," Marketing Science, INFORMS, vol. 39(6), pages 1142-1165, November.
    2. Stefano DellaVigna & Elizabeth Linos, 2022. "RCTs to Scale: Comprehensive Evidence From Two Nudge Units," Econometrica, Econometric Society, vol. 90(1), pages 81-116, January.
    3. Brett R. Gordon & Florian Zettelmeyer & Neha Bhargava & Dan Chapsky, 2019. "A Comparison of Approaches to Advertising Measurement: Evidence from Big Field Experiments at Facebook," Marketing Science, INFORMS, vol. 38(2), pages 193-225, March.
    4. Avi Goldfarb & Catherine E. Tucker, 2011. "Privacy Regulation and Online Advertising," Management Science, INFORMS, vol. 57(1), pages 57-71, January.
    5. Sarah Moshary & Bradley T. Shapiro & Jihong Song, 2021. "How and When to Use the Political Cycle to Identify Advertising Effects," Marketing Science, INFORMS, vol. 40(2), pages 283-304, March.
    6. Dean Eckles & Brett R. Gordon & Garrett A. Johnson, 2018. "Field studies of psychologically targeted ads face threats to internal validity," Proceedings of the National Academy of Sciences, Proceedings of the National Academy of Sciences, vol. 115(23), pages 5254-5255, June.
    7. Thomas Blake & Chris Nosko & Steven Tadelis, 2015. "Consumer Heterogeneity and Paid Search Effectiveness: A Large‐Scale Field Experiment," Econometrica, Econometric Society, vol. 83, pages 155-174, January.
    8. Marcella Alsan & Amitabh Chandra & Kosali Simon, 2021. "The Great Unequalizer: Initial Health Effects of COVID-19 in the United States," Journal of Economic Perspectives, American Economic Association, vol. 35(3), pages 25-46, Summer.
    9. Hummel, Dennis & Maedche, Alexander, 2019. "How effective is nudging? A quantitative review on the effect sizes and limits of empirical nudging studies," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 80(C), pages 47-58.
    10. Jay J. Van Bavel & Katherine Baicker & Paulo S. Boggio & Valerio Capraro & Aleksandra Cichocka & Mina Cikara & Molly J. Crockett & Alia J. Crum & Karen M. Douglas & James N. Druckman & John Drury, et al., 2020. "Using social and behavioural science to support COVID-19 pandemic response," Nature Human Behaviour, Nature, vol. 4(5), pages 460-471, May.
    11. Imbens, Guido W. & Rubin, Donald B., 2015. "Causal Inference for Statistics, Social, and Biomedical Sciences," Cambridge Books, Cambridge University Press, number 9780521885881.
    12. Rachael Meager, 2019. "Understanding the Average Impact of Microcredit Expansions: A Bayesian Hierarchical Analysis of Seven Randomized Experiments," American Economic Journal: Applied Economics, American Economic Association, vol. 11(1), pages 57-91, January.
    13. Hengchen Dai & Silvia Saccardo & Maria A. Han & Lily Roh & Naveen Raja & Sitaram Vangala & Hardikkumar Modi & Shital Pandya & Michael Sloyan & Daniel M. Croymans, 2021. "Behavioural nudges increase COVID-19 vaccinations," Nature, Nature, vol. 597(7876), pages 404-409, September.
    14. Meager, Rachael, 2019. "Understanding the average impact of microcredit expansions: a Bayesian hierarchical analysis of seven randomized experiments," LSE Research Online Documents on Economics 88190, London School of Economics and Political Science, LSE Library.
    15. Randall A. Lewis & Justin M. Rao, 2015. "The Unfavorable Economics of Measuring the Returns to Advertising," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 130(4), pages 1941-1973.

    Citations

    Citations are extracted by the CitEc Project.

    Cited by:

    1. Jose Maria Barrero & Nicholas Bloom & Steven J. Davis, 2023. "Long Social Distancing," Journal of Labor Economics, University of Chicago Press, vol. 41(S1), pages 129-172.
    2. Matilde Giaccherini & Joanna Kopinska & Gabriele Rovigatti, 2022. "Vax Populi: The Social Costs of Online Vaccine Skepticism," CESifo Working Paper Series 10184, CESifo.
    3. He, Daixin & Lu, Fangwen & Yang, Jianan, 2023. "Impact of self- or social-regarding health messages: Experimental evidence based on antibiotics purchases," Journal of Development Economics, Elsevier, vol. 163(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Susan Athey & Kristen Grabarz & Michael Luca & Nils Wernerfelt, 2023. "Digital public health interventions at scale: The impact of social media advertising on beliefs and outcomes related to COVID vaccines," Proceedings of the National Academy of Sciences, Proceedings of the National Academy of Sciences, vol. 120(5), article e2208110120, January.
    2. Garrett Johnson & Julian Runge & Eric Seufert, 2022. "Privacy-Centric Digital Advertising: Implications for Research," Customer Needs and Solutions, Springer;Institute for Sustainable Innovation and Growth (iSIG), vol. 9(1), pages 49-54, June.
    3. Weijia Dai & Hyunjin Kim & Michael Luca, 2023. "Frontiers: Which Firms Gain from Digital Advertising? Evidence from a Field Experiment," Marketing Science, INFORMS, vol. 42(3), pages 429-439, May.
    4. Stefano DellaVigna & Elizabeth Linos, 2022. "RCTs to Scale: Comprehensive Evidence From Two Nudge Units," Econometrica, Econometric Society, vol. 90(1), pages 81-116, January.
    5. Randall Lewis & Dan Nguyen, 2015. "Display advertising’s competitive spillovers to consumer search," Quantitative Marketing and Economics (QME), Springer, vol. 13(2), pages 93-115, June.
    6. Denis Fougère & Nicolas Jacquemet, 2020. "Policy Evaluation Using Causal Inference Methods," SciencePo Working papers Main hal-03455978, HAL.
    7. Diane Pelly & Orla Doyle, 2022. "Nudging in the workplace: increasing participation in employee EDI wellness events," Working Papers 202208, Geary Institute, University College Dublin.
    8. Brett R. Gordon & Robert Moakler & Florian Zettelmeyer, 2023. "Predictive Incrementality by Experimentation (PIE) for Ad Measurement," Papers 2304.06828, arXiv.org.
    9. Pietro Emilio Spini, 2021. "Robustness, Heterogeneous Treatment Effects and Covariate Shifts," Papers 2112.09259, arXiv.org.
    10. Neckermann, Susanne & Turmunkh, Uyanga & van Dolder, Dennie & Wang, Tong V., 2022. "Nudging student participation in online evaluations of teaching: Evidence from a field experiment," European Economic Review, Elsevier, vol. 141(C).
    11. Brett R Gordon & Kinshuk Jerath & Zsolt Katona & Sridhar Narayanan & Jiwoong Shin & Kenneth C Wilbur, 2019. "Inefficiencies in Digital Advertising Markets," Papers 1912.09012, arXiv.org, revised Feb 2020.
    12. Ron Berman & Christophe Van den Bulte, 2022. "False Discovery in A/B Testing," Management Science, INFORMS, vol. 68(9), pages 6762-6782, September.
    13. Bo, Hao & Galiani, Sebastian, 2021. "Assessing external validity," Research in Economics, Elsevier, vol. 75(3), pages 274-285.
    14. Berman, Ron & Heller, Yuval, 2020. "Naive Analytics Equilibrium," MPRA Paper 103824, University Library of Munich, Germany.
    15. Paul Hunermund & Elias Bareinboim, 2019. "Causal Inference and Data Fusion in Econometrics," Papers 1912.09104, arXiv.org, revised Mar 2023.
    16. Andrews, Isaiah & Oster, Emily, 2019. "A simple approximation for evaluating external validity bias," Economics Letters, Elsevier, vol. 178(C), pages 58-62.
    17. George Z. Gui, 2020. "Combining Observational and Experimental Data to Improve Efficiency Using Imperfect Instruments," Papers 2010.05117, arXiv.org, revised Dec 2023.
    18. Hoffmann, Vivian & Rao, Vijayendra & Surendra, Vaishnavi & Datta, Upamanyu, 2021. "Relief from usury: Impact of a self-help group lending program in rural India," Journal of Development Economics, Elsevier, vol. 148(C).
    19. Evan Borkum & Paolo Abarcar & Laura Meyer & Matthew Spitzer, "undated". "Jordan Refugee Livelihoods Development Impact Bond Evaluation Framework," Mathematica Policy Research Reports 602dafe521fe4467854dcd45e, Mathematica Policy Research.
    20. Christina Uhl & Nadia Abou Nabout & Klaus Miller, 2020. "How Much Ad Viewability is Enough? The Effect of Display Ad Viewability on Advertising Effectiveness," Papers 2008.12132, arXiv.org.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:arx:papers:2206.10214. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: arXiv administrators (email available below). General contact details of provider: http://arxiv.org/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.