
Close Enough? A Large-Scale Exploration of Non-Experimental Approaches to Advertising Measurement

Authors

  • Brett R. Gordon

    (Kellogg School of Management, Northwestern University, Evanston, Illinois 60208)

  • Robert Moakler

    (Ads Research, Meta, Menlo Park, California 94025)

  • Florian Zettelmeyer

    (Kellogg School of Management, Northwestern University, Evanston, Illinois 60208; National Bureau of Economic Research, Cambridge, Massachusetts 02138)

Abstract

Despite their popularity, randomized controlled trials (RCTs) are not always available for the purposes of advertising measurement. Non-experimental data are thus required. However, Facebook and other ad platforms use complex and evolving processes to select ads for users. Therefore, successful non-experimental approaches need to “undo” this selection. We analyze 663 large-scale experiments at Facebook to investigate whether this is possible with the data typically logged at large ad platforms. With access to over 5,000 user-level features, these data are richer than what most advertisers or their measurement partners can access. We investigate how accurately two non-experimental methods—double/debiased machine learning (DML) and stratified propensity score matching (SPSM)—can recover the experimental effects. Although DML performs better than SPSM, neither method performs well, even using flexible deep learning models to implement the propensity and outcome models. The median RCT lifts are 29%, 18%, and 5% for the upper, middle, and lower funnel outcomes, respectively. Using DML (SPSM), the median lift by funnel is 83% (173%), 58% (176%), and 24% (64%), respectively, indicating significant relative measurement errors. We further characterize the circumstances under which each method performs comparatively better. Overall, despite having access to large-scale experiments and rich user-level data, we are unable to reliably estimate an ad campaign’s causal effect.
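
To make the two estimators concrete, the sketch below shows a minimal cross-fitted double/debiased ML (DML) estimate and a stratified propensity score matching (SPSM) estimate of an ad's average treatment effect. This is an illustration in the spirit of Chernozhukov et al. (2018), not the paper's implementation: the authors fit deep learning nuisance models over 5,000+ user features, whereas gradient-boosted trees stand in here, and the inputs X (user features), t (binary ad exposure), and y (conversion outcome) are assumed to be NumPy arrays. The function names and modeling choices are this sketch's own.

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, GradientBoostingRegressor
from sklearn.model_selection import KFold

def dml_ate(X, t, y, n_folds=5, clip=1e-3, seed=0):
    """Cross-fitted AIPW (DML) estimate of the average treatment effect."""
    psi = np.zeros(len(y))
    for train, test in KFold(n_folds, shuffle=True, random_state=seed).split(X):
        # Nuisance 1: propensity model e(x) = P(T=1 | X=x), clipped for stability.
        e_model = GradientBoostingClassifier().fit(X[train], t[train])
        e = np.clip(e_model.predict_proba(X[test])[:, 1], clip, 1 - clip)
        # Nuisance 2: outcome regressions m_t(x) = E[Y | T=t, X=x].
        m1 = GradientBoostingRegressor().fit(X[train][t[train] == 1],
                                             y[train][t[train] == 1])
        m0 = GradientBoostingRegressor().fit(X[train][t[train] == 0],
                                             y[train][t[train] == 0])
        m1_hat, m0_hat = m1.predict(X[test]), m0.predict(X[test])
        # Orthogonal (AIPW) score, evaluated on the held-out fold.
        psi[test] = (m1_hat - m0_hat
                     + t[test] * (y[test] - m1_hat) / e
                     - (1 - t[test]) * (y[test] - m0_hat) / (1 - e))
    return psi.mean(), psi.std(ddof=1) / np.sqrt(len(psi))  # ATE, std. error

def spsm_ate(X, t, y, n_strata=10):
    """Stratified propensity score matching (subclassification) estimate."""
    e = GradientBoostingClassifier().fit(X, t).predict_proba(X)[:, 1]
    edges = np.quantile(e, np.linspace(0, 1, n_strata + 1))  # stratum bounds
    strata = np.digitize(e, edges[1:-1])                     # bins 0..n_strata-1
    ate, total = 0.0, 0
    for s in range(n_strata):
        y1, y0 = y[(strata == s) & (t == 1)], y[(strata == s) & (t == 0)]
        if len(y1) and len(y0):                  # skip strata without overlap
            n_s = (strata == s).sum()
            ate += (y1.mean() - y0.mean()) * n_s
            total += n_s
    return ate / total

Under one common definition, the lift figures quoted above are incremental conversions relative to the counterfactual no-ads baseline, roughly lift ≈ 100 × ATE / (mean(y[t == 1]) − ATE); the paper's exact lift definition may differ in detail.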

Suggested Citation

  • Brett R. Gordon & Robert Moakler & Florian Zettelmeyer, 2023. "Close Enough? A Large-Scale Exploration of Non-Experimental Approaches to Advertising Measurement," Marketing Science, INFORMS, vol. 42(4), pages 768-793, July.
  • Handle: RePEc:inm:ormksc:v:42:y:2023:i:4:p:768-793
    DOI: 10.1287/mksc.2022.1413

    Download full text from publisher

    File URL: http://dx.doi.org/10.1287/mksc.2022.1413
    Download Restriction: no


    References listed on IDEAS

    1. George Gui & Harikesh Nair & Fengshi Niu, 2021. "Auction Throttling and Causal Inference of Online Advertising Effects," Papers 2112.15155, arXiv.org, revised Feb 2022.
    2. Victor Chernozhukov & Denis Chetverikov & Mert Demirer & Esther Duflo & Christian Hansen & Whitney Newey & James Robins, 2018. "Double/debiased machine learning for treatment and structural parameters," Econometrics Journal, Royal Economic Society, vol. 21(1), pages 1-68, February.
    3. James J. Heckman & Hidehiko Ichimura & Petra E. Todd, 1997. "Matching As An Econometric Evaluation Estimator: Evidence from Evaluating a Job Training Programme," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 64(4), pages 605-654.
    4. Brett R. Gordon & Florian Zettelmeyer & Neha Bhargava & Dan Chapsky, 2019. "A Comparison of Approaches to Advertising Measurement: Evidence from Big Field Experiments at Facebook," Marketing Science, INFORMS, vol. 38(2), pages 193-225, March.
    5. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    6. Thomas Blake & Chris Nosko & Steven Tadelis, 2015. "Consumer Heterogeneity and Paid Search Effectiveness: A Large‐Scale Field Experiment," Econometrica, Econometric Society, vol. 83, pages 155-174, January.
    7. Du, Ruihuan & Zhong, Yu & Nair, Harikesh S. & Cui, Bo & Shou, Ruyang, 2019. "Causally Driven Incremental Multi Touch Attribution Using a Recurrent Neural Network," Research Papers 3761, Stanford University, Graduate School of Business.
    8. Rajeev H. Dehejia & Sadek Wahba, 2002. "Propensity Score-Matching Methods For Nonexperimental Causal Studies," The Review of Economics and Statistics, MIT Press, vol. 84(1), pages 151-161, February.
    9. Susan Athey & Guido W. Imbens, 2019. "Machine Learning Methods That Economists Should Know About," Annual Review of Economics, Annual Reviews, vol. 11(1), pages 685-725, August.
    10. Caio Waisman & Harikesh S. Nair & Carlos Carrion, 2019. "Online Causal Inference for Advertising in Real-Time Bidding Auctions," Papers 1908.08600, arXiv.org, revised Feb 2024.
    11. Imbens, Guido W & Angrist, Joshua D, 1994. "Identification and Estimation of Local Average Treatment Effects," Econometrica, Econometric Society, vol. 62(2), pages 467-475, March.
    12. Guido W. Imbens, 2004. "Nonparametric Estimation of Average Treatment Effects Under Exogeneity: A Review," The Review of Economics and Statistics, MIT Press, vol. 86(1), pages 4-29, February.

    Citations

    Cited by:

    1. Guy Aridor & Rafael Jiménez-Durán & Ro'ee Levy & Lena Song, 2024. "The Economics of Social Media," CESifo Working Paper Series 10934, CESifo.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Brett R. Gordon & Robert Moakler & Florian Zettelmeyer, 2022. "Close Enough? A Large-Scale Exploration of Non-Experimental Approaches to Advertising Measurement," Papers 2201.07055, arXiv.org, revised Oct 2022.
    2. Michael Lechner, 2023. "Causal Machine Learning and its use for public policy," Swiss Journal of Economics and Statistics, Springer;Swiss Society of Economics and Statistics, vol. 159(1), pages 1-15, December.
    3. Goller, Daniel & Lechner, Michael & Moczall, Andreas & Wolff, Joachim, 2020. "Does the estimation of the propensity score by machine learning improve matching estimation? The case of Germany's programmes for long term unemployed," Labour Economics, Elsevier, vol. 65(C).
    4. Brett R. Gordon & Robert Moakler & Florian Zettelmeyer, 2023. "Predictive Incrementality by Experimentation (PIE) for Ad Measurement," Papers 2304.06828, arXiv.org.
    5. Huber, Martin, 2019. "An introduction to flexible methods for policy evaluation," FSES Working Papers 504, Faculty of Economics and Social Sciences, University of Freiburg/Fribourg Switzerland.
    6. Jones A.M & Rice N, 2009. "Econometric Evaluation of Health Policies," Health, Econometrics and Data Group (HEDG) Working Papers 09/09, HEDG, c/o Department of Economics, University of York.
    7. Brett R. Gordon & Florian Zettelmeyer & Neha Bhargava & Dan Chapsky, 2019. "A Comparison of Approaches to Advertising Measurement: Evidence from Big Field Experiments at Facebook," Marketing Science, INFORMS, vol. 38(2), pages 193-225, March.
    8. Zhexiao Lin & Fang Han, 2022. "On regression-adjusted imputation estimators of the average treatment effect," Papers 2212.05424, arXiv.org, revised Jan 2023.
    9. Dettmann, E. & Becker, C. & Schmeißer, C., 2011. "Distance functions for matching in small samples," Computational Statistics & Data Analysis, Elsevier, vol. 55(5), pages 1942-1960, May.
    10. Sant’Anna, Pedro H.C. & Zhao, Jun, 2020. "Doubly robust difference-in-differences estimators," Journal of Econometrics, Elsevier, vol. 219(1), pages 101-122.
    11. Fatema, Naureen, 2019. "Can land title reduce low-intensity interhousehold conflict incidences and associated damages in eastern DRC?," World Development, Elsevier, vol. 123(C), pages 1-1.
    12. Huber Martin & Wüthrich Kaspar, 2019. "Local Average and Quantile Treatment Effects Under Endogeneity: A Review," Journal of Econometric Methods, De Gruyter, vol. 8(1), pages 1-27, January.
    13. Flores, Carlos A. & Mitnik, Oscar A., 2009. "Evaluating Nonexperimental Estimators for Multiple Treatments: Evidence from Experimental Data," IZA Discussion Papers 4451, Institute of Labor Economics (IZA).
    14. Naureen Fatema & Shahriar Kibriya, 2018. "Givers of great dinners know few enemies: The impact of household food sufficiency and food sharing on low intensity interhousehold and community conflict in Eastern Democratic Republic of Congo," HiCN Working Papers 267, Households in Conflict Network.
    15. Adewale H. Adenuga & Claire Jack & Austen Ashfield & Michael Wallace, 2021. "Assessing the Impact of Participatory Extension Programme Membership on Farm Business Performance in Northern Ireland," Agriculture, MDPI, vol. 11(10), pages 1-12, September.
    16. Chad D. Meyerhoefer & Muzhe Yang, 2011. "The Relationship between Food Assistance and Health: A Review of the Literature and Empirical Strategies for Identifying Program Effects," Applied Economic Perspectives and Policy, Agricultural and Applied Economics Association, vol. 33(3), pages 304-344.
    17. Flores, Carlos A. & Flores-Lagunes, Alfonso, 2009. "Identification and Estimation of Causal Mechanisms and Net Effects of a Treatment under Unconfoundedness," IZA Discussion Papers 4237, Institute of Labor Economics (IZA).
    18. Huber, Martin & Meier, Jonas & Wallimann, Hannes, 2022. "Business analytics meets artificial intelligence: Assessing the demand effects of discounts on Swiss train tickets," Transportation Research Part B: Methodological, Elsevier, vol. 163(C), pages 22-39.
    19. George Gui & Harikesh Nair & Fengshi Niu, 2021. "Auction Throttling and Causal Inference of Online Advertising Effects," Papers 2112.15155, arXiv.org, revised Feb 2022.
    20. Martin Huber & Jannis Kueck, 2022. "Testing the identification of causal effects in observational data," Papers 2203.15890, arXiv.org, revised Jun 2023.
