Printed from https://ideas.repec.org/p/arx/papers/2112.09170.html

Reinforcing RCTs with Multiple Priors while Learning about External Validity

Authors:
  • Frederico Finan
  • Demian Pouzo

Abstract

This paper presents a framework for incorporating prior sources of information into the design of a sequential experiment. These sources can include previous experiments, expert opinions, or the experimenter's own introspection. We formalize this problem using a Bayesian approach that maps each source to a Bayesian model. These models are aggregated according to their associated posterior probabilities. We evaluate a broad class of policy rules according to three criteria: whether the experimenter learns the parameters of the payoff distributions, the probability that the experimenter chooses the wrong treatment when deciding to stop the experiment, and the average rewards. We show that our framework exhibits several desirable finite-sample theoretical guarantees, including robustness to any source that is not externally valid.
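The aggregation scheme the abstract describes can be loosely sketched in code. The following is a hypothetical illustration, not the authors' algorithm: it assumes a two-armed Bernoulli bandit, encodes each prior "source" as a pair of Beta priors, reweights sources by their one-step predictive likelihood (a stand-in for the posterior model probabilities in the paper), and selects treatments with a Thompson-style draw from the resulting mixture. All numerical values and the specific update rule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: two treatments with unknown Bernoulli payoff rates,
# and three prior "sources" (e.g. past experiments, expert opinion),
# each encoded as Beta(a, b) priors over the two arms.
true_rates = [0.40, 0.55]                                     # unknown to the experimenter
sources = [
    {"a": np.array([2.0, 8.0]), "b": np.array([8.0, 2.0])},   # favors arm 1
    {"a": np.array([8.0, 2.0]), "b": np.array([2.0, 8.0])},   # not externally valid here
    {"a": np.array([1.0, 1.0]), "b": np.array([1.0, 1.0])},   # uninformative
]
log_weight = np.zeros(len(sources))   # log model probabilities, uniform start
successes = np.zeros(2)
failures = np.zeros(2)

for t in range(2000):
    # Aggregate: normalize the sources' model probabilities.
    w = np.exp(log_weight - log_weight.max())
    w /= w.sum()
    # Thompson-style draw from the mixture: pick a source by weight,
    # then sample arm means from that source's current posterior.
    k = rng.choice(len(sources), p=w)
    theta = rng.beta(sources[k]["a"] + successes, sources[k]["b"] + failures)
    arm = int(theta.argmax())
    reward = rng.random() < true_rates[arm]
    # Reweight each source by its one-step predictive probability of the outcome,
    # so sources that predict poorly (e.g. not externally valid) lose weight.
    for j, s in enumerate(sources):
        a = s["a"][arm] + successes[arm]
        b = s["b"][arm] + failures[arm]
        p = a / (a + b)
        log_weight[j] += np.log(p if reward else 1.0 - p)
    successes[arm] += reward
    failures[arm] += 1 - reward

w = np.exp(log_weight - log_weight.max())
w /= w.sum()
print("posterior source weights:", np.round(w, 3))
print("arm pulls:", successes + failures)
```

As the data accumulate, poorly predicting sources are down-weighted, so the mixture is driven by sources consistent with the current experiment; this mirrors, in spirit, the paper's robustness claim for sources that are not externally valid.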

Suggested Citation

  • Frederico Finan & Demian Pouzo, 2021. "Reinforcing RCTs with Multiple Priors while Learning about External Validity," Papers 2112.09170, arXiv.org, revised Mar 2023.
  • Handle: RePEc:arx:papers:2112.09170

    Download full text from publisher

    File URL: http://arxiv.org/pdf/2112.09170
    File Function: Latest version
    Download Restriction: no

    References listed on IDEAS

    1. Ashley L. Buchanan & Michael G. Hudgens & Stephen R. Cole & Katie R. Mollan & Paul E. Sax & Eric S. Daar & Adaora A. Adimora & Joseph J. Eron & Michael J. Mugavero, 2018. "Generalizing evidence from randomized trials using inverse probability of sampling weights," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 181(4), pages 1193-1209, October.
    2. Rajeev Dehejia & Cristian Pop-Eleches & Cyrus Samii, 2021. "From Local to Global: External Validity in a Fertility Natural Experiment," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 39(1), pages 217-243, January.
    3. Karlan, Dean & List, John A., 2020. "How can Bill and Melinda Gates increase other people's donations to fund public goods?," Journal of Public Economics, Elsevier, vol. 191(C).
    4. Stefano DellaVigna & Nicholas Otis & Eva Vivalt, 2020. "Forecasting the Results of Experiments: Piloting an Elicitation Strategy," AEA Papers and Proceedings, American Economic Association, vol. 110, pages 75-79, May.
    5. Stefano DellaVigna & Devin Pope, 2018. "Predicting Experimental Results: Who Knows What?," Journal of Political Economy, University of Chicago Press, vol. 126(6), pages 2410-2456.
    6. Eva Vivalt, 2020. "How Much Can We Generalize From Impact Evaluations?," Journal of the European Economic Association, European Economic Association, vol. 18(6), pages 3045-3089.
    7. Julia Chabrier & Sarah Cohodes & Philip Oreopoulos, 2016. "What Can We Learn from Charter School Lotteries?," Journal of Economic Perspectives, American Economic Association, vol. 30(3), pages 57-84, Summer.
    8. Amanda Kowalski, 2016. "Doing more when you're running LATE: Applying marginal treatment effect methods to examine treatment effect heterogeneity in experiments," Artefactual Field Experiments 00560, The Field Experiments Website.
    9. Maximilian Kasy & Anja Sautmann, 2021. "Adaptive Treatment Assignment in Experiments for Policy Choice," Econometrica, Econometric Society, vol. 89(1), pages 113-132, January.
    10. Epstein, Larry G. & Schneider, Martin, 2003. "Recursive multiple-priors," Journal of Economic Theory, Elsevier, vol. 113(1), pages 1-31, November.
    11. Dean Karlan & John A. List, 2007. "Does Price Matter in Charitable Giving? Evidence from a Large-Scale Natural Field Experiment," American Economic Review, American Economic Association, vol. 97(5), pages 1774-1793, December.
    12. Susan Athey & Guido W. Imbens, 2019. "Machine Learning Methods That Economists Should Know About," Annual Review of Economics, Annual Reviews, vol. 11(1), pages 685-725, August.
    13. Abhijit Banerjee & Dean Karlan & Jonathan Zinman, 2015. "Six Randomized Evaluations of Microcredit: Introduction and Further Steps," American Economic Journal: Applied Economics, American Economic Association, vol. 7(1), pages 1-21, January.
    14. James Bisbee & Rajeev Dehejia & Cristian Pop-Eleches & Cyrus Samii, 2017. "Local Instruments, Global Extrapolation: External Validity of the Labor Supply-Fertility Local Average Treatment Effect," Journal of Labor Economics, University of Chicago Press, vol. 35(S1), pages 99-147.
    15. Elizabeth A. Stuart & Stephen R. Cole & Catherine P. Bradshaw & Philip J. Leaf, 2011. "The use of propensity scores to assess the generalizability of results from randomized trials," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 174(2), pages 369-386, April.
    16. Hotz, V. Joseph & Imbens, Guido W. & Mortimer, Julie H., 2005. "Predicting the efficacy of future training programs using past experiences at other locations," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 241-270.

    Citations

    Citations are extracted by the CitEc Project.

    Cited by:

    1. Esposito Acosta, Bruno Nicola & Sautmann, Anja, 2022. "Adaptive Experiments for Policy Choice: Phone Calls for Home Reading in Kenya," Policy Research Working Paper Series 10098, The World Bank.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Xinkun Nie & Guido Imbens & Stefan Wager, 2021. "Covariate Balancing Sensitivity Analysis for Extrapolating Randomized Trials across Locations," Papers 2112.04723, arXiv.org.
    2. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    3. Denis Fougère & Nicolas Jacquemet, 2020. "Policy Evaluation Using Causal Inference Methods," SciencePo Working papers Main hal-03455978, HAL.
    4. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    5. Isaiah Andrews & Emily Oster, 2017. "A Simple Approximation for Evaluating External Validity Bias," NBER Working Papers 23826, National Bureau of Economic Research, Inc.
    6. Rajeev Dehejia & Cristian Pop-Eleches & Cyrus Samii, 2021. "From Local to Global: External Validity in a Fertility Natural Experiment," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 39(1), pages 217-243, January.
    7. Andrews, Isaiah & Oster, Emily, 2019. "A simple approximation for evaluating external validity bias," Economics Letters, Elsevier, vol. 178(C), pages 58-62.
    8. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    9. Takuya Ishihara & Toru Kitagawa, 2021. "Evidence Aggregation for Treatment Choice," Papers 2108.06473, arXiv.org.
    10. Ashis Das & Jed Friedman & Eeshani Kandpal, 2018. "Does involvement of local NGOs enhance public service delivery? Cautionary evidence from a malaria‐prevention program in India," Health Economics, John Wiley & Sons, Ltd., vol. 27(1), pages 172-188, January.
    11. Gabriel Okasa & Kenneth A. Younge, 2022. "Sample Fit Reliability," Papers 2209.06631, arXiv.org.
    12. Daido Kido, 2022. "Distributionally Robust Policy Learning with Wasserstein Distance," Papers 2205.04637, arXiv.org, revised Aug 2022.
    13. Stefano DellaVigna & Elizabeth Linos, 2022. "RCTs to Scale: Comprehensive Evidence From Two Nudge Units," Econometrica, Econometric Society, vol. 90(1), pages 81-116, January.
    14. Michael Lechner, 2023. "Causal Machine Learning and its use for public policy," Swiss Journal of Economics and Statistics, Springer;Swiss Society of Economics and Statistics, vol. 159(1), pages 1-15, December.
    15. Levin, Tova & Levitt, Steven D. & List, John A., 2023. "A Glimpse into the world of high capacity givers: Experimental evidence from a university capital campaign," Journal of Economic Behavior & Organization, Elsevier, vol. 212(C), pages 644-658.
    16. Cristina Corduneanu-Huci & Michael T. Dorsch & Paul Maarek, 2017. "Learning to constrain: Political competition and randomized controlled trials in development," THEMA Working Papers 2017-24, THEMA (THéorie Economique, Modélisation et Applications), Université de Cergy-Pontoise.
    17. Arthur Charpentier & Romuald Élie & Carl Remlinger, 2023. "Reinforcement Learning in Economics and Finance," Computational Economics, Springer;Society for Computational Economics, vol. 62(1), pages 425-462, June.
    18. Andor, Mark A. & Gerster, Andreas & Peters, Jörg, 2022. "Information campaigns for residential energy conservation," European Economic Review, Elsevier, vol. 144(C).
    19. Indranil Goswami, 2020. "No Substitute for the Real Thing: The Importance of In-Context Field Experiments in Fundraising," Marketing Science, INFORMS, vol. 39(6), pages 1052-1070, November.
    20. David Fielding & Stephen Knowles & Ronald Peeters, 2022. "In search of competitive givers," Southern Economic Journal, John Wiley & Sons, vol. 88(4), pages 1517-1548, April.



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.