Printed from https://ideas.repec.org/p/arx/papers/2602.20581.html

Using Prior Studies to Design Experiments: An Empirical Bayes Approach

Author

Listed:
  • Zhiheng You

Abstract

We develop an empirical Bayes framework for experimental design that leverages information from prior related studies. When a researcher has access to estimates from previous studies of similar parameters, they can use empirical Bayes to estimate an informative prior over the parameter of interest in the new study. We show how this prior can be incorporated into a decision-theoretic experimental design framework to choose an optimal design. The approach is illustrated via propensity score designs in stratified randomized experiments. Our theoretical results show that the empirical Bayes design achieves oracle-optimal performance as the number of prior studies grows, and characterize the rate at which regret vanishes. To illustrate the approach, we present two empirical applications: oncology drug trials and the Tennessee Project STAR experiment. Our framework connects the Bayesian meta-analysis literature to experimental design and provides practical guidance for researchers seeking to design more efficient experiments.
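The general idea in the abstract can be sketched in a few lines. The sketch below is an illustration, not the paper's actual procedure: it assumes a normal-normal model, hypothetical prior-study estimates and standard errors, a DerSimonian-Laird-style method-of-moments estimate of the prior hyperparameters, and a made-up linear sampling cost. The design step then picks the sample size for the new experiment that minimizes expected posterior variance plus cost.

```python
import numpy as np

# Hypothetical point estimates and standard errors from K = 5 prior studies.
est = np.array([0.12, 0.30, 0.05, 0.22, 0.18])
se = np.array([0.08, 0.10, 0.07, 0.09, 0.06])

# Empirical Bayes hyperparameters for a normal prior theta ~ N(mu, tau2),
# via precision-weighted mean and a DerSimonian-Laird moment estimator.
w = 1.0 / se**2
mu = np.sum(w * est) / np.sum(w)
q = np.sum(w * (est - mu) ** 2)  # Cochran's Q statistic
k = len(est)
tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

# Decision-theoretic design: with outcome variance sigma2 per observation,
# the posterior variance of theta after n observations is
# 1 / (1/tau2 + n/sigma2); choose n to trade this off against sampling cost.
sigma2 = 1.0   # assumed outcome variance in the new study
cost = 1e-6    # assumed cost per observation, in variance units
n_grid = np.arange(10, 5001)
post_var = 1.0 / (1.0 / max(tau2, 1e-12) + n_grid / sigma2)
n_star = n_grid[np.argmin(post_var + cost * n_grid)]
print(mu, tau2, n_star)
```

A tighter estimated prior (smaller tau2) makes the prior more informative, so the cost-minimizing sample size for the new experiment shrinks; the paper's framework formalizes this trade-off and studies how well the plug-in prior approximates the oracle design.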

Suggested Citation

  • Zhiheng You, 2026. "Using Prior Studies to Design Experiments: An Empirical Bayes Approach," Papers 2602.20581, arXiv.org.
  • Handle: RePEc:arx:papers:2602.20581

    Download full text from publisher

    File URL: http://arxiv.org/pdf/2602.20581
    File Function: Latest version
    Download Restriction: no

    References listed on IDEAS

    1. Tommaso Crosta & Dean Karlan & Finley Ong & Julius Rüschenpöhler & Christopher R. Udry, 2024. "Unconditional Cash Transfers: A Bayesian Meta-Analysis of Randomized Evaluations in Low and Middle Income Countries," NBER Working Papers 32779, National Bureau of Economic Research, Inc.
    2. Raj Chetty & John N. Friedman & Jonah E. Rockoff, 2014. "Measuring the Impacts of Teachers I: Evaluating Bias in Teacher Value-Added Estimates," American Economic Review, American Economic Association, vol. 104(9), pages 2593-2632, September.
    3. Heinz Schmidli & Sandro Gsteiger & Satrajit Roychoudhury & Anthony O'Hagan & David Spiegelhalter & Beat Neuenschwander, 2014. "Robust meta-analytic-predictive priors in clinical trials with historical control information," Biometrics, The International Biometric Society, vol. 70(4), pages 1023-1032, December.
    4. Annie Liang & Xiaosheng Mu & Vasilis Syrgkanis, 2022. "Dynamically Aggregating Diverse Information," Econometrica, Econometric Society, vol. 90(1), pages 47-80, January.
    5. Feng, Long & Dicker, Lee H., 2018. "Approximate nonparametric maximum likelihood for mixture models: A convex optimization approach to fitting arbitrary multivariate mixing distributions," Computational Statistics & Data Analysis, Elsevier, vol. 122(C), pages 80-91.
    6. Christopher Adjaho & Timothy Christensen, 2022. "Externally Valid Policy Choice," Papers 2205.05561, arXiv.org, revised Nov 2025.
    7. Krueger, Alan B & Whitmore, Diane M, 2001. "The Effect of Attending a Small Class in the Early Grades on College-Test Taking and Middle School Test Results: Evidence from Project STAR," Economic Journal, Royal Economic Society, vol. 111(468), pages 1-28, January.
    8. Frederico Finan & Demian Pouzo, 2021. "Reinforcing RCTs with Multiple Priors while Learning about External Validity," Papers 2112.09170, arXiv.org, revised Sep 2024.
    9. Raj Chetty & John N. Friedman & Jonah E. Rockoff, 2014. "Measuring the Impacts of Teachers II: Teacher Value-Added and Student Outcomes in Adulthood," American Economic Review, American Economic Association, vol. 104(9), pages 2633-2679, September.
    10. Raj Chetty & Nathaniel Hendren, 2018. "The Impacts of Neighborhoods on Intergenerational Mobility I: Childhood Exposure Effects," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 133(3), pages 1107-1162.
    11. A Stefano Caria & Grant Gordon & Maximilian Kasy & Simon Quinn & Soha Osman Shami & Alexander Teytelboym, 2024. "An Adaptive Targeted Field Experiment: Job Search Assistance for Refugees in Jordan," Journal of the European Economic Association, European Economic Association, vol. 22(2), pages 781-836.
    12. Maximilian Kasy & Anja Sautmann, 2021. "Adaptive Treatment Assignment in Experiments for Policy Choice," Econometrica, Econometric Society, vol. 89(1), pages 113-132, January.
13. Jiaying Gu & Roger Koenker, 2017. "REBayes: An R Package for Empirical Bayes Mixture Methods," CeMMAP working papers CWP37/17, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    14. Brian P. Hobbs & Bradley P. Carlin & Sumithra J. Mandrekar & Daniel J. Sargent, 2011. "Hierarchical Commensurate and Power Prior Models for Adaptive Incorporation of Historical Information in Clinical Trials," Biometrics, The International Biometric Society, vol. 67(3), pages 1047-1056, September.
    15. Konrad Menzel, 2023. "Transfer Estimates for Causal Effects across Heterogeneous Sites," Papers 2305.01435, arXiv.org, revised Oct 2025.
    16. Rachael Meager, 2019. "Understanding the Average Impact of Microcredit Expansions: A Bayesian Hierarchical Analysis of Seven Randomized Experiments," American Economic Journal: Applied Economics, American Economic Association, vol. 11(1), pages 57-91, January.
    17. Alan B. Krueger, 1999. "Experimental Estimates of Education Production Functions," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 114(2), pages 497-532.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Walters, Christopher, 2024. "Empirical Bayes methods in labor economics," Handbook of Labor Economics,, Elsevier.
    2. Stephen Machin & Sandra McNally & Martina Viarengo, 2016. ""Teaching to teach" literacy," CEP Discussion Papers dp1425, Centre for Economic Performance, LSE.
    3. Maxime Fajeau & Julien Grenet & Emma Laveissière & Orane Leonetti, 2025. "Efficacité des politiques éducatives : sources et hypothèses de calcul," Institut des Politiques Publiques halshs-05458929, HAL.
    4. Cavanagh,Jack & Fliegner,Jasmin Claire & Kopper,Sarah & Sautmann,Anja, 2023. "A Metadata Schema for Data from Experiments in the Social Sciences," Policy Research Working Paper Series 10296, The World Bank.
    5. Juan C. Yamin, 2025. "Poverty Targeting with Imperfect Information," Papers 2506.18188, arXiv.org.
    6. Michael Gechter & Keisuke Hirano & Jean Lee & Mahreen Mahmud & Orville Mondal & Jonathan Morduch & Saravana Ravindran & Abu S. Shonchoy, 2024. "Selecting Experimental Sites for External Validity," Papers 2405.13241, arXiv.org.
    7. Stephen Machin & Sandra McNally & Martina Viarengo, 2018. "Changing How Literacy Is Taught: Evidence on Synthetic Phonics," American Economic Journal: Economic Policy, American Economic Association, vol. 10(2), pages 217-241, May.
    8. Taylor, Eric, 2014. "Spending more of the school day in math class: Evidence from a regression discontinuity in middle school," Journal of Public Economics, Elsevier, vol. 117(C), pages 162-181.
    9. Meager, Rachael, 2022. "Aggregating distributional treatment effects: a Bayesian hierarchical analysis of the microcredit literature," LSE Research Online Documents on Economics 115559, London School of Economics and Political Science, LSE Library.
    10. Holla,Alaka & Bendini,Maria Magdalena & Dinarte Diaz,Lelys Ileana & Trako,Iva, 2021. "Is Investment in Preprimary Education Too Low ? Lessons from (Quasi) ExperimentalEvidence across Countries," Policy Research Working Paper Series 9723, The World Bank.
    12. A Stefano Caria & Grant Gordon & Maximilian Kasy & Simon Quinn & Soha Osman Shami & Alexander Teytelboym, 2024. "An Adaptive Targeted Field Experiment: Job Search Assistance for Refugees in Jordan," Journal of the European Economic Association, European Economic Association, vol. 22(2), pages 781-836.
    13. Duque, Valentina & Gilraine, Michael, 2022. "Coal use, air pollution, and student performance," Journal of Public Economics, Elsevier, vol. 213(C).
    14. Koedel, Cory & Mihaly, Kata & Rockoff, Jonah E., 2015. "Value-added modeling: A review," Economics of Education Review, Elsevier, vol. 47(C), pages 180-195.
    15. Michael Geruso & Timothy J. Layton & Jacob Wallace, 2023. "What Difference Does a Health Plan Make? Evidence from Random Plan Assignment in Medicaid," American Economic Journal: Applied Economics, American Economic Association, vol. 15(3), pages 341-379, July.
    16. Raffaella Giacomini & Sokbae Lee & Silvia Sarpietro, 2023. "Individual Shrinkage for Random Effects," Papers 2308.01596, arXiv.org, revised Jul 2025.
    17. Valentin Verdier, 2020. "Estimation and Inference for Linear Models with Two-Way Fixed Effects and Sparsely Matched Data," The Review of Economics and Statistics, MIT Press, vol. 102(1), pages 1-16, March.
    18. Johnston, Andrew C., 2021. "Preferences, Selection, and the Structure of Teacher Pay," IZA Discussion Papers 14831, IZA Network @ LISER.
    19. Kedagni, Desire & Krishna, Kala & Megalokonomou, Rigissa & Zhao, Yingyan, 2021. "Does class size matter? How, and at what cost?," European Economic Review, Elsevier, vol. 133(C).
    20. Samuel Berlinski & Norbert Schady, 2015. "Daycare Services: It’s All about Quality," Palgrave Macmillan Books, in: Samuel Berlinski & Norbert Schady (ed.), The Early Years, chapter 4, pages 91-119, Palgrave Macmillan.
    21. Patrick Kline & Christopher Walters, 2021. "Reasonable Doubt: Experimental Detection of Job‐Level Employment Discrimination," Econometrica, Econometric Society, vol. 89(2), pages 765-792, March.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:arx:papers:2602.20581. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows your profile to be linked to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: arXiv administrators (email available below). General contact details of provider: http://arxiv.org/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.