
New statistical metrics for multisite replication projects

Author

Listed:
  • Maya B. Mathur
  • Tyler J. VanderWeele

Abstract

Increasingly, researchers are attempting to replicate published original studies by using large, multisite replication projects, at least 134 of which have been completed or are ongoing. These designs are promising for assessing whether the original study is statistically consistent with the replications and for reassessing the strength of evidence for the scientific effect of interest. However, existing analyses generally focus on single replications; when applied to multisite designs, they provide an incomplete view of aggregate evidence and can lead to misleading conclusions about replication success. We propose new statistical metrics representing firstly the probability that the original study's point estimate would be at least as extreme as it actually was, if in fact the original study were statistically consistent with the replications, and secondly the estimated proportion of population effects agreeing in direction with the original study. Generalized versions of the second metric enable consideration of only meaningfully strong population effects that agree in direction, or alternatively that disagree in direction, with the original study. These metrics apply when there are at least 10 replications (unless the heterogeneity estimate τ̂ = 0, in which case the metrics apply regardless of the number of replications). The first metric assumes normal population effects but appears robust to violations in simulations; the second is distribution free. We provide R packages (Replicate and MetaUtility).
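As a concrete illustration, here is a minimal sketch of how both metrics could be computed from summary statistics using the authors' packages. The data values are invented, and the argument names for Replicate::p_orig() and MetaUtility::prop_stronger() follow the CRAN documentation as best understood; verify them against ?p_orig and ?prop_stronger before use.

```r
# Minimal sketch (toy numbers, not real data): computing both metrics from
# summary statistics of an original study and a meta-analysis of its
# replications, all on the same effect-size scale.
library(Replicate)    # provides p_orig()
library(MetaUtility)  # provides prop_stronger()

orig.y  <- 0.60  # original study's point estimate
orig.vy <- 0.04  # estimated variance of the original estimate

yr  <- 0.15      # pooled point estimate from the replications
vyr <- 0.01      # estimated variance of the pooled estimate
t2  <- 0.05      # estimated heterogeneity (tau^2) across replications

# Metric 1: probability that the original point estimate would be at least
# as extreme as observed if the original study were statistically
# consistent with the replications.
p_orig(orig.y = orig.y, orig.vy = orig.vy, yr = yr, t2 = t2, vyr = vyr)

# Metric 2: estimated proportion of population effects beyond a threshold q.
# With q = 0 and tail = "above", this is the proportion agreeing in
# direction with a positive original estimate; a larger q (e.g. q = 0.2)
# restricts to meaningfully strong effects. The parametric option assumes
# normal population effects; se.t2 here is an invented toy value, and the
# package also offers a distribution-free calibrated estimate from
# replication-level data.
prop_stronger(q = 0, M = yr, t2 = t2, se.M = sqrt(vyr), se.t2 = 0.02,
              tail = "above", estimate.method = "parametric",
              ci.method = "parametric")
```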

Suggested Citation

  • Maya B. Mathur & Tyler J. VanderWeele, 2020. "New statistical metrics for multisite replication projects," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 183(3), pages 1145-1166, June.
  • Handle: RePEc:bla:jorssa:v:183:y:2020:i:3:p:1145-1166
    DOI: 10.1111/rssa.12572

    Download full text from publisher

    File URL: https://doi.org/10.1111/rssa.12572
    Download Restriction: no

    File URL: https://libkey.io/10.1111/rssa.12572?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. James J. Heckman & Jeffrey Smith & Nancy Clements, 1997. "Making The Most Out Of Programme Evaluations and Social Experiments: Accounting For Heterogeneity in Programme Impacts," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 64(4), pages 487-535.
    2. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
    3. Lynch, John G. & Bradlow, Eric T. & Huber, Joel C. & Lehmann, Donald R., 2015. "Reflections on the replication corner: In praise of conceptual replications," International Journal of Research in Marketing, Elsevier, vol. 32(4), pages 333-342.
    4. Alexander Etz & Joachim Vandekerckhove, 2016. "A Bayesian Perspective on the Reproducibility Project: Psychology," PLOS ONE, Public Library of Science, vol. 11(2), pages 1-12, February.
    5. Gavin B Stewart & Douglas G Altman & Lisa M Askie & Lelia Duley & Mark C Simmonds & Lesley A Stewart, 2012. "Statistical Analysis of Individual Participant Data Meta-Analyses: A Comparison of Methods and Recommendations for Practice," PLOS ONE, Public Library of Science, vol. 7(10), pages 1-8, October.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Samuel Pawel & Leonhard Held, 2022. "The sceptical Bayes factor for the assessment of replication success," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 84(3), pages 879-911, July.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Mathur, Maya B & VanderWeele, Tyler, 2018. "Statistical methods for evidence synthesis," Thesis Commons kd6ja, Center for Open Science.
    2. Nemati, Mehdi & Penn, Jerrod, 2020. "The impact of information-based interventions on conservation behavior: A meta-analysis," Resource and Energy Economics, Elsevier, vol. 62(C).
    3. Ismaël Mourifié & Marc Henry & Romuald Méango, 2020. "Sharp Bounds and Testability of a Roy Model of STEM Major Choices," Journal of Political Economy, University of Chicago Press, vol. 128(8), pages 3220-3283.
    4. James J. Heckman, 1991. "Randomization and Social Policy Evaluation Revisited," NBER Technical Working Papers 0107, National Bureau of Economic Research, Inc.
    5. Jeffrey Smith, 2000. "A Critical Survey of Empirical Methods for Evaluating Active Labor Market Policies," Swiss Journal of Economics and Statistics (SJES), Swiss Society of Economics and Statistics (SSES), vol. 136(III), pages 247-268, September.
    6. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    7. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    8. Yu Ding & Wayne S. DeSarbo & Dominique M. Hanssens & Kamel Jedidi & John G. Lynch & Donald R. Lehmann, 2020. "The past, present, and future of measurement and methods in marketing analysis," Marketing Letters, Springer, vol. 31(2), pages 175-186, September.
    9. Hoderlein, Stefan & White, Halbert, 2012. "Nonparametric identification in nonseparable panel data models with generalized fixed effects," Journal of Econometrics, Elsevier, vol. 168(2), pages 300-314.
    10. Gnangnon, Sèna Kimm, 2023. "The Least developed countries' TRIPS Waiver and the Strength of Intellectual Property Protection," EconStor Preprints 271537, ZBW - Leibniz Information Centre for Economics.
    11. Victor Chernozhukov & Iván Fernández‐Val & Blaise Melly, 2013. "Inference on Counterfactual Distributions," Econometrica, Econometric Society, vol. 81(6), pages 2205-2268, November.
    12. Manuel Arellano & Stéphane Bonhomme, 2017. "Quantile Selection Models With an Application to Understanding Changes in Wage Inequality," Econometrica, Econometric Society, vol. 85, pages 1-28, January.
    13. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    14. Pablo Lavado & Gonzalo Rivera, 2016. "Identifying Treatment Effects with Data Combination and Unobserved Heterogeneity," Working Papers 79, Peruvian Economic Association.
    15. Gabriella Conti & James J. Heckman & Rodrigo Pinto, 2016. "The Effects of Two Influential Early Childhood Interventions on Health and Healthy Behaviour," Economic Journal, Royal Economic Society, vol. 126(596), pages 28-65, October.
    16. Jiannan Lu & Peng Ding & Tirthankar Dasgupta, 2018. "Treatment Effects on Ordinal Outcomes: Causal Estimands and Sharp Bounds," Journal of Educational and Behavioral Statistics, vol. 43(5), pages 540-567, October.
    17. Andrew Chesher & Adam M. Rosen, 2021. "Counterfactual Worlds," Annals of Economics and Statistics, GENES, issue 142, pages 311-335.
    18. Heckman, James, 2001. "Accounting for Heterogeneity, Diversity and General Equilibrium in Evaluating Social Programmes," Economic Journal, Royal Economic Society, vol. 111(475), pages 654-699, November.
    19. Stefano DellaVigna & Elizabeth Linos, 2022. "RCTs to Scale: Comprehensive Evidence From Two Nudge Units," Econometrica, Econometric Society, vol. 90(1), pages 81-116, January.
    20. Stanley, T. D. & Doucouliagos, Chris, 2019. "Practical Significance, Meta-Analysis and the Credibility of Economics," IZA Discussion Papers 12458, Institute of Labor Economics (IZA).

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:bla:jorssa:v:183:y:2020:i:3:p:1145-1166. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Wiley Content Delivery (email available below). General contact details of provider: https://edirc.repec.org/data/rssssea.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.