Printed from https://ideas.repec.org/p/osf/metaar/2bj85.html

Meta-analyzing non-preregistered and preregistered studies

Author

Listed:
  • van Aert, Robbie Cornelis Maria

Abstract

Preregistration is gaining ground in psychology, and one consequence is that preregistered studies are more often included in meta-analyses. Preregistered studies mitigate the effect of publication bias in a meta-analysis, because they can be located in the registries where they were registered even if they never get published. However, current meta-analysis methods do not take into account that preregistered studies are less susceptible to publication bias: traditional methods treat all studies as equivalent, whereas meta-analytic conclusions can be improved by taking advantage of the preregistered studies. The goal of this paper is to introduce a new method, the Hybrid Extended Meta-Analysis (HYEMA) method, which takes into account whether a study is preregistered and corrects for publication bias in only the non-preregistered studies. The proposed method is applied to two meta-analyses on prominent effects in the psychological literature: the red-romance hypothesis and money priming. Applying HYEMA to these meta-analyses shows that the average effect size estimate is substantially closer to zero than that of the random-effects meta-analysis model. Two simulation studies tailored to the two applications are also presented to illustrate the method's superior performance compared to the random-effects meta-analysis model when publication bias is present. Hence, I recommend always applying HYEMA as a sensitivity analysis when a mix of preregistered and non-preregistered studies is present in a meta-analysis. Software to facilitate application of the method is also developed and described in the paper.
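The abstract does not give the model's equations, but the general idea it describes can be sketched. Below is a hypothetical Python illustration, not the actual HYEMA implementation (the paper describes dedicated software): preregistered studies contribute an ordinary random-effects likelihood, while non-preregistered studies contribute a simple selection-model likelihood that conditions on having passed a one-sided significance threshold, i.e. it assumes only significant non-preregistered results were published. The data values and the alpha = .05 threshold are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Hypothetical data: effect estimates, sampling variances, preregistration status.
y   = np.array([0.45, 0.52, 0.60, 0.05, 0.02])
v   = np.array([0.04, 0.05, 0.06, 0.01, 0.01])
pre = np.array([False, False, False, True, True])

def neg_loglik(params):
    """Joint negative log-likelihood of a random-effects model in which
    non-preregistered studies are corrected with a simple selection model."""
    mu, log_tau2 = params
    tau2 = np.exp(log_tau2)            # between-study variance (kept positive)
    s = np.sqrt(v + tau2)              # marginal SD of each estimate
    ll = norm.logpdf(y, mu, s)         # ordinary random-effects contribution
    # Assumed selection mechanism: a non-preregistered study is published only
    # if significant one-sided at alpha = .05 (z > 1.645), so its density is
    # conditioned on exceeding that threshold.
    crit = 1.645 * np.sqrt(v)          # significance threshold on the y-scale
    log_p_sig = norm.logsf((crit - mu) / s)   # log P(y_i > crit_i | mu, tau2)
    ll = np.where(pre, ll, ll - log_p_sig)
    return -ll.sum()

res = minimize(neg_loglik, x0=[0.1, np.log(0.01)], method="Nelder-Mead")
mu_hat = res.x[0]

# Naive inverse-variance weighted (fixed-effect) average, for comparison:
mu_naive = np.sum(y / v) / np.sum(1 / v)
print(f"bias-corrected mean: {mu_hat:.3f}, naive mean: {mu_naive:.3f}")
```

In this toy example the corrected estimate is pulled toward the near-zero preregistered results, mirroring the paper's finding that the HYEMA estimate lies substantially closer to zero than the random-effects estimate.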

Suggested Citation

  • van Aert, Robbie Cornelis Maria, 2023. "Meta-analyzing non-preregistered and preregistered studies," MetaArXiv 2bj85, Center for Open Science.
  • Handle: RePEc:osf:metaar:2bj85
    DOI: 10.31219/osf.io/2bj85

    Download full text from publisher

    File URL: https://osf.io/download/656076fb932b9f132e760892/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/2bj85?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
    ---><---

    References listed on IDEAS

    1. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    2. Nosek, Brian A. & Ebersole, Charles R. & DeHaven, Alexander Carl & Mellor, David Thomas, 2018. "The Preregistration Revolution," OSF Preprints 2dxu5, Center for Open Science.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick , 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    2. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    3. Anna Dreber & Magnus Johannesson & Yifan Yang, 2024. "Selective reporting of placebo tests in top economics journals," Economic Inquiry, Western Economic Association International, vol. 62(3), pages 921-932, July.
    4. Shaw, Steven D. & Nave, Gideon, 2023. "Don't hate the player, hate the game: Realigning incentive structures to promote robust science and better scientific practices in marketing," Journal of Business Research, Elsevier, vol. 167(C).
    5. Moore, Don A. & Thau, Stefan & Zhong, Chenbo & Gino, Francesca, 2022. "Open Science at OBHDP," Organizational Behavior and Human Decision Processes, Elsevier, vol. 168(C).
    6. Felix Holzmeister & Magnus Johannesson & Robert Böhm & Anna Dreber & Jürgen Huber & Michael Kirchler, 2023. "Heterogeneity in effect size estimates: Empirical evidence and practical implications," Working Papers 2023-17, Faculty of Economics and Statistics, Universität Innsbruck.
    7. Chin, Jason & Zeiler, Kathryn, 2021. "Replicability in Empirical Legal Research," LawArchive 2b5k4_v1, Center for Open Science.
    8. Brinkerink, Jasper & De Massis, Alfredo & Kellermanns, Franz, 2022. "One finding is no finding: Toward a replication culture in family business research," Journal of Family Business Strategy, Elsevier, vol. 13(4).
    9. Sébastien Duchêne & Adrien Nguyen-Huu & Dimitri Dubois & Marc Willinger, 2022. "Risk-return trade-offs in the context of environmental impact: a lab-in-the-field experiment with finance professionals," CEE-M Working Papers hal-03883121, CEE-M, University of Montpellier, CNRS, INRA, Montpellier SupAgro.
    10. Strømland, Eirik, 2019. "Preregistration and reproducibility," Journal of Economic Psychology, Elsevier, vol. 75(PA).
    11. Cantone, Giulio Giacomo, 2023. "The multiversal methodology as a remedy of the replication crisis," MetaArXiv kuhmz, Center for Open Science.
    12. Balafoutas, Loukas & Celse, Jeremy & Karakostas, Alexandros & Umashev, Nicholas, 2025. "Incentives and the replication crisis in social sciences: A critical review of open science practices," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 114(C).
    13. Adler, Susanne Jana & Röseler, Lukas & Schöniger, Martina Katharina, 2023. "A toolbox to evaluate the trustworthiness of published findings," Journal of Business Research, Elsevier, vol. 167(C).
    15. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    16. Alexandru Marcoci & David P. Wilkinson & Ans Vercammen & Bonnie C. Wintle & Anna Lou Abatayo & Ernest Baskin & Henk Berkman & Erin M. Buchanan & Sara Capitán & Tabaré Capitán & Ginny Chan & Kent Jason, 2025. "Predicting the replicability of social and behavioural science claims in COVID-19 preprints," Nature Human Behaviour, Nature, vol. 9(2), pages 287-304, February.
    17. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    18. Tom Coupé & W. Robert Reed, 2021. "Do Negative Replications Affect Citations?," Working Papers in Economics 21/14, University of Canterbury, Department of Economics and Finance.
    19. Mueller-Langer, Frank & Andreoli-Versbach, Patrick, 2018. "Open access to research data: Strategic delay and the ambiguous welfare effects of mandatory data disclosure," Information Economics and Policy, Elsevier, vol. 42(C), pages 20-34.
    20. Jindrich Matousek & Tomas Havranek & Zuzana Irsova, 2022. "Individual discount rates: a meta-analysis of experimental evidence," Experimental Economics, Springer;Economic Science Association, vol. 25(1), pages 318-358, February.
    21. Nick Huntington‐Klein & Andreu Arenas & Emily Beam & Marco Bertoni & Jeffrey R. Bloem & Pralhad Burli & Naibin Chen & Paul Grieco & Godwin Ekpe & Todd Pugatch & Martin Saavedra & Yaniv Stopnitzky, 2021. "The influence of hidden researcher decisions in applied microeconomics," Economic Inquiry, Western Economic Association International, vol. 59(3), pages 944-960, July.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:metaar:2bj85. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF (email available below). General contact details of provider: https://osf.io/preprints/metaarxiv .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.