Printed from https://ideas.repec.org/p/osf/metaar/kuhmz.html

The multiversal methodology as a remedy of the replication crisis

Author

Listed:
  • Cantone, Giulio Giacomo

Abstract

This manuscript is a comprehensive historical and theoretical examination of the development of ‘multiversal methods’ as a response to the replication crisis. Multiversal methods are statistical procedures designed to assess the uncertainty arising from analyst-driven decisions in inferential models based on statistical regressions. The replication crisis is the growing recognition that many studies fail to replicate the findings of previous studies, which has raised concerns about the reliability and credibility of scientific research, particularly in the social sciences and medicine. Section I provides a non-technical overview of the design of causal inference based on statistical regressions, and then outlines and comments on the procedures for computing multiversal statistics. Section II presents the historical and social context in which key epistemological innovations occurred that contributed to the development of the theories behind multiversal methods. The section argues why, and in what respects, these advancements drew on the epistemology of misinformation (‘bullshit epistemology’) for a sense of urgency about remedies to some enduring issues in scientific production: publication bias and p-hacking. Section III is a commentary on two relevant works within the paradigm of Open Science, outlining the limitations and challenges of this framework.
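The core idea behind the multiversal statistics described in the abstract can be illustrated with a minimal sketch (this is an illustration of the general technique, not the paper's own procedure): fit the same regression under every defensible combination of analyst choices — here, which control variables to include — and report the distribution of the focal estimate rather than a single number. All variable names and the simulated data are hypothetical.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulated data: outcome y, focal predictor x, three optional controls.
x = rng.normal(size=n)
controls = {f"c{i}": rng.normal(size=n) for i in range(1, 4)}
y = 0.5 * x + 0.3 * controls["c1"] + rng.normal(size=n)

def focal_estimate(y, x, ctrl_cols):
    """OLS via least squares; the coefficient after the intercept is the focal effect."""
    X = np.column_stack([np.ones(len(x)), x] + ctrl_cols)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(beta[1])

# The 'multiverse': one model per subset of controls (2^3 = 8 specifications).
multiverse = []
for k in range(len(controls) + 1):
    for subset in itertools.combinations(controls, k):
        multiverse.append(focal_estimate(y, x, [controls[c] for c in subset]))

print(len(multiverse))  # 8 specifications
# The multiversal summary is the whole distribution of estimates,
# e.g. its median and spread, rather than one cherry-picked value.
print(np.median(multiverse), np.ptp(multiverse))
```

In a real analysis the branching choices would also cover operationalizations of the variables, sample filters, and model families, so the number of specifications grows multiplicatively; the point of the method is to make that dispersion visible instead of reporting a single path through it.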

Suggested Citation

  • Cantone, Giulio Giacomo, 2023. "The multiversal methodology as a remedy of the replication crisis," MetaArXiv kuhmz, Center for Open Science.
  • Handle: RePEc:osf:metaar:kuhmz
    DOI: 10.31219/osf.io/kuhmz

    Download full text from publisher

    File URL: https://osf.io/download/6449c9093848536ac6495ab3/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/kuhmz?utm_source=ideas

    References listed on IDEAS

    1. Maren Duvendack & Richard Palmer-Jones & W. Robert Reed, 2017. "What Is Meant by "Replication" and Why Does It Encounter Resistance in Economics?," American Economic Review, American Economic Association, vol. 107(5), pages 46-51, May.
    2. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    3. Guido W. Imbens, 2021. "Statistical Significance, p-Values, and the Reporting of Uncertainty," Journal of Economic Perspectives, American Economic Association, vol. 35(3), pages 157-174, Summer.
    4. Megan L Head & Luke Holman & Rob Lanfear & Andrew T Kahn & Michael D Jennions, 2015. "The Extent and Consequences of P-Hacking in Science," PLOS Biology, Public Library of Science, vol. 13(3), pages 1-15, March.
    5. Lionel Page & Charles N. Noussair & Robert Slonim, 2021. "The replication crisis, the rise of new research practices and what it means for experimental economics," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 7(2), pages 210-225, December.
    6. Erik W. van Zwet & Eric A. Cator, 2021. "The significance filter, the winner's curse and the need to shrink," Statistica Neerlandica, Netherlands Society for Statistics and Operations Research, vol. 75(4), pages 437-452, November.
7. Cristobal Young, 2019. "The Difference Between Causal Analysis and Predictive Models: Response to "Comment on Young and Holsteen (2017)"," Sociological Methods & Research, vol. 48(2), pages 431-447, May.
    8. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick , 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    9. Lars Leszczensky & Tobias Wolbring, 2022. "How to Deal With Reverse Causality Using Panel Data? Recommendations for Researchers Based on a Simulation Study," Sociological Methods & Research, , vol. 51(2), pages 837-865, May.
    10. Horton, Joanne & Krishna Kumar, Dhanya & Wood, Anthony, 2020. "Detecting academic fraud using Benford law: The case of Professor James Hunton," Research Policy, Elsevier, vol. 49(8).
    11. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    12. Nosek, Brian A. & Ebersole, Charles R. & DeHaven, Alexander Carl & Mellor, David Thomas, 2018. "The Preregistration Revolution," OSF Preprints 2dxu5, Center for Open Science.
    13. Leamer, Edward E, 1985. "Sensitivity Analyses Would Help," American Economic Review, American Economic Association, vol. 75(3), pages 308-313, June.
    14. Monya Baker, 2016. "1,500 scientists lift the lid on reproducibility," Nature, Nature, vol. 533(7604), pages 452-454, May.
    15. Adam Slez, 2019. "The Difference Between Instability and Uncertainty: Comment on Young and Holsteen (2017)," Sociological Methods & Research, , vol. 48(2), pages 400-430, May.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Guillaume Coqueret, 2023. "Forking paths in financial economics," Papers 2401.08606, arXiv.org.
    2. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    3. Jasper Brinkerink, 2023. "When Shooting for the Stars Becomes Aiming for Asterisks: P-Hacking in Family Business Research," Entrepreneurship Theory and Practice, , vol. 47(2), pages 304-343, March.
    4. Graham Elliott & Nikolay Kudrin & Kaspar Wüthrich, 2022. "Detecting p‐Hacking," Econometrica, Econometric Society, vol. 90(2), pages 887-906, March.
    5. Bull, Charles & Courty, Pascal & Doyon, Maurice & Rondeau, Daniel, 2019. "Failure of the Becker–DeGroot–Marschak mechanism in inexperienced subjects: New tests of the game form misconception hypothesis," Journal of Economic Behavior & Organization, Elsevier, vol. 159(C), pages 235-253.
    6. Christophe Pérignon & Olivier Akmansoy & Christophe Hurlin & Anna Dreber & Felix Holzmeister & Juergen Huber & Magnus Johanneson & Michael Kirchler & Albert Menkveld & Michael Razen & Utz Weitzel, 2022. "Reproducibility of Empirical Results: Evidence from 1,000 Tests in Finance," Working Papers hal-03810013, HAL.
    7. Huber, Christoph & Kirchler, Michael, 2023. "Experiments in finance: A survey of historical trends," Journal of Behavioral and Experimental Finance, Elsevier, vol. 37(C).
    8. Graham Elliott & Nikolay Kudrin & Kaspar Wuthrich, 2022. "The Power of Tests for Detecting $p$-Hacking," Papers 2205.07950, arXiv.org, revised Apr 2024.
    9. Thibaut Arpinon & Marianne Lefebvre, 2024. "Registered Reports and Associated Benefits for Agricultural Economics," Post-Print hal-04635986, HAL.
    10. Anna Dreber & Magnus Johannesson & Yifan Yang, 2024. "Selective reporting of placebo tests in top economics journals," Economic Inquiry, Western Economic Association International, vol. 62(3), pages 921-932, July.
    11. Sarstedt, Marko & Adler, Susanne J., 2023. "An advanced method to streamline p-hacking," Journal of Business Research, Elsevier, vol. 163(C).
    12. Schweinsberg, Martin & Feldman, Michael & Staub, Nicola & van den Akker, Olmo R. & van Aert, Robbie C.M. & van Assen, Marcel A.L.M. & Liu, Yang & Althoff, Tim & Heer, Jeffrey & Kale, Alex & Mohamed, Z, 2021. "Same data, different conclusions: Radical dispersion in empirical results when independent analysts operationalize and test the same hypothesis," Organizational Behavior and Human Decision Processes, Elsevier, vol. 165(C), pages 228-249.
    13. Felix Holzmeister & Magnus Johannesson & Robert Böhm & Anna Dreber & Jürgen Huber & Michael Kirchler, 2023. "Heterogeneity in effect size estimates: Empirical evidence and practical implications," Working Papers 2023-17, Faculty of Economics and Statistics, Universität Innsbruck.
    14. Hensel, Przemysław G., 2021. "Reproducibility and replicability crisis: How management compares to psychology and economics – A systematic review of literature," European Management Journal, Elsevier, vol. 39(5), pages 577-594.
    15. Chin, Jason & Zeiler, Kathryn, 2021. "Replicability in Empirical Legal Research," LawArXiv 2b5k4, Center for Open Science.
    16. Brinkerink, Jasper & De Massis, Alfredo & Kellermanns, Franz, 2022. "One finding is no finding: Toward a replication culture in family business research," Journal of Family Business Strategy, Elsevier, vol. 13(4).
    17. Sébastien Duchêne & Adrien Nguyen-Huu & Dimitri Dubois & Marc Willinger, 2022. "Risk-return trade-offs in the context of environmental impact: a lab-in-the-field experiment with finance professionals," CEE-M Working Papers hal-03883121, CEE-M, Universtiy of Montpellier, CNRS, INRA, Montpellier SupAgro.
    18. Strømland, Eirik, 2019. "Preregistration and reproducibility," Journal of Economic Psychology, Elsevier, vol. 75(PA).
    19. Adler, Susanne Jana & Röseler, Lukas & Schöniger, Martina Katharina, 2023. "A toolbox to evaluate the trustworthiness of published findings," Journal of Business Research, Elsevier, vol. 167(C).
    20. Cantone, Giulio Giacomo & Tomaselli, Venera, 2024. "On the Coherence of Composite Indexes: Multiversal Model and Specification Analysis for an Index of Well-Being," MetaArXiv d5y26, Center for Open Science.


    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF. General contact details of provider: https://osf.io/preprints/metaarxiv .

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.