Printed from https://ideas.repec.org/p/osf/metaar/7uskw_v1.html

Mapping the Open-Source Landscape: A Systematic Mapping and Ecosystem Analysis of Evidence Synthesis Software

Author

Listed:
  • Sahu, Vihaan

Abstract

Objective: The evidence synthesis software landscape is defined by the tension between the reproducibility of open-source software (OSS) and the usability of proprietary "black box" solutions. While OSS is critical for transparency, its structural dynamics remain unquantified. This study systematically mapped and analyzed the OSS ecosystem to characterize its growth, technological stratification, and structural gaps.

Methods: A systematic mapping of Google Scholar, PubMed, CRAN, and GitHub (inception to February 2026) identified 277 OSS tools meeting strict criteria for public repositories and OSI-approved licenses. These were compared to 90 proprietary tools. The analysis used quasi-Poisson regression to model innovation rates, chi-square tests for workflow-language dependencies, and descriptive analysis for ecosystem architectures.

Results: The market is dichotomous: proprietary tools predominantly featured "integrated platforms" (37.8%), which were nearly absent in OSS (0.7%). Quasi-Poisson regression (2000–2025) revealed significant acceleration in OSS development (incidence rate ratio [IRR] = 1.158, 95% CI 1.12–1.20), driven by Python-based machine learning tools. A significant association was found between programming language and workflow stage (χ² = 122.4, df = 48, p < 0.001), with Python dominating screening/extraction and R dominating analysis.

Conclusion: The ecosystem is structurally split between monolithic proprietary suites and a modular, rapidly expanding OSS architecture. Although OSS is maturing, it suffers from fragmentation and a "modularity gap": a lack of integrated user interfaces. This study provides a quantitative framework for research infrastructure, highlighting the need for interoperability standards to bridge the OSS gap and support fully reproducible workflows.
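The two statistical techniques the abstract names (quasi-Poisson regression yielding an incidence rate ratio, and a chi-square test of language-by-stage independence) can be sketched in Python with NumPy alone. This is an illustrative sketch, not the study's code: all counts below are synthetic, and the planted yearly growth rate of 1.15 merely mirrors the reported IRR of 1.158.

```python
import numpy as np

# Synthetic yearly counts of new OSS tools, 2000-2025 (hypothetical data).
years = np.arange(2000, 2026)
rng = np.random.default_rng(0)
true_irr = 1.15  # assumed ~15% yearly growth, mirroring the reported IRR
counts = rng.poisson(2.0 * true_irr ** (years - 2000))

# Poisson log-linear model, log E[count] = b0 + b1 * (year - 2000),
# fitted by iteratively reweighted least squares (IRLS).
t = (years - 2000).astype(float)
X = np.column_stack([np.ones_like(t), t])
beta = np.linalg.lstsq(X, np.log(counts + 0.5), rcond=None)[0]  # warm start
for _ in range(50):
    mu = np.exp(X @ beta)
    z = X @ beta + (counts - mu) / mu          # working response (log link)
    beta = np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (mu * z))

mu = np.exp(X @ beta)
irr = np.exp(beta[1])                          # incidence rate ratio per year

# Quasi-Poisson: inflate the covariance by the Pearson dispersion estimate.
dispersion = np.sum((counts - mu) ** 2 / mu) / (len(counts) - 2)
cov = dispersion * np.linalg.inv(X.T @ (mu[:, None] * X))
ci = np.exp(beta[1] + np.array([-1.96, 1.96]) * np.sqrt(cov[1, 1]))
print(f"IRR per year: {irr:.3f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")

# Chi-square test of independence (language x workflow stage), by hand.
observed = np.array([[40, 10],                 # hypothetical Python tool counts
                     [12, 35]])                # hypothetical R tool counts
expected = np.outer(observed.sum(1), observed.sum(0)) / observed.sum()
chi2 = np.sum((observed - expected) ** 2 / expected)
df = (observed.shape[0] - 1) * (observed.shape[1] - 1)
print(f"chi-square = {chi2:.1f}, df = {df}")
```

The quasi-Poisson choice matters here: tool-release counts are typically overdispersed, and scaling the covariance by the Pearson dispersion widens the confidence interval accordingly while leaving the IRR point estimate unchanged.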

Suggested Citation

  • Sahu, Vihaan, 2026. "Mapping the Open-Source Landscape: A Systematic Mapping and Ecosystem Analysis of Evidence Synthesis Software," MetaArXiv 7uskw_v1, Center for Open Science.
  • Handle: RePEc:osf:metaar:7uskw_v1
    DOI: 10.31219/osf.io/7uskw_v1

    Download full text from publisher

    File URL: https://osf.io/download/69a6f919df28a2d9c4e7d1ac/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/7uskw_v1?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a location where your library subscription provides access to this item

    References listed on IDEAS

    1. Nosek, BA & Alter, G & Banks, GC & Borsboom, D & Bowman, SD & Breckler, SJ & Buck, S & Chambers, CD & Chin, G & Christensen, G & Contestabile, M & Dafoe, A & Eich, E & Freese, J & Glennerster, R & Gor, 2015. "Promoting an open research culture," Department of Economics, Working Paper Series qt7wh1000s, Department of Economics, Institute for Business and Economic Research, UC Berkeley.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Caicedo, Juan D. & Guirado, Carlos & González, Marta C. & Walker, Joan L., 2025. "Sharing, collaborating, and benchmarking to advance travel demand research: A demonstration of short-term ridership prediction," Transport Policy, Elsevier, vol. 171(C), pages 531-541.
    2. Rahal, Rima-Maria, 2025. "Advancing openness in economic research through the lens of behavioral and experimental economics," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 119(C).
    3. Florian Jeserich & Constantin Klein & Benno Brinkhaus & Michael Teut, 2023. "Sense of coherence and religion/spirituality: A systematic review and meta-analysis based on a methodical classification of instruments measuring religion/spirituality," PLOS ONE, Public Library of Science, vol. 18(8), pages 1-43, August.
    4. Tom L. Dudda & Lars Hornuf, 2025. "The Perks and Perils of Machine Learning in Business and Economic Research," CESifo Working Paper Series 11721, CESifo.
    5. Sandeep Neela, 2026. "An Explainable Market Integrity Monitoring System with Multi-Source Attention Signals and Transparent Scoring," Papers 2601.15304, arXiv.org.
    6. repec:plo:pbio00:1002456 is not listed on IDEAS
    7. Julian N. Marewski & Ulrich Hoffrage, 2025. "Heuristics: how simple models of the mind can serve as tools for transparent scientific justification," Mind & Society: Cognitive Studies in Economics and Social Sciences, Springer;Fondazione Rosselli, vol. 24(2), pages 947-998, December.
    8. Weilun Wu & W. Robert Reed, 2025. "Meta-Analyses in Management and Marketing: An Assessment," Working Papers in Economics 25/16, University of Canterbury, Department of Economics and Finance.
    9. Wei Yu & Junpeng Chen & Sanhong Deng, 2024. "Open Science Under Debate: Disentangling the Interest on Twitter and Scholarly Research," SAGE Open, , vol. 14(3), pages 21582440241, August.
    10. Joëts, Marc & Mignon, Valérie, 2026. "Slaying the undead: How long does it take to kill zombie papers?," Research Policy, Elsevier, vol. 55(2).
    11. Konrad Turek, 2025. "Accelerating social science knowledge production with the coordinated open-source model," Quality & Quantity: International Journal of Methodology, Springer, vol. 59(2), pages 767-795, April.
    12. repec:osf:osfxxx:em9ua_v1 is not listed on IDEAS
    13. repec:osf:metaar:a8gu5_v1 is not listed on IDEAS
    14. Lohse, Johannes & Rahal, Rima-Maria & Schulte-Mecklenbeck, Michael & Sofianos, Andis & Wollbrant, Conny, 2024. "Investigations of decision processes at the intersection of psychology and economics," Journal of Economic Psychology, Elsevier, vol. 103(C).
    15. Oxley, Florence A.R. & Wilding, Kirsty & von Stumm, Sophie, 2024. "DNA and IQ: Big deal or much ado about nothing? – A meta-analysis," Intelligence, Elsevier, vol. 107(C).
    16. Janik Goltermann & Nils R. Winter & Susanne Meinert & Dominik Grotegerd & Anna Kraus & Kira Flinkenflügel & Luisa Altegoer & Judith Krieger & Elisabeth J. Leehr & Joscha Böhnlein & Linda M. Bonnekoh &, 2025. "Gray matter correlates of childhood maltreatment lack replicability in a multi-cohort brain-wide association study," Nature Communications, Nature, vol. 16(1), pages 1-14, December.
    17. Chin, Jason & Zeiler, Kathryn, 2021. "Replicability in Empirical Legal Research," LawRxiv 2b5k4, Center for Open Science.
    18. Henian Chen & Yayi Zhao & Biwei Cao & Donna J Petersen & Matthew J Valente & Weiliang Cen, 2024. "Breaking the silence of sharing data in medical research," PLOS ONE, Public Library of Science, vol. 19(5), pages 1-9, May.
    19. G. H. B. A. de Silva, 2025. "Data-Driven Framework for Aligning Artificial Intelligence with Inclusive Development in the Global South," Sustainability, MDPI, vol. 17(21), pages 1-20, October.
    20. David Moreau & Kristina Wiebels, 2024. "Nine quick tips for open meta-analyses," PLOS Computational Biology, Public Library of Science, vol. 20(7), pages 1-14, July.
    21. Balafoutas, Loukas & Celse, Jeremy & Karakostas, Alexandros & Umashev, Nicholas, 2025. "Incentives and the replication crisis in social sciences: A critical review of open science practices," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 114(C).
    22. Vanessa Tabea von Kortzfleisch & Oliver Ambrée & Natasha A Karp & Neele Meyer & Janja Novak & Rupert Palme & Marianna Rosso & Chadi Touma & Hanno Würbel & Sylvia Kaiser & Norbert Sachser & S Helene Ri, 2022. "Do multiple experimenters improve the reproducibility of animal studies?," PLOS Biology, Public Library of Science, vol. 20(5), pages 1-21, May.
    23. repec:osf:metaar:qkjy4_v1 is not listed on IDEAS
    24. Mussel, Patrick & Schäpers, Philipp & Schulte, Niklas & Hewig, Johannes & Krumm, Stefan, 2025. "The Short And Free Reasoning Ability assessmeNt (SAFRAN): Multidimensional, publicly available and only 15 minutes testing time," Intelligence, Elsevier, vol. 112(C).


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:metaar:7uskw_v1. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF (email available below). General contact details of provider: https://osf.io/preprints/metaarxiv .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.