Printed from https://ideas.repec.org/a/nas/journl/v115y2018p2584-2589.html

An empirical analysis of journal policy effectiveness for computational reproducibility

Author

Listed:
  • Victoria Stodden

    (School of Information Sciences, University of Illinois at Urbana–Champaign, Champaign, IL 61820)

  • Jennifer Seiler

    (Department of Statistics, Columbia University, New York, NY 10027)

  • Zhaokun Ma

    (Department of Statistics, Columbia University, New York, NY 10027)

Abstract

A key component of scientific communication is providing sufficient information for other researchers in the field to reproduce published findings. For computational and data-enabled research, this has often been interpreted to mean making available the raw data from which results were generated, the computer code that generated the findings, and any additional information needed, such as workflows and input parameters. Many journals are revising author guidelines to include data and code availability. This work evaluates the effectiveness of a journal policy that requires the data and code necessary for reproducibility to be made available postpublication by the authors upon request. We assess the effectiveness of such a policy by (i) requesting data and code from authors and (ii) attempting replication of the published findings. We chose a random sample of 204 scientific papers published in the journal Science after the implementation of their policy in February 2011. We found that we were able to obtain artifacts from 44% of our sample and were able to reproduce the findings for 26%. We find this policy—author remission of data and code postpublication upon request—an improvement over no policy, but currently insufficient for reproducibility.

Suggested Citation

  • Victoria Stodden & Jennifer Seiler & Zhaokun Ma, 2018. "An empirical analysis of journal policy effectiveness for computational reproducibility," Proceedings of the National Academy of Sciences, vol. 115(11), pages 2584-2589, March.
  • Handle: RePEc:nas:journl:v:115:y:2018:p:2584-2589

    Download full text from publisher

    File URL: http://www.pnas.org/content/115/11/2584.full
    Download Restriction: no

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.

    Cited by:

    1. Thu-Mai Christian & Amanda Gooch & Todd Vision & Elizabeth Hull, 2020. "Journal data policies: Exploring how the understanding of editors and authors corresponds to the policies themselves," PLOS ONE, Public Library of Science, vol. 15(3), pages 1-15, March.
    2. Christophe Hurlin & Christophe Pérignon, 2020. "Reproducibility Certification in Economics Research," Working Papers hal-02896404, HAL.
    3. Ruiz-Benito, Paloma & Vacchiano, Giorgio & Lines, Emily R. & Reyer, Christopher P.O. & Ratcliffe, Sophia & Morin, Xavier & Hartig, Florian & Mäkelä, Annikki & Yousefpour, Rasoul & Chaves, Jimena E., et al., 2020. "Available and missing data to model impact of climate change on European forests," Ecological Modelling, Elsevier, vol. 416(C).
    4. Ana Trisovic & Katherine Mika & Ceilyn Boyd & Sebastian Feger & Mercè Crosas, 2021. "Repository Approaches to Improving the Quality of Shared Data and Code," Data, MDPI, vol. 6(2), pages 1-12, February.
    5. Shawn J Leroux, 2019. "On the prevalence of uninformative parameters in statistical models applying model selection in applied ecology," PLOS ONE, Public Library of Science, vol. 14(2), pages 1-12, February.
    6. Heidi Seibold & Severin Czerny & Siona Decke & Roman Dieterle & Thomas Eder & Steffen Fohr & Nico Hahn & Rabea Hartmann & Christoph Heindl & Philipp Kopper & Dario Lepke & Verena Loidl & Maximilian Ma, 2021. "A computational reproducibility study of PLOS ONE articles featuring longitudinal data analyses," PLOS ONE, Public Library of Science, vol. 16(6), pages 1-15, June.
    7. Antonio Páez, 2021. "Open spatial sciences: an introduction," Journal of Geographical Systems, Springer, vol. 23(4), pages 467-476, October.
    8. Felix Holzmeister & Magnus Johannesson & Robert Böhm & Anna Dreber & Jürgen Huber & Michael Kirchler, 2023. "Heterogeneity in effect size estimates: Empirical evidence and practical implications," Working Papers 2023-17, Faculty of Economics and Statistics, Universität Innsbruck.
    9. Sara Stoudt & Váleri N Vásquez & Ciera C Martinez, 2021. "Principles for data analysis workflows," PLOS Computational Biology, Public Library of Science, vol. 17(3), pages 1-26, March.
    10. Jessica L Couture & Rachael E Blake & Gavin McDonald & Colette L Ward, 2018. "A funder-imposed data publication requirement seldom inspired data sharing," PLOS ONE, Public Library of Science, vol. 13(7), pages 1-13, July.
    11. Schweinsberg, Martin & Feldman, Michael & Staub, Nicola & van den Akker, Olmo R. & van Aert, Robbie C.M. & van Assen, Marcel A.L.M. & Liu, Yang & Althoff, Tim & Heer, Jeffrey & Kale, Alex & Mohamed, Z, 2021. "Same data, different conclusions: Radical dispersion in empirical results when independent analysts operationalize and test the same hypothesis," Organizational Behavior and Human Decision Processes, Elsevier, vol. 165(C), pages 228-249.
    12. Nikolas I Krieger & Adam T Perzynski & Jarrod E Dalton, 2019. "Facilitating reproducible project management and manuscript development in team science: The projects R package," PLOS ONE, Public Library of Science, vol. 14(7), pages 1-9, July.
    13. Rat für Sozial- und Wirtschaftsdaten RatSWD (ed.), 2023. "Erhebung und Nutzung unstrukturierter Daten in den Sozial-, Verhaltens- und Wirtschaftswissenschaften" [Collection and Use of Unstructured Data in the Social, Behavioral, and Economic Sciences], RatSWD Output Series, German Data Forum (RatSWD), volume 7, number 7-2de.
    14. Vlaeminck, Sven, 2021. "Dawning of a New Age? Economics Journals’ Data Policies on the Test Bench," EconStor Open Access Articles and Book Chapters, ZBW - Leibniz Information Centre for Economics, vol. 31(1), pages 1-29.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nas:journl:v:115:y:2018:p:2584-2589. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Eric Cain (email available below). General contact details of provider: http://www.pnas.org/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.