
Willingness to Share Research Data Is Related to the Strength of the Evidence and the Quality of Reporting of Statistical Results

Author

Listed:
  • Jelte M Wicherts
  • Marjan Bakker
  • Dylan Molenaar

Abstract

Background: The widespread reluctance to share published research data is often hypothesized to be due to the authors' fear that reanalysis may expose errors in their work or may produce conclusions that contradict their own. However, these hypotheses have not previously been studied systematically.

Methods and Findings: We related the reluctance to share research data for reanalysis to 1148 statistically significant results reported in 49 papers published in two major psychology journals. We found the reluctance to share data to be associated with weaker evidence (against the null hypothesis of no effect) and a higher prevalence of apparent errors in the reporting of statistical results. The unwillingness to share data was particularly clear when reporting errors had a bearing on statistical significance.

Conclusions: Our findings on the basis of psychological papers suggest that statistical results are particularly hard to verify when reanalysis is more likely to lead to contrasting conclusions. This highlights the importance of establishing mandatory data archiving policies.
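The "apparent errors" referred to in the abstract are inconsistencies between a reported test statistic (with its degrees of freedom) and the reported p-value, which can be detected by recomputing the p-value. The sketch below is only an illustration of that kind of consistency check, not the authors' actual procedure; the function name, the rounding tolerance, and the alpha level are assumptions, and SciPy is assumed to be available.

    # Minimal sketch (illustrative, not the paper's code): recompute the p-value
    # implied by a reported t statistic and compare it with the reported p-value,
    # flagging inconsistencies, including those that would change significance.
    from scipy import stats

    def check_t_report(t_value, df, reported_p, alpha=0.05, tol=0.01):
        """Recompute the two-tailed p-value for a reported t(df) result."""
        recomputed_p = 2 * stats.t.sf(abs(t_value), df)
        inconsistent = abs(recomputed_p - reported_p) > tol
        # A gross error: reported and recomputed p fall on opposite sides of alpha.
        changes_significance = inconsistent and (
            (reported_p < alpha) != (recomputed_p < alpha)
        )
        return recomputed_p, inconsistent, changes_significance

    # Example: a result reported as t(28) = 2.20, p = .04
    print(check_t_report(2.20, 28, 0.04))

For a result reported as t(28) = 2.20, p = .04, the recomputed two-tailed p-value is roughly .036, so under this tolerance the report would not be flagged as inconsistent.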

Suggested Citation

  • Jelte M Wicherts & Marjan Bakker & Dylan Molenaar, 2011. "Willingness to Share Research Data Is Related to the Strength of the Evidence and the Quality of Reporting of Statistical Results," PLOS ONE, Public Library of Science, vol. 6(11), pages 1-7, November.
  • Handle: RePEc:plo:pone00:0026828
    DOI: 10.1371/journal.pone.0026828

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0026828
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0026828&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0026828?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    2. Strasak, Alexander M. & Zaman, Qamruz & Marinell, Gerhard & Pfeiffer, Karl P. & Ulmer, Hanno, 2007. "The Use of Statistics in Medical Research: A Comparison of The New England Journal of Medicine and Nature Medicine," The American Statistician, American Statistical Association, vol. 61, pages 47-55, February.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Coosje L S Veldkamp & Michèle B Nuijten & Linda Dominguez-Alvarez & Marcel A L M van Assen & Jelte M Wicherts, 2014. "Statistical Reporting Errors and Collaboration on Statistical Analyses in Psychological Science," PLOS ONE, Public Library of Science, vol. 9(12), pages 1-19, December.
    2. Benjamin D K Wood & Rui Müller & Annette N Brown, 2018. "Push button replication: Is impact evaluation evidence for international development verifiable?," PLOS ONE, Public Library of Science, vol. 13(12), pages 1-15, December.
    3. Keiko Kurata & Mamiko Matsubayashi & Shinji Mine, 2017. "Identifying the Complex Position of Research Data and Data Sharing Among Researchers in Natural Science," SAGE Open, , vol. 7(3), pages 21582440177, July.
    4. Antonia Krefeld-Schwalb & Benjamin Scheibehenne, 2023. "Tighter nets for smaller fishes? Mapping the development of statistical practices in consumer research between 2008 and 2020," Marketing Letters, Springer, vol. 34(3), pages 351-365, September.
    5. Michal Krawczyk & Ernesto Reuben, 2012. "(Un)Available upon Request: Field Experiment on Researchers' Willingness to Share Supplementary Materials," Natural Field Experiments 00689, The Field Experiments Website.
    6. Pfenninger, Stefan & DeCarolis, Joseph & Hirth, Lion & Quoilin, Sylvain & Staffell, Iain, 2017. "The importance of open data and software: Is energy research lagging behind?," Energy Policy, Elsevier, vol. 101(C), pages 211-215.
    7. Klaas Sijtsma, 2016. "Playing with Data—Or How to Discourage Questionable Research Practices and Stimulate Researchers to Do Things Right," Psychometrika, Springer;The Psychometric Society, vol. 81(1), pages 1-15, March.
    8. Kraft-Todd, Gordon T. & Rand, David G., 2021. "Practice what you preach: Credibility-enhancing displays and the growth of open science," Organizational Behavior and Human Decision Processes, Elsevier, vol. 164(C), pages 1-10.
    9. Dominique G Roche & Loeske E B Kruuk & Robert Lanfear & Sandra A Binning, 2015. "Public Data Archiving in Ecology and Evolution: How Well Are We Doing?," PLOS Biology, Public Library of Science, vol. 13(11), pages 1-12, November.
    10. Irwin D. Waldman & Scott O. Lilienfeld, 2016. "Thinking About Data, Research Methods, and Statistical Analyses: Commentary on Sijtsma’s (2014) “Playing with Data”," Psychometrika, Springer;The Psychometric Society, vol. 81(1), pages 16-26, March.
    11. Esther Maassen & Marcel A L M van Assen & Michèle B Nuijten & Anton Olsson-Collentine & Jelte M Wicherts, 2020. "Reproducibility of individual effect sizes in meta-analyses in psychology," PLOS ONE, Public Library of Science, vol. 15(5), pages 1-18, May.
    12. Giorgi, Francesca, 2021. "A new method to explore inferential risks associated with each study in a meta-analysis: An approach based on Design Analysis," Thesis Commons n5y8b, Center for Open Science.
    13. Peter Pütz & Stephan B. Bruns, 2021. "The (Non‐)Significance Of Reporting Errors In Economics: Evidence From Three Top Journals," Journal of Economic Surveys, Wiley Blackwell, vol. 35(1), pages 348-373, February.
    14. Aguinis, Herman & Banks, George C. & Rogelberg, Steven G. & Cascio, Wayne F., 2020. "Actionable recommendations for narrowing the science-practice gap in open science," Organizational Behavior and Human Decision Processes, Elsevier, vol. 158(C), pages 27-35.
    15. Hensel, Przemysław G., 2021. "Reproducibility and replicability crisis: How management compares to psychology and economics – A systematic review of literature," European Management Journal, Elsevier, vol. 39(5), pages 577-594.
    16. Ryan P Womack, 2015. "Research Data in Core Journals in Biology, Chemistry, Mathematics, and Physics," PLOS ONE, Public Library of Science, vol. 10(12), pages 1-22, December.
    17. Wicherts, Jelte M. & Veldkamp, Coosje Lisabet Sterre & Augusteijn, Hilde & Bakker, Marjan & van Aert, Robbie Cornelis Maria & van Assen, Marcel A. L. M., 2016. "Degrees of freedom in planning, running, analyzing, and reporting psychological studies: A checklist to avoid p-hacking," OSF Preprints umq8d, Center for Open Science.
    18. Bennett Kleinberg & Bruno Verschuere, 2015. "Memory Detection 2.0: The First Web-Based Memory Detection Test," PLOS ONE, Public Library of Science, vol. 10(4), pages 1-17, April.
    19. Yulin Yu & Daniel M. Romero, 2024. "Does the Use of Unusual Combinations of Datasets Contribute to Greater Scientific Impact?," Papers 2402.05024, arXiv.org, revised Feb 2024.
    20. Matteo Colombo & Georgi Duev & Michèle B Nuijten & Jan Sprenger, 2018. "Statistical reporting inconsistencies in experimental philosophy," PLOS ONE, Public Library of Science, vol. 13(4), pages 1-12, April.
    21. Colombo, Matteo & Duev, Georgi & Nuijten, M.B. & Sprenger, Jan, 2018. "Statistical reporting inconsistencies in experimental philosophy," Other publications TiSEM 075f5696-ae1a-4aae-9e17-c, Tilburg University, School of Economics and Management.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Tracey L Weissgerber & Vesna D Garovic & Jelena S Milin-Lazovic & Stacey J Winham & Zoran Obradovic & Jerome P Trzeciakowski & Natasa M Milic, 2016. "Reinventing Biostatistics Education for Basic Scientists," PLOS Biology, Public Library of Science, vol. 14(4), pages 1-12, April.
    2. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    3. Jyotirmoy Sarkar, 2018. "Will P-Value Triumph over Abuses and Attacks?," Biostatistics and Biometrics Open Access Journal, Juniper Publishers Inc., vol. 7(4), pages 66-71, July.
    4. Stanley, T. D. & Doucouliagos, Chris, 2019. "Practical Significance, Meta-Analysis and the Credibility of Economics," IZA Discussion Papers 12458, Institute of Labor Economics (IZA).
    5. Karin Langenkamp & Bodo Rödel & Kerstin Taufenbach & Meike Weiland, 2018. "Open Access in Vocational Education and Training Research," Publications, MDPI, vol. 6(3), pages 1-12, July.
    6. Kevin J. Boyle & Mark Morrison & Darla Hatton MacDonald & Roderick Duncan & John Rose, 2016. "Investigating Internet and Mail Implementation of Stated-Preference Surveys While Controlling for Differences in Sample Frames," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 64(3), pages 401-419, July.
    7. Valentine, Kathrene D & Buchanan, Erin Michelle & Scofield, John E. & Beauchamp, Marshall T., 2017. "Beyond p-values: Utilizing Multiple Estimates to Evaluate Evidence," OSF Preprints 9hp7y, Center for Open Science.
    8. Anton, Roman, 2014. "Sustainable Intrapreneurship - The GSI Concept and Strategy - Unfolding Competitive Advantage via Fair Entrepreneurship," MPRA Paper 69713, University Library of Munich, Germany, revised 01 Feb 2015.
    9. Dudek, Thomas & Brenøe, Anne Ardila & Feld, Jan & Rohrer, Julia, 2022. "No Evidence That Siblings' Gender Affects Personality across Nine Countries," IZA Discussion Papers 15137, Institute of Labor Economics (IZA).
    10. Uwe Hassler & Marc‐Oliver Pohle, 2022. "Unlucky Number 13? Manipulating Evidence Subject to Snooping," International Statistical Review, International Statistical Institute, vol. 90(2), pages 397-410, August.
    11. Frederique Bordignon, 2020. "Self-correction of science: a comparative study of negative citations and post-publication peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(2), pages 1225-1239, August.
    12. Omar Al-Ubaydli & John A. List, 2015. "Do Natural Field Experiments Afford Researchers More or Less Control than Laboratory Experiments? A Simple Model," NBER Working Papers 20877, National Bureau of Economic Research, Inc.
    13. Aurelie Seguin & Wolfgang Forstmeier, 2012. "No Band Color Effects on Male Courtship Rate or Body Mass in the Zebra Finch: Four Experiments and a Meta-Analysis," PLOS ONE, Public Library of Science, vol. 7(6), pages 1-11, June.
    14. Ankur Moitra & Dhruv Rohatgi, 2022. "Provably Auditing Ordinary Least Squares in Low Dimensions," Papers 2205.14284, arXiv.org, revised Jun 2022.
    15. Dragana Radicic & Geoffrey Pugh & Hugo Hollanders & René Wintjes & Jon Fairburn, 2016. "The impact of innovation support programs on small and medium enterprises innovation in traditional manufacturing industries: An evaluation for seven European Union regions," Environment and Planning C, , vol. 34(8), pages 1425-1452, December.
    16. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick, 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    17. Li, Lunzheng & Maniadis, Zacharias & Sedikides, Constantine, 2021. "Anchoring in Economics: A Meta-Analysis of Studies on Willingness-To-Pay and Willingness-To-Accept," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 90(C).
    18. Eric van Diessen & Willemiek J E M Zweiphenning & Floor E Jansen & Cornelis J Stam & Kees P J Braun & Willem M Otte, 2014. "Brain Network Organization in Focal Epilepsy: A Systematic Review and Meta-Analysis," PLOS ONE, Public Library of Science, vol. 9(12), pages 1-21, December.
    19. Charles F. Manski, 2018. "Reasonable patient care under uncertainty," Health Economics, John Wiley & Sons, Ltd., vol. 27(10), pages 1397-1421, October.
    20. Kathryn Oliver & Annette Boaz, 2019. "Transforming evidence for policy and practice: creating space for new conversations," Palgrave Communications, Palgrave Macmillan, vol. 5(1), pages 1-10, December.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0026828. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows your profile to be linked to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.