
The Researchers’ View of Scientific Rigor—Survey on the Conduct and Reporting of In Vivo Research

Authors

Listed:
  • Thomas S Reichlin
  • Lucile Vogt
  • Hanno Würbel

Abstract

Reproducibility in animal research is alarmingly low, and a lack of scientific rigor has been proposed as a major cause. Systematic reviews have found low reporting rates of measures against risks of bias (e.g., randomization, blinding), as well as a correlation between low reporting rates and overstated treatment effects. Reporting rates of measures against bias are thus used as a proxy for scientific rigor, and reporting guidelines (e.g., ARRIVE) have become a major weapon in the fight against risks of bias in animal research. Surprisingly, animal scientists have never been asked about their use of measures against risks of bias and how they report these in publications. Whether poor reporting reflects poor use of such measures, and whether reporting guidelines may effectively reduce risks of bias, has therefore remained elusive. To address these questions, we asked in vivo researchers about their use and reporting of measures against risks of bias and examined how their self-reports relate to reporting rates obtained through systematic reviews. An online survey was sent to all registered in vivo researchers in Switzerland (N = 1891) and was complemented by personal interviews with five representative in vivo researchers to facilitate interpretation of the survey results. The return rate was 28% (N = 530); 302 participants (16% of those invited) returned fully completed questionnaires, which were used for further analysis. According to their self-reports, researchers use measures against risks of bias to a much greater extent than suggested by reporting rates obtained through systematic reviews. However, these self-reports are likely biased to some extent: although the researchers claimed to report measures against risks of bias at much lower rates than they claimed to use them, even their self-reported reporting rates were considerably higher than the reporting rates found by systematic reviews. Furthermore, participants performed rather poorly when asked to choose effective over ineffective measures against six different biases. Our results also indicate that knowledge of the ARRIVE guidelines had a positive effect on scientific rigor. However, the ARRIVE guidelines were known to less than half of the participants (43.7%), and among those whose latest paper was published in a journal that had endorsed the ARRIVE guidelines, more than half (51%) had never heard of these guidelines. Our results suggest that whereas reporting rates may underestimate the true use of measures against risks of bias, self-reports may overestimate it. To a large extent, this discrepancy can be explained by the researchers' limited knowledge of risks of bias and of the measures that prevent them. Our analysis thus adds significant new evidence to the assessment of research integrity in animal research. Our findings question the confidence that the authorities place in scientific rigor, which is taken for granted in the harm-benefit analyses on which approval of animal experiments is based, and they suggest that better education on scientific integrity and good research practice is needed. They also call into question reliance on reporting rates as indicators of scientific rigor and highlight the need for more reliable predictors.
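
As a quick check of the sample figures quoted in the abstract (an inference made explicit here: both percentages are computed against the full invited sample of 1891, since 302 of the 530 returned questionnaires would be roughly 57%):

    \[
    \frac{530}{1891} \approx 0.28 \;(28\%), \qquad
    \frac{302}{1891} \approx 0.16 \;(16\%), \qquad
    \frac{302}{530} \approx 0.57.
    \]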

Suggested Citation

  • Thomas S Reichlin & Lucile Vogt & Hanno Würbel, 2016. "The Researchers’ View of Scientific Rigor—Survey on the Conduct and Reporting of In Vivo Research," PLOS ONE, Public Library of Science, vol. 11(12), pages 1-20, December.
  • Handle: RePEc:plo:pone00:0165999
    DOI: 10.1371/journal.pone.0165999

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0165999
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0165999&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0165999?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    References listed on IDEAS

    1. C. Glenn Begley & Lee M. Ellis, 2012. "Raise standards for preclinical cancer research," Nature, Nature, vol. 483(7391), pages 531-533, March.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one; a minimal sketch of this similarity heuristic follows the list below.
    1. Hussinger, Katrin & Pellens, Maikel, 2019. "Guilt by association: How scientific misconduct harms prior collaborators," Research Policy, Elsevier, vol. 48(2), pages 516-530.
    2. Andreoli-Versbach, Patrick & Mueller-Langer, Frank, 2014. "Open access to data: An ideal professed but not practised," Research Policy, Elsevier, vol. 43(9), pages 1621-1633.
    3. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick et al., 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    4. Peter Harremoës, 2019. "Replication Papers," Publications, MDPI, vol. 7(3), pages 1-8, July.
    5. Bettina Bert & Céline Heinl & Justyna Chmielewska & Franziska Schwarz & Barbara Grune & Andreas Hensel & Matthias Greiner & Gilbert Schönfelder, 2019. "Refining animal research: The Animal Study Registry," PLOS Biology, Public Library of Science, vol. 17(10), pages 1-12, October.
    6. Mark J. McCabe & Frank Mueller-Langer, 2019. "Does Data Disclosure Increase Citations? Empirical Evidence from a Natural Experiment in Leading Economics Journals," JRC Working Papers on Digital Economy 2019-02, Joint Research Centre.
    7. Nathalie Percie du Sert & Viki Hurst & Amrita Ahluwalia & Sabina Alam & Marc T Avey & Monya Baker & William J Browne & Alejandra Clark & Innes C Cuthill & Ulrich Dirnagl & Michael Emerson & Paul Garner et al., 2020. "The ARRIVE guidelines 2.0: Updated guidelines for reporting animal research," PLOS Biology, Public Library of Science, vol. 18(7), pages 1-12, July.
    8. Hajko, Vladimír, 2017. "The failure of Energy-Economy Nexus: A meta-analysis of 104 studies," Energy, Elsevier, vol. 125(C), pages 771-787.
    9. Oliver Braganza, 2020. "A simple model suggesting economically rational sample-size choice drives irreproducibility," PLOS ONE, Public Library of Science, vol. 15(3), pages 1-19, March.
    10. Felix Holzmeister & Magnus Johannesson & Colin F. Camerer & Yiling Chen & Teck-Hua Ho & Suzanne Hoogeveen & Juergen Huber & Noriko Imai & Taisuke Imai & Lawrence Jin & Michael Kirchler & Alexander Ly et al., 2025. "Examining the replicability of online experiments selected by a decision market," Nature Human Behaviour, Nature, vol. 9(2), pages 316-330, February.
    11. Stylianos Serghiou & Despina G Contopoulos-Ioannidis & Kevin W Boyack & Nico Riedel & Joshua D Wallach & John P A Ioannidis, 2021. "Assessment of transparency indicators across the biomedical literature: How open is open?," PLOS Biology, Public Library of Science, vol. 19(3), pages 1-26, March.
    12. Marlo M Vernon & E Andrew Balas & Shaher Momani, 2018. "Are university rankings useful to improve research? A systematic review," PLOS ONE, Public Library of Science, vol. 13(3), pages 1-15, March.
    13. Kiri, Bralind & Lacetera, Nicola & Zirulia, Lorenzo, 2018. "Above a swamp: A theory of high-quality scientific production," Research Policy, Elsevier, vol. 47(5), pages 827-839.
    14. Mueller-Langer, Frank & Fecher, Benedikt & Harhoff, Dietmar & Wagner, Gert G., 2019. "Replication studies in economics—How many and which papers are chosen for replication, and why?," Research Policy, Elsevier, vol. 48(1), pages 62-83.
    15. Malika Ihle & Isabel S. Winney & Anna Krystalli & Michael Croucher, 2017. "Striving for transparent and credible research: practical guidelines for behavioral ecologists," Behavioral Ecology, International Society for Behavioral Ecology, vol. 28(2), pages 348-354.
    16. Baltussen, Guido & Swinkels, Laurens & Van Vliet, Pim, 2021. "Global factor premiums," Journal of Financial Economics, Elsevier, vol. 142(3), pages 1128-1154.
    17. Owen, P. Dorian, 2018. "Replication to assess statistical adequacy," Economics - The Open-Access, Open-Assessment E-Journal (2007-2020), Kiel Institute for the World Economy (IfW Kiel), vol. 12, pages 1-16.
    18. Bernhard Voelkl & Lucile Vogt & Emily S Sena & Hanno Würbel, 2018. "Reproducibility of preclinical animal research improves with heterogeneity of study samples," PLOS Biology, Public Library of Science, vol. 16(2), pages 1-13, February.
    19. Ryan C Kennedy & Meir Marmor & Ralph Marcucio & C Anthony Hunt, 2018. "Simulation enabled search for explanatory mechanisms of the fracture healing process," PLOS Computational Biology, Public Library of Science, vol. 14(2), pages 1-32, February.
    20. Fecher, Benedikt & Fräßdorf, Mathis & Hebing, Marcel & Wagner, Gert G., 2017. "Replikationen, Reputation und gute wissenschaftliche Praxis," EconStor Open Access Articles and Book Chapters, ZBW - Leibniz Information Centre for Economics, vol. 68(2-3), pages 154-158.
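
    To make the relatedness heuristic above concrete, here is a minimal Python sketch (a hypothetical illustration with made-up identifiers, not RePEc's actual implementation) that scores two items by the number of works they both cite plus the number of works that cite them both:

        # Hypothetical sketch of the "most related items" heuristic:
        # two items are related when they cite the same works
        # (bibliographic coupling) and are cited by the same works
        # (co-citation). Illustration only, not RePEc's code.
        def relatedness(refs_a, refs_b, citers_a, citers_b):
            """Count works cited by both items plus works citing both items."""
            return len(refs_a & refs_b) + len(citers_a & citers_b)

        # Toy usage: items "A" and "B" share one reference and one citing work.
        refs = {"A": {"begley2012", "arrive2010"},
                "B": {"begley2012", "ioannidis2005"}}
        citers = {"A": {"voelkl2018", "bert2019"},
                  "B": {"voelkl2018"}}
        print(relatedness(refs["A"], refs["B"], citers["A"], citers["B"]))  # 2

    Under this sketch, the items listed above would be those scoring highest against the current article.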


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0165999. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact plosone. General contact details of the provider: https://journals.plos.org/plosone/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.