
Animal Study Registries: Results from a Stakeholder Analysis on Potential Strengths, Weaknesses, Facilitators, and Barriers

Authors

  • Susanne Wieschowski
  • Diego S Silva
  • Daniel Strech

Abstract

Publication bias in animal research, its extent, its predictors, and its potential countermeasures are increasingly discussed. Recent reports and conferences highlight the potential strengths of animal study registries (ASRs) in this regard. Others have warned that prospective registration of animal studies could diminish creativity, add administrative burdens, and complicate intellectual property issues in translational research. A literature review and 21 international key-informant interviews were conducted and thematically analyzed to develop a comprehensive matrix of main- and subcategories for potential ASR-related strengths, weaknesses, facilitators, and barriers (SWFBs). We identified 130 potential SWFBs. All stakeholder groups agreed that ASRs could in various ways improve the quality and refinement of animal studies while allowing their number to be reduced, as well as supporting meta-research on animal studies. However, all stakeholder groups also highlighted the potential for theft of ideas, higher administrative burdens, and reduced creativity and serendipity in animal studies. Much more detailed reasoning was captured in the interviews than is currently found in the literature, providing a comprehensive account of the issues and arguments around ASRs. All stakeholder groups highlighted compelling potential strengths of ASRs. Although substantial weaknesses and implementation barriers were highlighted as well, different governance measures might help to minimize or even eliminate their impact. Such measures might include confidentiality time frames for accessing prospectively registered protocols, harmonized reporting requirements across ASRs, ethics reviews, lab notebooks, and journal submissions. 
The comprehensive information gathered in this study could help to guide a more evidence-based debate and to design pilot tests for ASRs.

Author Summary

The many contributions in recent years on “publication bias” and the “reproducibility crisis” in animal research have initiated a debate on whether and how prospective animal study registries (ASRs) should be established, analogous to clinical trial registries. Recent debate, however, has followed rather broad lines of argumentation and concluded that future decision-making on ASRs depends strongly on better knowledge about their relevant characteristics and about conflicting stakeholder interests. More qualitative but systematically developed evidence is needed in this regard. The primary objective of this study, therefore, was to present a systematically derived spectrum of all relevant strengths, weaknesses, facilitators, and barriers (SWFBs) for ASRs. A systematic literature review and 21 key-informant interviews with experts from preclinical and clinical research, industry, and regulatory bodies were conducted to this end. Our investigation resulted in a comprehensive and structured account of 130 issues and arguments around ASRs. Future debate and decision-making on ASRs might otherwise be heavily influenced by the arguments and reasoning of individual experts and thus result in “eminence-based” policy making that relies on expert opinion. This study’s comprehensive spectrum of arguments and issues around ASRs, developed through systematic and transparent methods, helps to balance the ongoing debate and thus facilitates more evidence-based policy making.

Suggested Citation

  • Susanne Wieschowski & Diego S Silva & Daniel Strech, 2016. "Animal Study Registries: Results from a Stakeholder Analysis on Potential Strengths, Weaknesses, Facilitators, and Barriers," PLOS Biology, Public Library of Science, vol. 14(11), pages 1-12, November.
  • Handle: RePEc:plo:pbio00:2000391
    DOI: 10.1371/journal.pbio.2000391

    Download full text from publisher

    File URL: https://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.2000391
    Download Restriction: no

    File URL: https://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.2000391&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pbio.2000391?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. H Bart van der Worp & David W Howells & Emily S Sena & Michelle J Porritt & Sarah Rewell & Victoria O'Collins & Malcolm R Macleod, 2010. "Can Animal Models of Disease Reliably Inform Human Studies?," PLOS Medicine, Public Library of Science, vol. 7(3), pages 1-8, March.
    2. Emily S Sena & H Bart van der Worp & Philip M W Bath & David W Howells & Malcolm R Macleod, 2010. "Publication Bias in Reports of Animal Stroke Studies Leads to Major Overstatement of Efficacy," PLOS Biology, Public Library of Science, vol. 8(3), pages 1-8, March.
    3. Konstantinos K Tsilidis & Orestis A Panagiotou & Emily S Sena & Eleni Aretouli & Evangelos Evangelou & David W Howells & Rustam Al-Shahi Salman & Malcolm R Macleod & John P A Ioannidis, 2013. "Evaluation of Excess Significance Bias in Animal Studies of Neurological Diseases," PLOS Biology, Public Library of Science, vol. 11(7), pages 1-10, July.
    4. C. Glenn Begley & Lee M. Ellis, 2012. "Raise standards for preclinical cancer research," Nature, Nature, vol. 483(7391), pages 531-533, March.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Bettina Bert & Céline Heinl & Justyna Chmielewska & Franziska Schwarz & Barbara Grune & Andreas Hensel & Matthias Greiner & Gilbert Schönfelder, 2019. "Refining animal research: The Animal Study Registry," PLOS Biology, Public Library of Science, vol. 17(10), pages 1-12, October.
    2. Susanne Wieschowski & Hans Laser & Emily S Sena & André Bleich & René Tolba & Daniel Strech, 2020. "Attitudes towards animal study registries and their characteristics: An online survey of three cohorts of animal researchers," PLOS ONE, Public Library of Science, vol. 15(1), pages 1-15, January.
    3. Faust, Alice & Woydack, Lena & Strech, Daniel, 2023. "Should the governance of individual treatment attempts (“Individuelle Heilversuche”) include praxis evaluation? Results from qualitative stakeholder interviews," Health Policy, Elsevier, vol. 130(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Constance Holman & Sophie K Piper & Ulrike Grittner & Andreas Antonios Diamantaras & Jonathan Kimmelman & Bob Siegerink & Ulrich Dirnagl, 2016. "Where Have All the Rodents Gone? The Effects of Attrition in Experimental Research on Cancer and Stroke," PLOS Biology, Public Library of Science, vol. 14(1), pages 1-12, January.
    2. repec:plo:pbio00:1001757 is not listed on IDEAS
    3. repec:plo:pbio00:1000413 is not listed on IDEAS
    4. repec:plo:pmed00:1001489 is not listed on IDEAS
    5. Konstantinos K Tsilidis & Orestis A Panagiotou & Emily S Sena & Eleni Aretouli & Evangelos Evangelou & David W Howells & Rustam Al-Shahi Salman & Malcolm R Macleod & John P A Ioannidis, 2013. "Evaluation of Excess Significance Bias in Animal Studies of Neurological Diseases," PLOS Biology, Public Library of Science, vol. 11(7), pages 1-10, July.
    6. Nathalie Percie du Sert & Viki Hurst & Amrita Ahluwalia & Sabina Alam & Marc T Avey & Monya Baker & William J Browne & Alejandra Clark & Innes C Cuthill & Ulrich Dirnagl & Michael Emerson & Paul Garne, 2020. "The ARRIVE guidelines 2.0: Updated guidelines for reporting animal research," PLOS Biology, Public Library of Science, vol. 18(7), pages 1-12, July.
    7. Carol Kilkenny & William J Browne & Innes C Cuthill & Michael Emerson & Douglas G Altman, 2010. "Improving Bioscience Research Reporting: The ARRIVE Guidelines for Reporting Animal Research," PLOS Biology, Public Library of Science, vol. 8(6), pages 1-5, June.
    8. Oliver Braganza, 2020. "A simple model suggesting economically rational sample-size choice drives irreproducibility," PLOS ONE, Public Library of Science, vol. 15(3), pages 1-19, March.
    9. repec:plo:pbio00:1001716 is not listed on IDEAS
    10. repec:plo:pbio00:2000598 is not listed on IDEAS
    11. Bernhard Voelkl & Lucile Vogt & Emily S Sena & Hanno Würbel, 2018. "Reproducibility of preclinical animal research improves with heterogeneity of study samples," PLOS Biology, Public Library of Science, vol. 16(2), pages 1-13, February.
    12. repec:plo:pbio00:1001863 is not listed on IDEAS
    13. repec:plo:pone00:0106108 is not listed on IDEAS
    14. repec:plo:pmed00:1001010 is not listed on IDEAS
    15. repec:plo:pbio00:1002273 is not listed on IDEAS
    16. Leonard P Freedman & Iain M Cockburn & Timothy S Simcoe, 2015. "The Economics of Reproducibility in Preclinical Research," PLOS Biology, Public Library of Science, vol. 13(6), pages 1-9, June.
    17. Mueller-Langer, Frank & Fecher, Benedikt & Harhoff, Dietmar & Wagner, Gert G., 2019. "Replication studies in economics—How many and which papers are chosen for replication, and why?," EconStor Open Access Articles and Book Chapters, ZBW - Leibniz Information Centre for Economics, vol. 48(1), pages 62-83.
    18. Hussinger, Katrin & Pellens, Maikel, 2019. "Guilt by association: How scientific misconduct harms prior collaborators," Research Policy, Elsevier, vol. 48(2), pages 516-530.
    19. Lauralyn A McIntyre & David Moher & Dean A Fergusson & Katrina J Sullivan & Shirley H J Mei & Manoj Lalu & John Marshall & Malcolm Mcleod & Gilly Griffin & Jeremy Grimshaw & Alexis Turgeon & Marc T Av, 2016. "Efficacy of Mesenchymal Stromal Cell Therapy for Acute Lung Injury in Preclinical Animal Models: A Systematic Review," PLOS ONE, Public Library of Science, vol. 11(1), pages 1-16, January.
    20. Yan Li & Xiang Zhou & Rui Chen & Xianyang Zhang & Hongyuan Cao, 2024. "STAREG: Statistical replicability analysis of high throughput experiments with applications to spatial transcriptomic studies," PLOS Genetics, Public Library of Science, vol. 20(10), pages 1-19, October.
    21. Andreoli-Versbach, Patrick & Mueller-Langer, Frank, 2014. "Open access to data: An ideal professed but not practised," Research Policy, Elsevier, vol. 43(9), pages 1621-1633.
    22. Kimberley E Wever & Carlijn R Hooijmans & Niels P Riksen & Thomas B Sterenborg & Emily S Sena & Merel Ritskes-Hoitinga & Michiel C Warlé, 2015. "Determinants of the Efficacy of Cardiac Ischemic Preconditioning: A Systematic Review and Meta-Analysis of Animal Studies," PLOS ONE, Public Library of Science, vol. 10(11), pages 1-17, November.
    23. Coupé, Tom & Reed, W. Robert & Zimmermann, Christian, 2023. "Getting seen: Results from an online experiment to draw more attention to replications," Research Policy, Elsevier, vol. 52(8).
    24. Joanna Chataway & Sarah Parks & Elta Smith, 2017. "How Will Open Science Impact on University-Industry Collaboration?," Foresight and STI Governance, National Research University Higher School of Economics, vol. 11(2), pages 44-53.
    25. Robyn M. Lucas & Rachael M. Rodney Harris, 2018. "On the Nature of Evidence and ‘Proving’ Causality: Smoking and Lung Cancer vs. Sun Exposure, Vitamin D and Multiple Sclerosis," IJERPH, MDPI, vol. 15(8), pages 1-13, August.
    26. van Aert, Robbie Cornelis Maria, 2018. "Dissertation R.C.M. van Aert," MetaArXiv eqhjd, Center for Open Science.
    27. Peter Harremoës, 2019. "Replication Papers," Publications, MDPI, vol. 7(3), pages 1-8, July.
    28. Bettina Bert & Céline Heinl & Justyna Chmielewska & Franziska Schwarz & Barbara Grune & Andreas Hensel & Matthias Greiner & Gilbert Schönfelder, 2019. "Refining animal research: The Animal Study Registry," PLOS Biology, Public Library of Science, vol. 17(10), pages 1-12, October.
    29. Mark J. McCabe & Frank Mueller-Langer, 2019. "Does Data Disclosure Increase Citations? Empirical Evidence from a Natural Experiment in Leading Economics Journals," JRC Working Papers on Digital Economy 2019-02, Joint Research Centre.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pbio00:2000391. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the “citations” tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosbiology (email available below). General contact details of provider: https://journals.plos.org/plosbiology/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.