IDEAS home Printed from https://ideas.repec.org/a/plo/pone00/0287660.html

Knowledge and motivations of training in peer review: An international cross-sectional survey

Author

Listed:
  • Jessie V Willis
  • Janina Ramos
  • Kelly D Cobey
  • Jeremy Y Ng
  • Hassan Khan
  • Marc A Albert
  • Mohsen Alayche
  • David Moher

Abstract

Background: Despite having a crucial role in scholarly publishing, peer reviewers typically receive no training. The purpose of this study was to conduct an international survey of researchers' current perceptions of, and motivations regarding, peer review training.

Methods: A cross-sectional online survey of biomedical researchers was conducted. A total of 2000 corresponding authors from 100 randomly selected medical journals were invited via email. Quantitative items were reported using frequencies and percentages or means and standard errors, as appropriate. For qualitative items, a thematic content analysis was conducted in which two researchers independently assigned codes to the responses to each written-text question and subsequently grouped the codes into themes. A descriptive definition of each category was then created, and unique themes, as well as the number and frequency of codes within each theme, were reported.

Results: A total of 186 participants completed the survey, of whom 14 were excluded. The majority of participants indicated they were men (n = 97 of 170, 57.1%), independent researchers (n = 108 of 172, 62.8%), and primarily affiliated with an academic organization (n = 103 of 170, 62.8%). A total of 144 of 171 participants (84.2%) indicated they had never received formal training in peer review. Most participants (n = 128, 75.7%) agreed, of whom 41 (32.0%) agreed strongly, that peer reviewers should receive formal training in peer review before acting as a peer reviewer. The most preferred training formats were online courses, online lectures, and online modules. Most respondents (n = 111 of 147, 75.5%) stated that difficulty finding and/or accessing training was a barrier to completing training in peer review.

Conclusion: Although such training is desired, most biomedical researchers have not received formal training in peer review and indicated that training was difficult to access or not available.

Suggested Citation

  • Jessie V Willis & Janina Ramos & Kelly D Cobey & Jeremy Y Ng & Hassan Khan & Marc A Albert & Mohsen Alayche & David Moher, 2023. "Knowledge and motivations of training in peer review: An international cross-sectional survey," PLOS ONE, Public Library of Science, vol. 18(7), pages 1-14, July.
  • Handle: RePEc:plo:pone00:0287660
    DOI: 10.1371/journal.pone.0287660

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0287660
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0287660&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0287660?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Michaela Strinzel & Josh Brown & Wolfgang Kaltenbrunner & Sarah Rijcke & Michael Hill, 2021. "Ten ways to improve academic CVs for fairer research assessment," Palgrave Communications, Palgrave Macmillan, vol. 8(1), pages 1-4, December.
    2. Isidore Komla Zotoo & Guifeng Liu & Zhangping Lu & Frank Kofi Essien & Wencheng Su, 2023. "The Impact of Key Stakeholders and the Computer Skills of Librarians on Research Data Management Support Services (Id so-21-1893.r2)," SAGE Open, , vol. 13(3), pages 21582440231, September.
    3. Ulrich Dirnagl & Nonia Pariente, 2024. "Promoting research quality," PLOS Biology, Public Library of Science, vol. 22(2), pages 1-3, February.
    4. Gowri Gopalakrishna & Gerben ter Riet & Gerko Vink & Ineke Stoop & Jelte M Wicherts & Lex M Bouter, 2022. "Prevalence of questionable research practices, research misconduct and their potential explanatory factors: A survey among academic researchers in The Netherlands," PLOS ONE, Public Library of Science, vol. 17(2), pages 1-16, February.
    5. Enrique Orduña-Malea & Núria Bautista-Puig, 2024. "Research assessment under debate: disentangling the interest around the DORA declaration on Twitter," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(1), pages 537-559, January.
    6. Brianne A Kent & Constance Holman & Emmanuella Amoako & Alberto Antonietti & James M Azam & Hanne Ballhausen & Yaw Bediako & Anat M Belasen & Clarissa F D Carneiro & Yen-Chung Chen & Ewoud B Compeer &, 2022. "Recommendations for empowering early career researchers to improve research culture and practice," PLOS Biology, Public Library of Science, vol. 20(7), pages 1-19, July.
    7. Tony Ross-Hellauer & Thomas Klebel & Petr Knoth & Nancy Pontika, 2024. "Value dissonance in research(er) assessment: individual and perceived institutional priorities in review, promotion, and tenure," Science and Public Policy, Oxford University Press, vol. 51(3), pages 337-351.
    8. Shinichi Nakagawa & Edward R. Ivimey-Cook & Matthew J. Grainger & Rose E. O’Dea & Samantha Burke & Szymon M. Drobniak & Elliot Gould & Erin L. Macartney & April Robin Martinig & Kyle Morrison & Matthi, 2023. "Method Reporting with Initials for Transparency (MeRIT) promotes more granularity and accountability for author contributions," Nature Communications, Nature, vol. 14(1), pages 1-5, December.
    9. Ana Cecilia Quiroga Gutierrez & Daniel J. Lindegger & Ala Taji Heravi & Thomas Stojanov & Martin Sykora & Suzanne Elayan & Stephen J. Mooney & John A. Naslund & Marta Fadda & Oliver Gruebner, 2023. "Reproducibility and Scientific Integrity of Big Data Research in Urban Public Health and Digital Epidemiology: A Call to Action," IJERPH, MDPI, vol. 20(2), pages 1-15, January.
    10. Ginevra Peruginelli & Janne Pölönen, 2024. "The legal foundation of responsible research assessment: An overview on European Union and Italy," Research Evaluation, Oxford University Press, vol. 32(4), pages 670-682.
    11. Yuki Yamada, 2021. "How to Protect the Credibility of Articles Published in Predatory Journals," Publications, MDPI, vol. 9(1), pages 1-8, January.
    12. Rosie Hastings & Krishma Labib & Iris Lechner & Lex Bouter & Guy Widdershoven & Natalie Evans, 2023. "Guidance on research integrity provided by pan-European discipline-specific learned societies: A scoping review," Science and Public Policy, Oxford University Press, vol. 50(2), pages 318-335.
    13. Alejandra Manco, 2022. "A Landscape of Open Science Policies Research," SAGE Open, , vol. 12(4), pages 21582440221, December.
    14. Adam J Kucharski & Sebastian Funk & Rosalind M Eggo, 2020. "The COVID-19 response illustrates that traditional academic reward structures and metrics do not reflect crucial contributions to modern science," PLOS Biology, Public Library of Science, vol. 18(10), pages 1-3, October.
    15. Gadd, Elizabeth, 2021. "Mis-measuring our universities: how global university rankings don't add up," SocArXiv gxbn5, Center for Open Science.
    16. Chin, Jason & Zeiler, Kathryn, 2021. "Replicability in Empirical Legal Research," LawArchive 2b5k4_v1, Center for Open Science.
    17. Labib, Krishma, 2024. "Research integrity and research fairness: harmonious or in conflict?," OSF Preprints ygakx, Center for Open Science.
    18. Noémie Aubert Bonn & Wim Pinxten, 2021. "Advancing science or advancing careers? Researchers’ opinions on success indicators," PLOS ONE, Public Library of Science, vol. 16(2), pages 1-17, February.
    19. Alexandra-Maria Klein & Nina Kranke, 2023. "Some thoughts on transparency of the data and analysis behind the Highly Cited Researchers list," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(12), pages 6773-6780, December.
    20. Alexander Schniedermann, 2021. "A comparison of systematic reviews and guideline-based systematic reviews in medical studies," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(12), pages 9829-9846, December.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0287660. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.