IDEAS home Printed from https://ideas.repec.org/a/pal/palcom/v8y2021i1d10.1057_s41599-020-00703-8.html

AI-assisted peer review

Author

Listed:
  • Alessandro Checco

    (The University of Sheffield)

  • Lorenzo Bracciale

    (University of Rome Tor Vergata)

  • Pierpaolo Loreti

    (University of Rome Tor Vergata)

  • Stephen Pinfield

    (The University of Sheffield)

  • Giuseppe Bianchi

    (University of Rome Tor Vergata)

Abstract

The scientific literature peer review workflow is under strain because of the constant growth of submission volume. One response to this is to make initial screening of submissions less time intensive. Reducing screening and review time would save millions of working hours and potentially boost academic productivity. Many platforms have already started to use automated screening tools to detect plagiarism and failure to respect format requirements. Some tools even attempt to flag the quality of a study or summarise its content, to reduce reviewers’ load. The recent advances in artificial intelligence (AI) create the potential for (semi) automated peer review systems, where potentially low-quality or controversial studies could be flagged, and reviewer-document matching could be performed in an automated manner. However, there are ethical concerns arising from such approaches, particularly those associated with bias and the extent to which AI systems may replicate it. Our main goal in this study is to discuss the potential, pitfalls, and uncertainties of the use of AI to approximate or assist human decisions in the quality assurance and peer-review process associated with research outputs. We design an AI tool and train it with 3300 papers from three conferences, together with their review evaluations. We then test the AI's ability to predict the review score of a new, unobserved manuscript, using only its textual content. We show that such techniques can reveal correlations between the decision process and other quality proxy measures, uncovering potential biases of the review process. Finally, we discuss the opportunities, but also the potential unintended consequences, of these techniques in terms of algorithmic bias and ethical concerns.
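The score-prediction setup described in the abstract — learn from manuscript texts paired with past review evaluations, then score an unseen manuscript from its text alone — can be sketched as a standard supervised text-regression pipeline. This is a minimal illustration, assuming a TF-IDF representation with ridge regression; the paper's actual model, features, and the toy corpus and scores below are not taken from the abstract.

```python
# Minimal sketch of text-based review-score prediction: fit a regressor on
# (manuscript text, review score) pairs, then score an unseen manuscript.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

# Toy corpus standing in for manuscript full texts and their review scores
# (in the study, ~3300 conference papers with their review evaluations).
papers = [
    "novel deep learning architecture with strong empirical evaluation",
    "incremental result with limited experiments and unclear writing",
    "rigorous theoretical analysis and reproducible benchmark results",
    "preliminary idea lacking baselines and statistical significance",
]
scores = [8.0, 4.0, 9.0, 3.0]  # e.g. average reviewer ratings on a 1-10 scale

# TF-IDF turns each text into a sparse term-weight vector; ridge regression
# maps that vector to a score while regularizing against the tiny sample.
model = make_pipeline(TfidfVectorizer(), Ridge(alpha=1.0))
model.fit(papers, scores)

# Predict the review score of a new, unobserved manuscript from text alone.
new_paper = ["thorough empirical evaluation of a novel architecture"]
predicted = model.predict(new_paper)[0]
print(f"predicted review score: {predicted:.2f}")
```

A pipeline like this also makes the bias concern raised in the abstract concrete: any correlation between surface features of the text and past reviewer decisions — including unwanted ones — is learned and reproduced by the model.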

Suggested Citation

  • Alessandro Checco & Lorenzo Bracciale & Pierpaolo Loreti & Stephen Pinfield & Giuseppe Bianchi, 2021. "AI-assisted peer review," Palgrave Communications, Palgrave Macmillan, vol. 8(1), pages 1-11, December.
  • Handle: RePEc:pal:palcom:v:8:y:2021:i:1:d:10.1057_s41599-020-00703-8
    DOI: 10.1057/s41599-020-00703-8

    Download full text from publisher

    File URL: http://link.springer.com/10.1057/s41599-020-00703-8
    File Function: Abstract
    Download Restriction: Access to full text is restricted to subscribers.

    File URL: https://libkey.io/10.1057/s41599-020-00703-8?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

1. Adrian Mulligan & Louise Hall & Ellen Raphael, 2013. "Peer review in a changing world: An international study measuring the attitudes of researchers," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 64(1), pages 132-161, January.
    2. S. P. J. M. Horbach & W. Halffman, 2019. "The ability of different peer review procedures to flag problematic publications," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(1), pages 339-373, January.
    3. David Cyranoski, 2019. "Artificial intelligence is selecting grant reviewers in China," Nature, Nature, vol. 569(7756), pages 316-317, May.
    Full references (including those not matched with items on IDEAS)

    Citations

Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

1. Gendron, Yves & Andrew, Jane & Cooper, Christine, 2022. "The perils of artificial intelligence in academic publishing," Critical Perspectives on Accounting, Elsevier, vol. 87(C).
    2. Howell, Bronwyn E. & Potgieter, Petrus H., 2023. "AI-generated lemons: a sour outlook for content producers?," 32nd European Regional ITS Conference, Madrid 2023: Realising the digital decade in the European Union – Easier said than done? 277971, International Telecommunications Society (ITS).
    3. Rovetta, Alessandro & Castaldo, Lucia, 2022. "Are We Sure We Fully Understand What an Infodemic Is? A Global Perspective on Infodemiological Problems," SocArXiv xw723, Center for Open Science.
    4. Hajkowicz, Stefan & Naughtin, Claire & Sanderson, Conrad & Schleiger, Emma & Karimi, Sarvnaz & Bratanova, Alexandra & Bednarz, Tomasz, 2022. "Artificial intelligence for science – adoption trends and future development pathways," MPRA Paper 115464, University Library of Munich, Germany.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. J. A. Garcia & Rosa Rodriguez-Sánchez & J. Fdez-Valdivia, 2021. "The interplay between the reviewer’s incentives and the journal’s quality standard," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(4), pages 3041-3061, April.
    2. Bianchi, Federico & Grimaldo, Francisco & Squazzoni, Flaminio, 2019. "The F3-index. Valuing reviewers for scholarly journals," Journal of Informetrics, Elsevier, vol. 13(1), pages 78-86.
    3. Yuetong Chen & Hao Wang & Baolong Zhang & Wei Zhang, 2022. "A method of measuring the article discriminative capacity and its distribution," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(6), pages 3317-3341, June.
    4. Michail Kovanis & Ludovic Trinquart & Philippe Ravaud & Raphaël Porcher, 2017. "Evaluating alternative systems of peer review: a large-scale agent-based modelling approach to scientific publication," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(1), pages 651-671, October.
    5. Qianjin Zong & Yafen Xie & Jiechun Liang, 2020. "Does open peer review improve citation count? Evidence from a propensity score matching analysis of PeerJ," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 607-623, October.
    6. Maciej J Mrowinski & Piotr Fronczak & Agata Fronczak & Marcel Ausloos & Olgica Nedic, 2017. "Artificial intelligence in peer review: How can evolutionary computation support journal editors?," PLOS ONE, Public Library of Science, vol. 12(9), pages 1-11, September.
    7. Narjes Vara & Mahdieh Mirzabeigi & Hajar Sotudeh & Seyed Mostafa Fakhrahmad, 2022. "Application of k-means clustering algorithm to improve effectiveness of the results recommended by journal recommender system," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(6), pages 3237-3252, June.
    8. Katarina Krapež, 2022. "Advancing Self-Evaluative and Self-Regulatory Mechanisms of Scholarly Journals: Editors’ Perspectives on What Needs to Be Improved in the Editorial Process," Publications, MDPI, vol. 10(1), pages 1-18, March.
    9. Abbie Griffin & Gloria Barczak, 2020. "Effective reviewing for conceptual journal submissions," AMS Review, Springer;Academy of Marketing Science, vol. 10(1), pages 36-48, June.
    10. Michail Kovanis & Raphaël Porcher & Philippe Ravaud & Ludovic Trinquart, 2016. "Complex systems approach to scientific publication and peer-review system: development of an agent-based model calibrated with empirical journal data," Scientometrics, Springer;Akadémiai Kiadó, vol. 106(2), pages 695-715, February.
    11. Rodríguez Sánchez, Isabel & Makkonen, Teemu & Williams, Allan M., 2019. "Peer review assessment of originality in tourism journals: critical perspective of key gatekeepers," Annals of Tourism Research, Elsevier, vol. 77(C), pages 1-11.
    12. Kuklin, Alexander A. (Куклин, Александр) & Balyakina, Evgeniya A. (Балякина, Евгения), 2017. "Active policy as a key to success for an International Economic Periodical [Активная Политика — Залог Успеха Международного Экономического Журнала]," Ekonomicheskaya Politika / Economic Policy, Russian Presidential Academy of National Economy and Public Administration, vol. 6, pages 160-177, December.
    13. Vivian M Nguyen & Neal R Haddaway & Lee F G Gutowsky & Alexander D M Wilson & Austin J Gallagher & Michael R Donaldson & Neil Hammerschlag & Steven J Cooke, 2015. "How Long Is Too Long in Contemporary Peer Review? Perspectives from Authors Publishing in Conservation Biology Journals," PLOS ONE, Public Library of Science, vol. 10(8), pages 1-20, August.
    14. Dietmar Wolfram & Peiling Wang & Adam Hembree & Hyoungjoo Park, 2020. "Open peer review: promoting transparency in open science," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 1033-1051, November.
    15. Paul Sebo & Jean Pascal Fournier & Claire Ragot & Pierre-Henri Gorioux & François R. Herrmann & Hubert Maisonneuve, 2019. "Factors associated with publication speed in general medical journals: a retrospective study of bibliometric data," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(2), pages 1037-1058, May.
    16. Eirini Delikoura & Dimitrios Kouis, 2021. "Open Research Data and Open Peer Review: Perceptions of a Medical and Health Sciences Community in Greece," Publications, MDPI, vol. 9(2), pages 1-19, March.
    17. Pengfei Jia & Weixi Xie & Guangyao Zhang & Xianwen Wang, 2023. "Do reviewers get their deserved acknowledgments from the authors of manuscripts?," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(10), pages 5687-5703, October.
    18. J. Israel Martínez-López & Samantha Barrón-González & Alejandro Martínez López, 2019. "Which Are the Tools Available for Scholars? A Review of Assisting Software for Authors during Peer Reviewing Process," Publications, MDPI, vol. 7(3), pages 1-28, September.
    19. Michail Kovanis & Raphaël Porcher & Philippe Ravaud & Ludovic Trinquart, 2016. "The Global Burden of Journal Peer Review in the Biomedical Literature: Strong Imbalance in the Collective Enterprise," PLOS ONE, Public Library of Science, vol. 11(11), pages 1-14, November.
    20. Maciej J. Mrowinski & Agata Fronczak & Piotr Fronczak & Olgica Nedic & Marcel Ausloos, 2016. "Review time in peer review: quantitative analysis and modelling of editorial workflows," Scientometrics, Springer;Akadémiai Kiadó, vol. 107(1), pages 271-286, April.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:pal:palcom:v:8:y:2021:i:1:d:10.1057_s41599-020-00703-8. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: https://www.nature.com/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.