
On peer review in computer science: analysis of its effectiveness and suggestions for improvement

Authors

Listed:
  • Azzurra Ragone

    (University of Trento)

  • Katsiaryna Mirylenka

    (University of Trento)

  • Fabio Casati

    (University of Trento)

  • Maurizio Marchese

    (University of Trento)

Abstract

In this paper we focus on the analysis of peer reviews and reviewers' behaviour in a number of different review processes. More specifically, we report on the development, definition and rationale of a theoretical model for peer review processes that supports the identification of appropriate metrics to assess the processes' main characteristics, in order to render peer review more transparent and understandable. Together with known metrics and techniques, we introduce new ones to assess the overall quality (i.e., reliability, fairness, validity) and efficiency of peer review processes, e.g. the robustness of the process, the degree of agreement/disagreement among reviewers, or positive/negative bias in the reviewers' decision-making process. We also check the ability of peer review to assess the impact of papers in subsequent years. We apply the proposed model and analysis framework to a large data set of reviews from ten different conferences in computer science, for a total of ca. 9,000 reviews on ca. 2,800 submitted contributions. We discuss the implications of the results and their potential use toward improving the analysed peer review processes. A number of interesting results were found, in particular: (1) a low correlation between the peer review outcome and the impact over time of the accepted contributions; (2) the influence of the assessment scale on the way reviewers give marks; (3) the effect and impact of rating bias, i.e. reviewers who consistently give lower/higher marks than all other reviewers; (4) the effectiveness of statistical approaches to optimize some process parameters (e.g. the number of papers per reviewer) to improve the overall quality of the process while keeping the overall effort under control. Based on the lessons learned, we suggest ways to improve the overall quality of peer review through procedures that can be easily implemented in current editorial management systems.
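
A minimal illustrative sketch (Python) of two of the metrics described above: a per-reviewer rating bias, computed as the average signed offset of a reviewer's marks from the marks of co-reviewers on the same submissions, and the correlation between a paper's mean review mark and its later citation count. The toy data, field names and exact formulas are assumptions for illustration only, not the authors' dataset or implementation.

    # Sketch only: rating bias and score-vs-impact correlation on toy data.
    from collections import defaultdict
    from statistics import mean

    # Hypothetical review records: (paper_id, reviewer_id, mark on a 1-5 scale)
    reviews = [
        ("p1", "r1", 4), ("p1", "r2", 2), ("p1", "r3", 3),
        ("p2", "r1", 5), ("p2", "r2", 3),
        ("p3", "r2", 2), ("p3", "r3", 4),
    ]

    # Hypothetical citation counts a few years after the venue
    citations = {"p1": 12, "p2": 3, "p3": 25}

    def reviewer_bias(reviews):
        """Average signed offset of each reviewer's mark from the mean mark
        given by the other reviewers of the same paper (positive = lenient,
        negative = harsh)."""
        by_paper = defaultdict(list)
        for paper, reviewer, mark in reviews:
            by_paper[paper].append((reviewer, mark))
        offsets = defaultdict(list)
        for marks in by_paper.values():
            for reviewer, mark in marks:
                others = [m for r, m in marks if r != reviewer]
                if others:
                    offsets[reviewer].append(mark - mean(others))
        return {r: mean(o) for r, o in offsets.items()}

    def pearson(xs, ys):
        """Plain Pearson correlation, written out to avoid extra dependencies."""
        mx, my = mean(xs), mean(ys)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    # Mean review mark per paper vs. later citations
    paper_scores = defaultdict(list)
    for paper, _, mark in reviews:
        paper_scores[paper].append(mark)
    papers = sorted(paper_scores)
    score_citation_corr = pearson(
        [mean(paper_scores[p]) for p in papers],
        [citations[p] for p in papers],
    )

    print("per-reviewer bias:", reviewer_bias(reviews))
    print("score vs. citation correlation:", round(score_citation_corr, 2))

On real conference data one would also control for the assessment scale and the number of co-reviewers per paper, both of which the paper identifies as factors influencing how marks are given.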

Suggested Citation

  • Azzurra Ragone & Katsiaryna Mirylenka & Fabio Casati & Maurizio Marchese, 2013. "On peer review in computer science: analysis of its effectiveness and suggestions for improvement," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(2), pages 317-356, November.
  • Handle: RePEc:spr:scient:v:97:y:2013:i:2:d:10.1007_s11192-013-1002-z
    DOI: 10.1007/s11192-013-1002-z

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-013-1002-z
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-013-1002-z?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version you can access through your library subscription.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Robert Ebel, 1951. "Estimation of the reliability of ratings," Psychometrika, Springer;The Psychometric Society, vol. 16(4), pages 407-424, December.
    2. Lutz Bornmann & Markus Wolf & Hans-Dieter Daniel, 2012. "Closed versus open reviewing of journal manuscripts: how far do comments differ in language use?," Scientometrics, Springer;Akadémiai Kiadó, vol. 91(3), pages 843-856, June.
    3. Martin Reinhart, 2009. "Peer review of grant applications in biology and medicine. Reliability, fairness, and validity," Scientometrics, Springer;Akadémiai Kiadó, vol. 81(3), pages 789-809, December.
    4. Xuemei Li & Mike Thelwall & Dean Giustini, 2012. "Validating online reference managers for scholarly impact measurement," Scientometrics, Springer;Akadémiai Kiadó, vol. 91(2), pages 461-471, May.
    5. Lutz Bornmann & Hans-Dieter Daniel, 2010. "The validity of staff editors’ initial evaluations of manuscripts: a case study of Angewandte Chemie International Edition," Scientometrics, Springer;Akadémiai Kiadó, vol. 85(3), pages 681-687, December.
    6. Lutz Bornmann & Hans-Dieter Daniel, 2005. "Selection of research fellowship recipients by committee peer review. Reliability, fairness and predictive validity of Board of Trustees' decisions," Scientometrics, Springer;Akadémiai Kiadó, vol. 63(2), pages 297-320, April.
    7. Lutz Bornmann & Hans-Dieter Daniel, 2005. "Committee peer review at an international research foundation: predictive validity and fairness of selection decisions on post-graduate fellowship applications," Research Evaluation, Oxford University Press, vol. 14(1), pages 15-20, April.
    8. Guillaume Cabanac & Thomas Preuss, 2013. "Capitalizing on order effects in the bids of peer-reviewed conferences to secure reviews by expert referees," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(2), pages 405-415, February.
    9. Bornmann, Lutz & Mutz, Rüdiger & Daniel, Hans-Dieter, 2008. "How to detect indications of potential sources of bias in peer review: A generalized latent variable modeling approach exemplified by a gender study," Journal of Informetrics, Elsevier, vol. 2(4), pages 280-287.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Elise S. Brezis & Aliaksandr Birukou, 2020. "Arbitrariness in the peer review process," Scientometrics, Springer;Akadémiai Kiadó, vol. 123(1), pages 393-411, April.
    2. Mund, Carolin & Neuhäusler, Peter, 2015. "Towards an early-stage identification of emerging topics in science—The usability of bibliometric characteristics," Journal of Informetrics, Elsevier, vol. 9(4), pages 1018-1033.
    3. Mario Paolucci & Francisco Grimaldo, 2014. "Mechanism change in a simulation of peer review: from junk support to elitism," Scientometrics, Springer;Akadémiai Kiadó, vol. 99(3), pages 663-688, June.
    4. Niccolò Casnici & Francisco Grimaldo & Nigel Gilbert & Pierpaolo Dondio & Flaminio Squazzoni, 2017. "Assessing peer review by gauging the fate of rejected manuscripts: the case of the Journal of Artificial Societies and Social Simulation," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(1), pages 533-546, October.
    5. Flaminio Squazzoni & Elise Brezis & Ana Marušić, 2017. "Scientometrics of peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(1), pages 501-502, October.
    6. Marco Seeber & Alberto Bacchelli, 2017. "Does single blind peer review hinder newcomers?," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(1), pages 567-585, October.
    7. Monica Aniela Zaharie & Marco Seeber, 2018. "Are non-monetary rewards effective in attracting peer reviewers? A natural experiment," Scientometrics, Springer;Akadémiai Kiadó, vol. 117(3), pages 1587-1609, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Marjolijn N. Wijnen & Jorg J. M. Massen & Mariska E. Kret, 2021. "Gender bias in the allocation of student grants," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(7), pages 5477-5488, July.
    2. Kevin W. Boyack & Caleb Smith & Richard Klavans, 2018. "Toward predicting research proposal success," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(2), pages 449-461, February.
    3. Ben-David, Vered, 2016. "Substance-abusing parents and their children in termination of parental rights cases in Israel," Children and Youth Services Review, Elsevier, vol. 66(C), pages 94-100.
    4. Banal-Estañol, Albert & Macho-Stadler, Inés & Pérez-Castrillo, David, 2019. "Evaluation in research funding agencies: Are structurally diverse teams biased against?," Research Policy, Elsevier, vol. 48(7), pages 1823-1840.
    5. Fei Shu, 2017. "Comment to: Does China need to rethink its metrics- and citation-based research rewards policies?," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(2), pages 1229-1231, November.
    6. Yu Liu & Dan Lin & Xiujuan Xu & Shimin Shan & Quan Z. Sheng, 2018. "Multi-views on Nature Index of Chinese academic institutions," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(3), pages 823-837, March.
    7. Paul Hoffman, 1963. "Test reliability and practice effects," Psychometrika, Springer;The Psychometric Society, vol. 28(3), pages 273-288, September.
    8. Kim, Yusoon & Choi, Thomas Y., 2021. "Supplier relationship strategies and outcome dualities: An empirical study of embeddedness perspective," International Journal of Production Economics, Elsevier, vol. 232(C).
    9. Amin Mazloumian, 2012. "Predicting Scholars' Scientific Impact," PLOS ONE, Public Library of Science, vol. 7(11), pages 1-5, November.
    10. Mingyang Wang & Zhenyu Wang & Guangsheng Chen, 2019. "Which can better predict the future success of articles? Bibliometric indices or alternative metrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(3), pages 1575-1595, June.
    11. Balázs Győrffy & Andrea Magda Nagy & Péter Herman & Ádám Török, 2018. "Factors influencing the scientific performance of Momentum grant holders: an evaluation of the first 117 research groups," Scientometrics, Springer;Akadémiai Kiadó, vol. 117(1), pages 409-426, October.
    12. Dietmar Wolfram & Peiling Wang & Adam Hembree & Hyoungjoo Park, 2020. "Open peer review: promoting transparency in open science," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 1033-1051, November.
    13. Bornmann, Lutz & Daniel, Hans-Dieter, 2007. "Gatekeepers of science—Effects of external reviewers’ attributes on the assessments of fellowship applications," Journal of Informetrics, Elsevier, vol. 1(1), pages 83-91.
    14. Wenya Huang & Peiling Wang & Qiang Wu, 2018. "A correlation comparison between Altmetric Attention Scores and citations for six PLOS journals," PLOS ONE, Public Library of Science, vol. 13(4), pages 1-15, April.
    15. Mike Thelwall & Stefanie Haustein & Vincent Larivière & Cassidy R Sugimoto, 2013. "Do Altmetrics Work? Twitter and Ten Other Social Web Services," PLOS ONE, Public Library of Science, vol. 8(5), pages 1-7, May.
    16. Mingkun Wei & Abdolreza Noroozi Chakoli, 2020. "Evaluating the relationship between the academic and social impact of open access books based on citation behaviors and social media attention," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(3), pages 2401-2420, December.
    17. Katarína Cechlárová & Tamás Fleiner & Eva Potpinková, 2014. "Assigning evaluators to research grant applications: the case of Slovak Research and Development Agency," Scientometrics, Springer;Akadémiai Kiadó, vol. 99(2), pages 495-506, May.
    18. Walter Kristof, 1963. "Statistical inferences about the error variance," Psychometrika, Springer;The Psychometric Society, vol. 28(2), pages 129-143, June.
    19. Bar-Ilan, Judit, 2008. "Informetrics at the beginning of the 21st century—A review," Journal of Informetrics, Elsevier, vol. 2(1), pages 1-52.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:97:y:2013:i:2:d:10.1007_s11192-013-1002-z. See general information about how to correct material in RePEc.


    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing. General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis . RePEc uses bibliographic data supplied by the respective publishers.