
On peer review in computer science: analysis of its effectiveness and suggestions for improvement

Authors

  • Azzurra Ragone (University of Trento)
  • Katsiaryna Mirylenka (University of Trento)
  • Fabio Casati (University of Trento)
  • Maurizio Marchese (University of Trento)

Abstract

In this paper we focus on the analysis of peer reviews and reviewers' behaviour in a number of different review processes. More specifically, we report on the development, definition and rationale of a theoretical model for peer review processes to support the identification of appropriate metrics to assess the processes' main characteristics, in order to render peer review more transparent and understandable. Together with known metrics and techniques, we introduce new ones to assess the overall quality (i.e., reliability, fairness, validity) and efficiency of peer review processes, e.g., the robustness of the process, the degree of agreement/disagreement among reviewers, or positive/negative bias in the reviewers' decision-making process. We also check the ability of peer review to assess the impact of papers in subsequent years. We apply the proposed model and analysis framework to a large review data set from ten different conferences in computer science, for a total of ca. 9,000 reviews on ca. 2,800 submitted contributions. We discuss the implications of the results and their potential use toward improving the analysed peer review processes. A number of interesting results were found, in particular: (1) a low correlation between the peer review outcome and the impact over time of the accepted contributions; (2) the influence of the assessment scale on the way reviewers gave marks; (3) the effect and impact of rating bias, i.e., reviewers who consistently give lower/higher marks relative to all other reviewers; (4) the effectiveness of statistical approaches to optimize some process parameters (e.g., the number of papers per reviewer) to improve the overall quality of the process while keeping the overall effort under control. Based on the lessons learned, we suggest ways to improve the overall quality of peer review through procedures that can be easily implemented in current editorial management systems.

Suggested Citation

  • Azzurra Ragone & Katsiaryna Mirylenka & Fabio Casati & Maurizio Marchese, 2013. "On peer review in computer science: analysis of its effectiveness and suggestions for improvement," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(2), pages 317-356, November.
  • Handle: RePEc:spr:scient:v:97:y:2013:i:2:d:10.1007_s11192-013-1002-z
    DOI: 10.1007/s11192-013-1002-z

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-013-1002-z
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-013-1002-z?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Lutz Bornmann & Markus Wolf & Hans-Dieter Daniel, 2012. "Closed versus open reviewing of journal manuscripts: how far do comments differ in language use?," Scientometrics, Springer;Akadémiai Kiadó, vol. 91(3), pages 843-856, June.
    2. Guillaume Cabanac & Thomas Preuss, 2013. "Capitalizing on order effects in the bids of peer‐reviewed conferences to secure reviews by expert referees," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 64(2), pages 405-415, February.
    3. Xuemei Li & Mike Thelwall & Dean Giustini, 2012. "Validating online reference managers for scholarly impact measurement," Scientometrics, Springer;Akadémiai Kiadó, vol. 91(2), pages 461-471, May.
    4. Lutz Bornmann & Hans-Dieter Daniel, 2010. "The validity of staff editors’ initial evaluations of manuscripts: a case study of Angewandte Chemie International Edition," Scientometrics, Springer;Akadémiai Kiadó, vol. 85(3), pages 681-687, December.
    5. Lutz Bornmann & Hans-Dieter Daniel, 2005. "Committee peer review at an international research foundation: predictive validity and fairness of selection decisions on post-graduate fellowship applications," Research Evaluation, Oxford University Press, vol. 14(1), pages 15-20, April.
    6. Bornmann, Lutz & Mutz, Rüdiger & Daniel, Hans-Dieter, 2008. "How to detect indications of potential sources of bias in peer review: A generalized latent variable modeling approach exemplified by a gender study," Journal of Informetrics, Elsevier, vol. 2(4), pages 280-287.
    7. Robert Ebel, 1951. "Estimation of the reliability of ratings," Psychometrika, Springer;The Psychometric Society, vol. 16(4), pages 407-424, December.
    8. Christine Wennerås & Agnes Wold, 1997. "Nepotism and sexism in peer-review," Nature, Nature, vol. 387(6631), pages 341-343, May.
    9. Martin Reinhart, 2009. "Peer review of grant applications in biology and medicine. Reliability, fairness, and validity," Scientometrics, Springer;Akadémiai Kiadó, vol. 81(3), pages 789-809, December.
    10. Lutz Bornmann & Hans-Dieter Daniel, 2005. "Selection of research fellowship recipients by committee peer review. Reliability, fairness and predictive validity of Board of Trustees' decisions," Scientometrics, Springer;Akadémiai Kiadó, vol. 63(2), pages 297-320, April.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Elise S. Brezis & Aliaksandr Birukou, 2020. "Arbitrariness in the peer review process," Scientometrics, Springer;Akadémiai Kiadó, vol. 123(1), pages 393-411, April.
    2. Mund, Carolin & Neuhäusler, Peter, 2015. "Towards an early-stage identification of emerging topics in science—The usability of bibliometric characteristics," Journal of Informetrics, Elsevier, vol. 9(4), pages 1018-1033.
    3. Mario Paolucci & Francisco Grimaldo, 2014. "Mechanism change in a simulation of peer review: from junk support to elitism," Scientometrics, Springer;Akadémiai Kiadó, vol. 99(3), pages 663-688, June.
    4. Niccolò Casnici & Francisco Grimaldo & Nigel Gilbert & Pierpaolo Dondio & Flaminio Squazzoni, 2017. "Assessing peer review by gauging the fate of rejected manuscripts: the case of the Journal of Artificial Societies and Social Simulation," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(1), pages 533-546, October.
    5. Flaminio Squazzoni & Elise Brezis & Ana Marušić, 2017. "Scientometrics of peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(1), pages 501-502, October.
    6. Marco Seeber & Alberto Bacchelli, 2017. "Does single blind peer review hinder newcomers?," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(1), pages 567-585, October.
    7. Monica Aniela Zaharie & Marco Seeber, 2018. "Are non-monetary rewards effective in attracting peer reviewers? A natural experiment," Scientometrics, Springer;Akadémiai Kiadó, vol. 117(3), pages 1587-1609, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Rüdiger Mutz & Lutz Bornmann & Hans-Dieter Daniel, 2015. "Testing for the fairness and predictive validity of research funding decisions: A multilevel multiple imputation for missing data approach using ex-ante and ex-post peer evaluation data from the Austr," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(11), pages 2321-2339, November.
    2. Materia, V.C. & Pascucci, S. & Kolympiris, C., 2015. "Understanding the selection processes of public research projects in agriculture: The role of scientific merit," Food Policy, Elsevier, vol. 56(C), pages 87-99.
    3. Bar-Ilan, Judit, 2008. "Informetrics at the beginning of the 21st century—A review," Journal of Informetrics, Elsevier, vol. 2(1), pages 1-52.
    4. Marjolijn N. Wijnen & Jorg J. M. Massen & Mariska E. Kret, 2021. "Gender bias in the allocation of student grants," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(7), pages 5477-5488, July.
    5. Ulf Sandström & Martin Hällsten, 2008. "Persistent nepotism in peer-review," Scientometrics, Springer;Akadémiai Kiadó, vol. 74(2), pages 175-189, February.
    6. Kok, Holmer & Faems, Dries & de Faria, Pedro, 2022. "Pork Barrel or Barrel of Gold? Examining the performance implications of earmarking in public R&D grants," Research Policy, Elsevier, vol. 51(7).
    7. Kevin W. Boyack & Caleb Smith & Richard Klavans, 2018. "Toward predicting research proposal success," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(2), pages 449-461, February.
    8. Bornmann, Lutz & Mutz, Rüdiger & Daniel, Hans-Dieter, 2008. "Latent Markov modeling applied to grant peer review," Journal of Informetrics, Elsevier, vol. 2(3), pages 217-228.
    9. Qurat-ul Ain & Hira Riaz & Muhammad Tanvir Afzal, 2019. "Evaluation of h-index and its citation intensity based variants in the field of mathematics," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(1), pages 187-211, April.
    10. Bornmann, Lutz & Mutz, Rüdiger & Daniel, Hans-Dieter, 2008. "How to detect indications of potential sources of bias in peer review: A generalized latent variable modeling approach exemplified by a gender study," Journal of Informetrics, Elsevier, vol. 2(4), pages 280-287.
    11. Bornmann, Lutz & Mutz, Rüdiger & Daniel, Hans-Dieter, 2007. "Gender differences in grant peer review: A meta-analysis," Journal of Informetrics, Elsevier, vol. 1(3), pages 226-238.
    12. Pardeep Sud & Mike Thelwall, 2014. "Evaluating altmetrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(2), pages 1131-1143, February.
    13. Squazzoni, Flaminio & Gandelli, Claudio, 2012. "Saint Matthew strikes again: An agent-based model of peer review and the scientific community structure," Journal of Informetrics, Elsevier, vol. 6(2), pages 265-275.
    14. Albert Banal-Estañol & Qianshuo Liu & Inés Macho-Stadler & David Pérez-Castrillo, 2021. "Similar-to-me Effects in the Grant Application Process: Applicants, Panelists, and the Likelihood of Obtaining Funds," Working Papers 1289, Barcelona School of Economics.
    15. van den Besselaar, Peter & Sandström, Ulf, 2015. "Early career grants, performance, and careers: A study on predictive validity of grant decisions," Journal of Informetrics, Elsevier, vol. 9(4), pages 826-838.
    16. Banal-Estañol, Albert & Macho-Stadler, Inés & Pérez-Castrillo, David, 2019. "Evaluation in research funding agencies: Are structurally diverse teams biased against?," Research Policy, Elsevier, vol. 48(7), pages 1823-1840.
    17. Bornmann, Lutz & Daniel, Hans-Dieter, 2007. "Gatekeepers of science—Effects of external reviewers’ attributes on the assessments of fellowship applications," Journal of Informetrics, Elsevier, vol. 1(1), pages 83-91.
    18. Stephen A Gallo & Joanne H Sullivan & Scott R Glisson, 2016. "The Influence of Peer Reviewer Expertise on the Evaluation of Research Funding Applications," PLOS ONE, Public Library of Science, vol. 11(10), pages 1-18, October.
    19. Marsh, Herbert W. & Jayasinghe, Upali W. & Bond, Nigel W., 2011. "Gender differences in peer reviews of grant applications: A substantive-methodological synergy in support of the null hypothesis model," Journal of Informetrics, Elsevier, vol. 5(1), pages 167-180.
    20. Carole J. Lee & Cassidy R. Sugimoto & Guo Zhang & Blaise Cronin, 2013. "Bias in peer review," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(1), pages 2-17, January.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:97:y:2013:i:2:d:10.1007_s11192-013-1002-z. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.