
Closed versus open reviewing of journal manuscripts: how far do comments differ in language use?

Author

Listed:
  • Lutz Bornmann

    (Max Planck Society, Administrative Headquarters)

  • Markus Wolf

    (University Hospital Heidelberg)

  • Hans-Dieter Daniel

    (University of Zurich
    ETH Zurich)

Abstract

Whereas in traditional, closed peer review (CPR) only a few selected scientists (peers) are involved in reviewing a manuscript, public peer review (PPR) includes, in addition to the invited reviewers, a wider circle of scientists who are interested in a manuscript and wish to write a comment on it. In this study, using the data of two comprehensive evaluation studies of the CPR process at Angewandte Chemie International Edition and the PPR process at Atmospheric Chemistry and Physics, we examined the language characteristics of comments written by invited reviewers in CPR and by invited reviewers and interested members of the scientific community in PPR. We used Linguistic Inquiry and Word Count (LIWC), a text analysis software program that counts words in psychologically meaningful categories (e.g., positive or negative emotions) using a standardized dictionary. We examined 599 comments from the reviews of 229 manuscripts. The results show that comments in PPR are considerably longer than comments in CPR, which indicates that reviewing in PPR serves more of an improvement function, whereas reviewing in CPR serves more of a selection function. The results also show that CPR is not, as might be expected, more susceptible to the expression of negative emotions than PPR. On the contrary, positive emotion words are used statistically significantly more frequently in CPR than in PPR.
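A minimal Python sketch of the LIWC-style, dictionary-based word counting described above: the tiny word lists and example comments are invented for illustration and are not the actual LIWC dictionary or the study's data. It computes each category's share of a comment's words, the kind of per-comment measure that can then be compared between CPR and PPR.

    # LIWC-style category counting (illustrative sketch only; the word lists
    # below are hypothetical stand-ins, not the proprietary LIWC dictionary).
    import re

    CATEGORIES = {
        "positive_emotion": {"good", "excellent", "convincing", "clear", "valuable"},
        "negative_emotion": {"weak", "unclear", "flawed", "poor", "wrong"},
    }

    def tokenize(text):
        """Lowercase a comment and split it into word tokens."""
        return re.findall(r"[a-z']+", text.lower())

    def category_rates(text):
        """Return each category's share of all words in the comment, in percent,
        which is how LIWC-style output is typically expressed."""
        tokens = tokenize(text)
        total = len(tokens) or 1
        return {cat: 100.0 * sum(tok in words for tok in tokens) / total
                for cat, words in CATEGORIES.items()}

    # Hypothetical reviewer comments, one per review mode.
    comments = {
        "CPR": "The study is convincing and the presentation is clear and valuable.",
        "PPR": "The argument in Section 3 is unclear and the statistics appear weak.",
    }
    for mode, text in comments.items():
        print(mode, len(tokenize(text)), "words,",
              {k: round(v, 1) for k, v in category_rates(text).items()})

In the study itself, such per-comment category percentages (together with comment length) are what is compared statistically between the two review modes.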

Suggested Citation

  • Lutz Bornmann & Markus Wolf & Hans-Dieter Daniel, 2012. "Closed versus open reviewing of journal manuscripts: how far do comments differ in language use?," Scientometrics, Springer;Akadémiai Kiadó, vol. 91(3), pages 843-856, June.
  • Handle: RePEc:spr:scient:v:91:y:2012:i:3:d:10.1007_s11192-011-0569-5
    DOI: 10.1007/s11192-011-0569-5

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-011-0569-5
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-011-0569-5?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Bornmann, Lutz & Marx, Werner & Schier, Hermann & Rahm, Erhard & Thor, Andreas & Daniel, Hans-Dieter, 2009. "Convergent validity of bibliometric Google Scholar data in the field of chemistry—Citation counts for papers that were accepted by Angewandte Chemie International Edition or rejected but published els," Journal of Informetrics, Elsevier, vol. 3(1), pages 27-35.
    2. James Hartley & James W. Pennebaker & Claire Fox, 2003. "Abstracts, introductions and discussions: How far do they differ in style?," Scientometrics, Springer;Akadémiai Kiadó, vol. 57(3), pages 389-398, July.
    3. Lutz Bornmann & Irina Nast & Hans-Dieter Daniel, 2008. "Do editors and referees look for signs of scientific misconduct when reviewing manuscripts? A quantitative content analysis of studies that examined review criteria and reasons for accepting and rejec," Scientometrics, Springer;Akadémiai Kiadó, vol. 77(3), pages 415-432, December.
    4. Lutz Bornmann & Christoph Neuhaus & Hans-Dieter Daniel, 2011. "The effect of a two-stage publication process on the Journal Impact Factor: a case study on the interactive open access journal Atmospheric Chemistry and Physics," Scientometrics, Springer;Akadémiai Kiadó, vol. 86(1), pages 93-97, January.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Qianjin Zong & Yafen Xie & Jiechun Liang, 2020. "Does open peer review improve citation count? Evidence from a propensity score matching analysis of PeerJ," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 607-623, October.
    2. Cassidy R. Sugimoto & Blaise Cronin, 2013. "Citation gamesmanship: testing for evidence of ego bias in peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 95(3), pages 851-862, June.
    3. Dietmar Wolfram & Peiling Wang & Adam Hembree & Hyoungjoo Park, 2020. "Open peer review: promoting transparency in open science," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 1033-1051, November.
    4. Azzurra Ragone & Katsiaryna Mirylenka & Fabio Casati & Maurizio Marchese, 2013. "On peer review in computer science: analysis of its effectiveness and suggestions for improvement," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(2), pages 317-356, November.
    5. Sahar Vahdati & Said Fathalla & Christoph Lange & Andreas Behrend & Aysegul Say & Zeynep Say & Sören Auer, 2021. "A comprehensive quality assessment framework for scientific events," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 641-682, January.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Meva Bayrak Karsli & Sinem Karabey & Nergiz Ercil Cagiltay & Yuksel Goktas, 2018. "Comparison of the discussion sections of PhD dissertations in educational technology: the case of Turkey and the USA," Scientometrics, Springer;Akadémiai Kiadó, vol. 117(3), pages 1381-1403, December.
    2. Jerome K. Vanclay, 2012. "Impact factor: outdated artefact or stepping-stone to journal certification?," Scientometrics, Springer;Akadémiai Kiadó, vol. 92(2), pages 211-238, August.
    3. Lutz Bornmann & Werner Marx, 2014. "How to evaluate individual researchers working in the natural and life sciences meaningfully? A proposal of methods based on percentiles of citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(1), pages 487-509, January.
    4. Martin-Martin, Alberto & Orduna-Malea, Enrique & Harzing, Anne-Wil & Delgado López-Cózar, Emilio, 2017. "Can we use Google Scholar to identify highly-cited documents?," Journal of Informetrics, Elsevier, vol. 11(1), pages 152-163.
    5. Edoardo Magnone, 2014. "A novel graphical representation of sentence complexity: the description and its application," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(2), pages 1301-1329, February.
    6. Lutz Bornmann & Werner Marx, 2012. "The effect of several versions of one and the same manuscript published by a journal on its journal impact factor," Scientometrics, Springer;Akadémiai Kiadó, vol. 92(2), pages 277-279, August.
    7. Fiorenzo Franceschini & Domenico Maisano, 2011. "Bibliometric positioning of scientific manufacturing journals: a comparative analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 86(2), pages 463-485, February.
    8. Embiya Celik & Nuray Gedik & Güler Karaman & Turgay Demirel & Yuksel Goktas, 2014. "Mistakes encountered in manuscripts on education and their effects on journal rejections," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(3), pages 1837-1853, March.
    9. Louis Mesnard, 2010. "On Hochberg et al.’s “The tragedy of the reviewer commons”," Scientometrics, Springer;Akadémiai Kiadó, vol. 84(3), pages 903-917, September.
    10. Mario Paolucci & Francisco Grimaldo, 2014. "Mechanism change in a simulation of peer review: from junk support to elitism," Scientometrics, Springer;Akadémiai Kiadó, vol. 99(3), pages 663-688, June.
    11. J. W. Fedderke, 2013. "The objectivity of national research foundation peer review in South Africa assessed against bibliometric indexes," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(2), pages 177-206, November.
    12. Mingyang Wang & Jiaqi Zhang & Guangsheng Chen & Kah-Hin Chai, 2019. "Examining the influence of open access on journals’ citation obsolescence by modeling the actual citation process," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(3), pages 1621-1641, June.
    13. Loet Leydesdorff & Daniele Rotolo & Ismael Rafols, 2012. "Bibliometric perspectives on medical innovation using the medical subject Headings of PubMed," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(11), pages 2239-2253, November.
    14. Joost C. F. Winter & Amir A. Zadpoor & Dimitra Dodou, 2014. "The expansion of Google Scholar versus Web of Science: a longitudinal study," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(2), pages 1547-1565, February.
    15. Loet Leydesdorff & Jordan A. Comins & Aaron A. Sorensen & Lutz Bornmann & Iina Hellsten, 2016. "Cited references and Medical Subject Headings (MeSH) as two different knowledge representations: clustering and mappings at the paper level," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2077-2091, December.
    16. Naser Rashidi & Hussein Meihami, 2018. "Informetrics of Scientometrics abstracts: a rhetorical move analysis of the research abstracts published in Scientometrics journal," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(3), pages 1975-1994, September.
    17. Dowling, Michael & Hammami, Helmi & Zreik, Ousayna, 2018. "Easy to read, easy to cite?," Economics Letters, Elsevier, vol. 173(C), pages 100-103.
    18. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    19. Christos Alexakis & Michael Dowling & Konstantinos Eleftheriou & Michael Polemis, 2021. "Textual Machine Learning: An Application to Computational Economics Research," Computational Economics, Springer;Society for Computational Economics, vol. 57(1), pages 369-385, January.
    20. Lutz Bornmann & Christophe Weymuth & Hans-Dieter Daniel, 2010. "A content analysis of referees’ comments: how do comments on manuscripts rejected by a high-impact journal and later published in either a low- or high-impact journal differ?," Scientometrics, Springer;Akadémiai Kiadó, vol. 83(2), pages 493-506, May.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:91:y:2012:i:3:d:10.1007_s11192-011-0569-5. See general information about how to correct material in RePEc.


    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis . RePEc uses bibliographic data supplied by the respective publishers.