
Closed versus open reviewing of journal manuscripts: how far do comments differ in language use?

Author

Listed:
  • Lutz Bornmann (Max Planck Society, Administrative Headquarters)
  • Markus Wolf (University Hospital Heidelberg)
  • Hans-Dieter Daniel (University of Zurich; ETH Zurich)

Abstract

Whereas in traditional, closed peer review (CPR) a few selected scientists (peers) are included in the process of manuscript review, public peer review (PPR) includes, in addition to invited reviewers, a wider circle of scientists who are interested in a manuscript and wish to write a comment on it. In this study, using the data of two comprehensive evaluation studies on the CPR process at Angewandte Chemie International Edition and the PPR process at Atmospheric Chemistry and Physics, we examined the language characteristics of comments written by invited reviewers in CPR and by invited reviewers and interested members of the scientific community in PPR. We used Linguistic Inquiry and Word Count (LIWC), a text analysis software program that counts words in meaningful categories (e.g., positive or negative emotions) using a standardized dictionary. We examined 599 comments from the reviews of 229 manuscripts. The results show that the comments in PPR are much longer than the comments in CPR. This is an indication that PPR reviewing has more of an improvement function and CPR reviewing has more of a selection function. The results also show that CPR is not, as might be expected, more susceptible to the expression of negative emotions than PPR. On the contrary, positive emotion words are used statistically significantly more frequently in CPR than in PPR.
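
The abstract's method rests on dictionary-based category counting. LIWC's standardized dictionary and scoring are not reproduced here; the Python sketch below only illustrates the general principle with invented category word lists (hypothetical entries, not the LIWC dictionary) and is not the authors' implementation.

```python
# Illustrative sketch only: the category word lists here are invented for the example;
# the real LIWC program ships its own validated, standardized dictionary.
import re
from collections import Counter

# Hypothetical mini-dictionary mapping categories to entries; a trailing "*"
# marks a stem that matches any continuation (LIWC-style wildcard).
CATEGORIES = {
    "positive_emotion": ["good", "excellent", "clear*", "improv*"],
    "negative_emotion": ["weak", "unclear", "fail*", "doubt*"],
}

def categorize(token):
    """Return every category whose entries match the token."""
    hits = []
    for category, entries in CATEGORIES.items():
        for entry in entries:
            if entry.endswith("*"):
                if token.startswith(entry[:-1]):
                    hits.append(category)
                    break
            elif token == entry:
                hits.append(category)
                break
    return hits

def category_percentages(text):
    """Count words per category as a percentage of total word count."""
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter(hit for tok in tokens for hit in categorize(tok))
    total = len(tokens)
    return {
        "word_count": total,
        **{cat: (100.0 * counts[cat] / total if total else 0.0) for cat in CATEGORIES},
    }

# Toy comparison of two invented "reviewer comments".
print(category_percentages("The methods are excellent and the figures are clear."))
print(category_percentages("The argument is weak, the discussion unclear, and the model fails."))
```

LIWC likewise reports most categories as a share of total word count, which is why the sketch normalizes the raw counts by the number of tokens rather than reporting them directly.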

Suggested Citation

  • Lutz Bornmann & Markus Wolf & Hans-Dieter Daniel, 2012. "Closed versus open reviewing of journal manuscripts: how far do comments differ in language use?," Scientometrics, Springer;Akadémiai Kiadó, vol. 91(3), pages 843-856, June.
  • Handle: RePEc:spr:scient:v:91:y:2012:i:3:d:10.1007_s11192-011-0569-5
    DOI: 10.1007/s11192-011-0569-5

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-011-0569-5
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Bornmann, Lutz & Marx, Werner & Schier, Hermann & Rahm, Erhard & Thor, Andreas & Daniel, Hans-Dieter, 2009. "Convergent validity of bibliometric Google Scholar data in the field of chemistry—Citation counts for papers that were accepted by Angewandte Chemie International Edition or rejected but published els," Journal of Informetrics, Elsevier, vol. 3(1), pages 27-35.
    2. James Hartley & James W. Pennebaker & Claire Fox, 2003. "Abstracts, introductions and discussions: How far do they differ in style?," Scientometrics, Springer;Akadémiai Kiadó, vol. 57(3), pages 389-398, July.
    3. Lutz Bornmann & Irina Nast & Hans-Dieter Daniel, 2008. "Do editors and referees look for signs of scientific misconduct when reviewing manuscripts? A quantitative content analysis of studies that examined review criteria and reasons for accepting and rejec," Scientometrics, Springer;Akadémiai Kiadó, vol. 77(3), pages 415-432, December.
    4. Lutz Bornmann & Christoph Neuhaus & Hans-Dieter Daniel, 2011. "The effect of a two-stage publication process on the Journal Impact Factor: a case study on the interactive open access journal Atmospheric Chemistry and Physics," Scientometrics, Springer;Akadémiai Kiadó, vol. 86(1), pages 93-97, January.

    Citations

    Citations are extracted by the CitEc Project.

    Cited by:

    1. Cassidy R. Sugimoto & Blaise Cronin, 2013. "Citation gamesmanship: testing for evidence of ego bias in peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 95(3), pages 851-862, June.
    2. Azzurra Ragone & Katsiaryna Mirylenka & Fabio Casati & Maurizio Marchese, 2013. "On peer review in computer science: analysis of its effectiveness and suggestions for improvement," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(2), pages 317-356, November.
