
Quality assessment of scientific manuscripts in peer review and education

Author

Listed:
  • Augusteijn, Hilde Elisabeth Maria (Tilburg University)
  • Wicherts, Jelte M. (Tilburg University)
  • Sijtsma, Klaas
  • van Assen, Marcel A. L. M.

Abstract

We report a vignette study and a survey investigating which study characteristics influence the quality ratings that academics give to articles submitted for publication, and that academics and students give to students’ theses. In the vignette study, 800 respondents evaluated the quality of an abstract describing a study with a small or large sample size, a statistically significant or non-significant result, and either statistical reporting errors or no errors. In the survey, the same participants rated the importance of 29 manuscript characteristics, related to the study’s theory, design, conduct, data analyses, and presentation, for assessing either the quality of a manuscript or its publishability (article) or grade (thesis). Quality ratings were affected by sample size but not by statistical significance or by the presence of statistical reporting errors in the rated research vignette, suggesting that researchers’ assessments of manuscript quality are not responsible for publication bias. Furthermore, academics and students gave highly similar ratings of the importance of the different aspects relevant to the quality assessment of articles and theses, suggesting that students have already adopted quality criteria for scientific manuscripts and that these criteria are similar for submitted manuscripts and theses.

Suggested Citation

  • Augusteijn, Hilde Elisabeth Maria & Wicherts, Jelte M. & Sijtsma, Klaas & van Assen, Marcel A. L. M., 2023. "Quality assessment of scientific manuscripts in peer review and education," OSF Preprints 7dc6a, Center for Open Science.
  • Handle: RePEc:osf:osfxxx:7dc6a
    DOI: 10.31219/osf.io/7dc6a

    Download full text from publisher

    File URL: https://osf.io/download/63b40dc2e48ccc08404fdc77/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/7dc6a?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Lutz Bornmann & Rüdiger Mutz & Hans-Dieter Daniel, 2010. "A Reliability-Generalization Study of Journal Peer Reviews: A Multilevel Meta-Analysis of Inter-Rater Reliability and Its Determinants," PLOS ONE, Public Library of Science, vol. 5(12), pages 1-10, December.
    2. Wicherts, Jelte M. & Veldkamp, Coosje Lisabet Sterre & Augusteijn, Hilde & Bakker, Marjan & van Aert, Robbie Cornelis Maria & van Assen, Marcel A. L. M., 2016. "Degrees of freedom in planning, running, analyzing, and reporting psychological studies: A checklist to avoid p-hacking," OSF Preprints umq8d, Center for Open Science.
    3. Denes Szucs & John P A Ioannidis, 2017. "Empirical assessment of published effect sizes and power in the recent cognitive neuroscience and psychology literature," PLOS Biology, Public Library of Science, vol. 15(3), pages 1-18, March.
    4. Rüdiger Mutz & Lutz Bornmann & Hans-Dieter Daniel, 2012. "Heterogeneity of Inter-Rater Reliabilities of Grant Peer Reviews and Its Determinants: A General Estimating Equations Approach," PLOS ONE, Public Library of Science, vol. 7(10), pages 1-10, October.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Jens Jirschitzka & Aileen Oeberst & Richard Göllner & Ulrike Cress, 2017. "Inter-rater reliability and validity of peer reviews in an interdisciplinary field," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(2), pages 1059-1092, November.
    2. Rüdiger Mutz & Tobias Wolbring & Hans-Dieter Daniel, 2017. "The effect of the “very important paper” (VIP) designation in Angewandte Chemie International Edition on citation impact: A propensity score matching analysis," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 68(9), pages 2139-2153, September.
    3. Patrícia Martinková & Dan Goldhaber & Elena Erosheva, 2018. "Disparities in ratings of internal and external applicants: A case for model-based inter-rater reliability," PLOS ONE, Public Library of Science, vol. 13(10), pages 1-17, October.
    4. Jasper Brinkerink, 2023. "When Shooting for the Stars Becomes Aiming for Asterisks: P-Hacking in Family Business Research," Entrepreneurship Theory and Practice, , vol. 47(2), pages 304-343, March.
    5. Grażyna Wieczorkowska & Katarzyna Kowalczyk, 2021. "Ensuring Sustainable Evaluation: How to Improve Quality of Evaluating Grant Proposals?," Sustainability, MDPI, vol. 13(5), pages 1-11, March.
    6. David G Pina & Darko Hren & Ana Marušić, 2015. "Peer Review Evaluation Process of Marie Curie Actions under EU’s Seventh Framework Programme for Research," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-15, June.
    7. Feliciani, Thomas & Morreau, Michael & Luo, Junwen & Lucas, Pablo & Shankar, Kalpana, 2022. "Designing grant-review panels for better funding decisions: Lessons from an empirically calibrated simulation model," Research Policy, Elsevier, vol. 51(4).
    8. Oliver Braganza, 2020. "A simple model suggesting economically rational sample-size choice drives irreproducibility," PLOS ONE, Public Library of Science, vol. 15(3), pages 1-19, March.
    9. Lutz Bornmann, 2015. "Interrater reliability and convergent validity of F1000Prime peer review," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(12), pages 2415-2426, December.
    10. Steven Wooding & Thed N Van Leeuwen & Sarah Parks & Shitij Kapur & Jonathan Grant, 2015. "UK Doubles Its “World-Leading” Research in Life Sciences and Medicine in Six Years: Testing the Claim?," PLOS ONE, Public Library of Science, vol. 10(7), pages 1-10, July.
    11. Freuli, Francesca & Held, Leonhard & Heyard, Rachel, 2022. "Replication Success under Questionable Research Practices - A Simulation Study," I4R Discussion Paper Series 2, The Institute for Replication (I4R).
    12. Laura Muñoz-Bermejo & Jorge Pérez-Gómez & Fernando Manzano & Daniel Collado-Mateo & Santos Villafaina & José C Adsuar, 2019. "Reliability of isokinetic knee strength measurements in children: A systematic review and meta-analysis," PLOS ONE, Public Library of Science, vol. 14(12), pages 1-15, December.
    13. Thibaut Arpinon & Romain Espinosa, 2023. "A Practical Guide to Registered Reports for Economists," Post-Print halshs-03897719, HAL.
    14. David A. M. Peterson, 2020. "Dear Reviewer 2: Go F’ Yourself," Social Science Quarterly, Southwestern Social Science Association, vol. 101(4), pages 1648-1652, July.
    15. Vincent Chandler, 2019. "Identifying emerging scholars: seeing through the crystal ball of scholarship selection committees," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(1), pages 39-56, July.
    16. Kathryn N. Vasilaky & J. Michelle Brock, 2020. "Power(ful) guidelines for experimental economists," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 6(2), pages 189-212, December.
    17. Shaw, Steven D. & Nave, Gideon, 2023. "Don't hate the player, hate the game: Realigning incentive structures to promote robust science and better scientific practices in marketing," Journal of Business Research, Elsevier, vol. 167(C).
    18. Lei Li & Yan Wang & Guanfeng Liu & Meng Wang & Xindong Wu, 2015. "Context-Aware Reviewer Assignment for Trust Enhanced Peer Review," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-28, June.
    19. Pengfei Jia & Weixi Xie & Guangyao Zhang & Xianwen Wang, 2023. "Do reviewers get their deserved acknowledgments from the authors of manuscripts?," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(10), pages 5687-5703, October.
    20. Schweinsberg, Martin & Feldman, Michael & Staub, Nicola & van den Akker, Olmo R. & van Aert, Robbie C.M. & van Assen, Marcel A.L.M. & Liu, Yang & Althoff, Tim & Heer, Jeffrey & Kale, Alex & Mohamed, Z, 2021. "Same data, different conclusions: Radical dispersion in empirical results when independent analysts operationalize and test the same hypothesis," Organizational Behavior and Human Decision Processes, Elsevier, vol. 165(C), pages 228-249.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:osfxxx:7dc6a. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF (email available below). General contact details of provider: https://osf.io/preprints/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.