Printed from https://ideas.repec.org/p/osf/osfxxx/7dc6a.html

Quality assessment of scientific manuscripts in peer review and education

Authors

  • Augusteijn, Hilde Elisabeth Maria (Tilburg University)
  • Wicherts, Jelte M. (Tilburg University)
  • Sijtsma, Klaas
  • van Assen, Marcel A. L. M.

Abstract

We report a vignette study and a survey investigating which study characteristics influence the quality ratings that academics give to articles submitted for publication, and that academics and students give to students’ theses. In the vignette study, 800 respondents evaluated the quality of an abstract describing a study with a small or large sample size, statistically significant or non-significant results, and statistical reporting errors or no errors. In the survey, the same participants rated the importance of 29 manuscript characteristics, related to the study’s theory, design, conduct, data analyses, and presentation, for assessing either the quality of a manuscript or its publishability (article) or grade (thesis). Quality ratings were affected by sample size but not by statistical significance or by the presence of statistical reporting errors in the rated research vignette. These results suggest that researchers’ assessments of manuscript quality are not responsible for publication bias. Furthermore, academics and students provided highly similar ratings of the importance of the different aspects relevant to quality assessment of articles and theses. These results suggest that students have already adopted quality criteria for scientific manuscripts, and that these criteria are similar for submitted manuscripts and theses.

Suggested Citation

  • Augusteijn, Hilde Elisabeth Maria & Wicherts, Jelte M. & Sijtsma, Klaas & van Assen, Marcel A. L. M., 2023. "Quality assessment of scientific manuscripts in peer review and education," OSF Preprints 7dc6a, Center for Open Science.
  • Handle: RePEc:osf:osfxxx:7dc6a
    DOI: 10.31219/osf.io/7dc6a

    Download full text from publisher

    File URL: https://osf.io/download/63b40dc2e48ccc08404fdc77/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/7dc6a?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Rüdiger Mutz & Lutz Bornmann & Hans-Dieter Daniel, 2012. "Heterogeneity of Inter-Rater Reliabilities of Grant Peer Reviews and Its Determinants: A General Estimating Equations Approach," PLOS ONE, Public Library of Science, vol. 7(10), pages 1-10, October.
    2. Wicherts, Jelte M. & Veldkamp, Coosje Lisabet Sterre & Augusteijn, Hilde & Bakker, Marjan & van Aert, Robbie Cornelis Maria & van Assen, Marcel A. L. M., 2016. "Degrees of freedom in planning, running, analyzing, and reporting psychological studies: A checklist to avoid p-hacking," OSF Preprints umq8d, Center for Open Science.
    3. Lutz Bornmann & Rüdiger Mutz & Hans-Dieter Daniel, 2010. "A Reliability-Generalization Study of Journal Peer Reviews: A Multilevel Meta-Analysis of Inter-Rater Reliability and Its Determinants," PLOS ONE, Public Library of Science, vol. 5(12), pages 1-10, December.
    4. Denes Szucs & John P A Ioannidis, 2017. "Empirical assessment of published effect sizes and power in the recent cognitive neuroscience and psychology literature," PLOS Biology, Public Library of Science, vol. 15(3), pages 1-18, March.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Rüdiger Mutz & Tobias Wolbring & Hans-Dieter Daniel, 2017. "The effect of the “very important paper” (VIP) designation in Angewandte Chemie International Edition on citation impact: A propensity score matching analysis," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 68(9), pages 2139-2153, September.
    2. Jens Jirschitzka & Aileen Oeberst & Richard Göllner & Ulrike Cress, 2017. "Inter-rater reliability and validity of peer reviews in an interdisciplinary field," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(2), pages 1059-1092, November.
    3. Kleber Neves & Pedro B Tan & Olavo B Amaral, 2022. "Are most published research findings false in a continuous universe?," PLOS ONE, Public Library of Science, vol. 17(12), pages 1-18, December.
    4. Felix Holzmeister & Magnus Johannesson & Robert Böhm & Anna Dreber & Jürgen Huber & Michael Kirchler, 2023. "Heterogeneity in effect size estimates: Empirical evidence and practical implications," Working Papers 2023-17, Faculty of Economics and Statistics, Universität Innsbruck.
    5. Patrícia Martinková & František Bartoš & Marek Brabec, 2023. "Assessing Inter-rater Reliability With Heterogeneous Variance Components Models: Flexible Approach Accounting for Contextual Variables," Journal of Educational and Behavioral Statistics, vol. 48(3), pages 349-383, June.
    6. Bradford Demarest & Guo Freeman & Cassidy R. Sugimoto, 2014. "The reviewer in the mirror: examining gendered and ethnicized notions of reciprocity in peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(1), pages 717-735, October.
    7. Elena A. Erosheva & Patrícia Martinková & Carole J. Lee, 2021. "When zero may not be zero: A cautionary note on the use of inter‐rater reliability in evaluating grant peer review," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(3), pages 904-919, July.
    8. Myanca Rodrigues & Jordan Edwards & Tea Rosic & Yanchen Wang & Jhalok Ronjan Talukdar & Saifur R Chowdhury & Sameer Parpia & Glenda Babe & Claire de Oliveira & Richard Perez & Zainab Samaan & Lehana T, 2025. "A tutorial on the what, why, and how of Bayesian analysis: Estimating mood and anxiety disorder prevalence using a Canadian data linkage study," PLOS Mental Health, Public Library of Science, vol. 2(2), pages 1-25, February.
    9. Maximilian M Mandl & Sabine Hoffmann & Sebastian Bieringer & Anna E Jacob & Marie Kraft & Simon Lemster & Anne-Laure Boulesteix, 2024. "Raising awareness of uncertain choices in empirical data analysis: A teaching concept toward replicable research practices," PLOS Computational Biology, Public Library of Science, vol. 20(3), pages 1-10, March.
    10. Patrícia Martinková & Dan Goldhaber & Elena Erosheva, 2018. "Disparities in ratings of internal and external applicants: A case for model-based inter-rater reliability," PLOS ONE, Public Library of Science, vol. 13(10), pages 1-17, October.
    11. Jasper Brinkerink, 2023. "When Shooting for the Stars Becomes Aiming for Asterisks: P-Hacking in Family Business Research," Entrepreneurship Theory and Practice, vol. 47(2), pages 304-343, March.
    12. Müge Simsek & Mathijs Vaan & Arnout Rijt, 2024. "Do grant proposal texts matter for funding decisions? A field experiment," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(5), pages 2521-2532, May.
    13. Rinne, Sonja, 2024. "Estimating the merit-order effect using coarsened exact matching: Reconciling theory with the empirical results to improve policy implications," Energy Policy, Elsevier, vol. 185(C).
    14. Peter Vaz da Fonseca & Andrea Decourt Savelli & Michele Nascimento Juca, 2020. "A Systematic Review of the Influence of Taxation on Corporate Capital Structure," International Journal of Economics & Business Administration (IJEBA), vol. 0(2), pages 155-178.
    15. Chin, Jason & Zeiler, Kathryn, 2021. "Replicability in Empirical Legal Research," LawArchive 2b5k4_v1, Center for Open Science.
    16. Andreas Schneck, 2023. "Are most published research findings false? Trends in statistical power, publication selection bias, and the false discovery rate in psychology (1975–2017)," PLOS ONE, Public Library of Science, vol. 18(10), pages 1-18, October.
    17. Rüdiger Mutz & Lutz Bornmann & Hans-Dieter Daniel, 2012. "Heterogeneity of Inter-Rater Reliabilities of Grant Peer Reviews and Its Determinants: A General Estimating Equations Approach," PLOS ONE, Public Library of Science, vol. 7(10), pages 1-10, October.
    18. Uddin, Shahadat & Khan, Arif & Lu, Haohui, 2023. "Impact of COVID-19 on Journal Impact Factor," Journal of Informetrics, Elsevier, vol. 17(4).
    19. Kathryn Vasilaky & Sofía Martínez Sáenz & Radost Stanimirova & Daniel Osgood, 2020. "Perceptions of Farm Size Heterogeneity and Demand for Group Index Insurance," Games, MDPI, vol. 11(1), pages 1-21, March.
    20. Hans van Dijk & Marino van Zelst, 2020. "Comfortably Numb? Researchers’ Satisfaction with the Publication System and a Proposal for Radical Change," Publications, MDPI, vol. 8(1), pages 1-20, March.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:osfxxx:7dc6a. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF (email available below). General contact details of provider: https://osf.io/preprints/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.