Printed from https://ideas.repec.org/a/eee/infome/v2y2008i4p280-287.html

How to detect indications of potential sources of bias in peer review: A generalized latent variable modeling approach exemplified by a gender study

Author

Listed:
  • Bornmann, Lutz
  • Mutz, Rüdiger
  • Daniel, Hans-Dieter

Abstract

The universalism norm of the ethos of science requires that contributions to science not be excluded on the basis of the contributors’ gender, nationality, social status, or other irrelevant criteria. Here, a generalized latent variable modeling approach is presented that grant program managers at a funding organization can use to obtain indications of potential sources of bias in their peer review process (such as the applicants’ gender). To implement the method, the data required are the numbers of approved and rejected grant applicants in different groups (for example, women and men, or natural and social scientists). Using the generalized latent variable modeling approach, indications of potential sources of bias can be examined not only in grant peer review but also in journal peer review.
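The abstract notes that the method only needs approved/rejected counts per applicant group. The paper's actual approach is a generalized latent variable model (fit with software such as GLLAMM); as a much simpler illustrative stand-in, the sketch below computes an approval odds ratio and a Pearson chi-square statistic from a 2x2 table of hypothetical counts (the counts and the function name are assumptions, not from the paper):

```python
# Minimal sketch, NOT the paper's generalized latent variable model:
# a 2x2 contingency-table check on approved/rejected counts by group.

def two_by_two_stats(a, b, c, d):
    """a, b = approved/rejected in group 1; c, d = approved/rejected in group 2.

    Returns the Pearson chi-square statistic (1 df) and the approval
    odds ratio of group 1 relative to group 2.
    """
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    odds_ratio = (a * d) / (b * c)
    return chi2, odds_ratio

# Hypothetical counts (for illustration only):
# 40 of 100 women approved, 60 of 100 men approved.
chi2, oratio = two_by_two_stats(40, 60, 60, 40)
print(f"chi-square = {chi2:.2f}, odds ratio = {oratio:.3f}")
```

A chi-square statistic this large on 1 degree of freedom would flag a group difference worth investigating; the latent variable model in the paper goes further by separating such differences from other sources of variation.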

Suggested Citation

  • Bornmann, Lutz & Mutz, Rüdiger & Daniel, Hans-Dieter, 2008. "How to detect indications of potential sources of bias in peer review: A generalized latent variable modeling approach exemplified by a gender study," Journal of Informetrics, Elsevier, vol. 2(4), pages 280-287.
  • Handle: RePEc:eee:infome:v:2:y:2008:i:4:p:280-287
    DOI: 10.1016/j.joi.2008.09.003

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S1751157708000485
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.joi.2008.09.003?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Sophia Rabe-Hesketh & Anders Skrondal & Andrew Pickles, 2004. "GLLAMM Manual," U.C. Berkeley Division of Biostatistics Working Paper Series 1160, Berkeley Electronic Press.
    2. Lutz Bornmann & Hans-Dieter Daniel, 2005. "Committee peer review at an international research foundation: predictive validity and fairness of selection decisions on post-graduate fellowship applications," Research Evaluation, Oxford University Press, vol. 14(1), pages 15-20, April.
    3. Lutz Bornmann & Hans-Dieter Daniel, 2006. "Potential sources of bias in research fellowship assessments: effects of university prestige and field of study," Research Evaluation, Oxford University Press, vol. 15(3), pages 209-219, December.
    4. Upali W. Jayasinghe & Herbert W. Marsh & Nigel Bond, 2003. "A multilevel cross‐classified modelling approach to peer review of grant proposals: the effects of assessor and researcher attributes on assessor ratings," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 166(3), pages 279-300, October.
    5. Bornmann, Lutz & Mutz, Rüdiger & Daniel, Hans-Dieter, 2007. "Gender differences in grant peer review: A meta-analysis," Journal of Informetrics, Elsevier, vol. 1(3), pages 226-238.
    6. Bornmann, Lutz & Daniel, Hans-Dieter, 2007. "Gatekeepers of science—Effects of external reviewers’ attributes on the assessments of fellowship applications," Journal of Informetrics, Elsevier, vol. 1(1), pages 83-91.
    7. Lutz Bornmann & Hans-Dieter Daniel, 2005. "Selection of research fellowship recipients by committee peer review. Reliability, fairness and predictive validity of Board of Trustees' decisions," Scientometrics, Springer;Akadémiai Kiadó, vol. 63(2), pages 297-320, April.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Corsi, Marcella & D’Ippoliti, Carlo & Zacchia, Giulia, 2019. "Diversity of backgrounds and ideas: The case of research evaluation in economics," Research Policy, Elsevier, vol. 48(9), pages 1-1.
    2. Bradford Demarest & Guo Freeman & Cassidy R. Sugimoto, 2014. "The reviewer in the mirror: examining gendered and ethnicized notions of reciprocity in peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(1), pages 717-735, October.
    3. Azzurra Ragone & Katsiaryna Mirylenka & Fabio Casati & Maurizio Marchese, 2013. "On peer review in computer science: analysis of its effectiveness and suggestions for improvement," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(2), pages 317-356, November.
    4. Cruz-Castro, Laura & Sanz-Menendez, Luis, 2021. "What should be rewarded? Gender and evaluation criteria for tenure and promotion," Journal of Informetrics, Elsevier, vol. 15(3).
    5. Squazzoni, Flaminio & Gandelli, Claudio, 2012. "Saint Matthew strikes again: An agent-based model of peer review and the scientific community structure," Journal of Informetrics, Elsevier, vol. 6(2), pages 265-275.
    6. Marjolijn N. Wijnen & Jorg J. M. Massen & Mariska E. Kret, 2021. "Gender bias in the allocation of student grants," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(7), pages 5477-5488, July.
    7. Carole J. Lee & Cassidy R. Sugimoto & Guo Zhang & Blaise Cronin, 2013. "Bias in peer review," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(1), pages 2-17, January.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Bornmann, Lutz & Mutz, Rüdiger & Daniel, Hans-Dieter, 2008. "Latent Markov modeling applied to grant peer review," Journal of Informetrics, Elsevier, vol. 2(3), pages 217-228.
    2. Carole J. Lee & Cassidy R. Sugimoto & Guo Zhang & Blaise Cronin, 2013. "Bias in peer review," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(1), pages 2-17, January.
    3. Marsh, Herbert W. & Jayasinghe, Upali W. & Bond, Nigel W., 2011. "Gender differences in peer reviews of grant applications: A substantive-methodological synergy in support of the null hypothesis model," Journal of Informetrics, Elsevier, vol. 5(1), pages 167-180.
    4. Materia, V.C. & Pascucci, S. & Kolympiris, C., 2015. "Understanding the selection processes of public research projects in agriculture: The role of scientific merit," Food Policy, Elsevier, vol. 56(C), pages 87-99.
    5. Rüdiger Mutz & Lutz Bornmann & Hans-Dieter Daniel, 2015. "Testing for the fairness and predictive validity of research funding decisions: A multilevel multiple imputation for missing data approach using ex-ante and ex-post peer evaluation data from the Austr," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(11), pages 2321-2339, November.
    6. Albert Banal-Estañol & Qianshuo Liu & Inés Macho-Stadler & David Pérez-Castrillo, 2021. "Similar-to-me Effects in the Grant Application Process: Applicants, Panelists, and the Likelihood of Obtaining Funds," Working Papers 1289, Barcelona School of Economics.
    7. Bornmann, Lutz & Daniel, Hans-Dieter, 2007. "Gatekeepers of science—Effects of external reviewers’ attributes on the assessments of fellowship applications," Journal of Informetrics, Elsevier, vol. 1(1), pages 83-91.
    8. Bar-Ilan, Judit, 2008. "Informetrics at the beginning of the 21st century—A review," Journal of Informetrics, Elsevier, vol. 2(1), pages 1-52.
    9. Stephen A Gallo & Joanne H Sullivan & Scott R Glisson, 2016. "The Influence of Peer Reviewer Expertise on the Evaluation of Research Funding Applications," PLOS ONE, Public Library of Science, vol. 11(10), pages 1-18, October.
    10. Mario Paolucci & Francisco Grimaldo, 2014. "Mechanism change in a simulation of peer review: from junk support to elitism," Scientometrics, Springer;Akadémiai Kiadó, vol. 99(3), pages 663-688, June.
    11. Jens Jirschitzka & Aileen Oeberst & Richard Göllner & Ulrike Cress, 2017. "Inter-rater reliability and validity of peer reviews in an interdisciplinary field," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(2), pages 1059-1092, November.
    12. Bornmann, Lutz & Daniel, Hans-Dieter, 2007. "Convergent validation of peer review decisions using the h index," Journal of Informetrics, Elsevier, vol. 1(3), pages 204-213.
    13. Qurat-ul Ain & Hira Riaz & Muhammad Tanvir Afzal, 2019. "Evaluation of h-index and its citation intensity based variants in the field of mathematics," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(1), pages 187-211, April.
    14. Azzurra Ragone & Katsiaryna Mirylenka & Fabio Casati & Maurizio Marchese, 2013. "On peer review in computer science: analysis of its effectiveness and suggestions for improvement," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(2), pages 317-356, November.
    15. Dekel Omer & Schurr Amos, 2014. "Cognitive Biases in Government Procurement – An Experimental Study," Review of Law & Economics, De Gruyter, vol. 10(2), pages 1-32, July.
    16. Bol, Thijs & de Vaan, Mathijs & van de Rijt, Arnout, 2022. "Gender-equal funding rates conceal unequal evaluations," Research Policy, Elsevier, vol. 51(1).
    17. Linton, Jonathan D., 2016. "Improving the Peer review process: Capturing more information and enabling high-risk/high-return research," Research Policy, Elsevier, vol. 45(9), pages 1936-1938.
    18. Bornmann, Lutz & Mutz, Rüdiger & Daniel, Hans-Dieter, 2007. "Gender differences in grant peer review: A meta-analysis," Journal of Informetrics, Elsevier, vol. 1(3), pages 226-238.
    19. Vieira, Elizabeth S. & Cabral, José A.S. & Gomes, José A.N.F., 2014. "How good is a model based on bibliometric indicators in predicting the final decisions made by peers?," Journal of Informetrics, Elsevier, vol. 8(2), pages 390-405.
    20. Stephen A Gallo & Afton S Carpenter & Scott R Glisson, 2013. "Teleconference versus Face-to-Face Scientific Peer Review of Grant Application: Effects on Review Outcomes," PLOS ONE, Public Library of Science, vol. 8(8), pages 1-9, August.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:infome:v:2:y:2008:i:4:p:280-287. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/joi.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.