
A multilevel cross‐classified modelling approach to peer review of grant proposals: the effects of assessor and researcher attributes on assessor ratings

Citations

Citations are extracted by the CitEc Project.

Cited by:

  1. Yuetong Chen & Hao Wang & Baolong Zhang & Wei Zhang, 2022. "A method of measuring the article discriminative capacity and its distribution," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(6), pages 3317-3341, June.
  2. Teplitskiy, Misha & Acuna, Daniel & Elamrani-Raoult, Aïda & Körding, Konrad & Evans, James, 2018. "The sociology of scientific validity: How professional networks shape judgement in peer review," Research Policy, Elsevier, vol. 47(9), pages 1825-1841.
  3. David G Pina & Darko Hren & Ana Marušić, 2015. "Peer Review Evaluation Process of Marie Curie Actions under EU’s Seventh Framework Programme for Research," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-15, June.
  4. Stephen A Gallo & Afton S Carpenter & Scott R Glisson, 2013. "Teleconference versus Face-to-Face Scientific Peer Review of Grant Application: Effects on Review Outcomes," PLOS ONE, Public Library of Science, vol. 8(8), pages 1-9, August.
  5. Bol, Thijs & de Vaan, Mathijs & van de Rijt, Arnout, 2022. "Gender-equal funding rates conceal unequal evaluations," Research Policy, Elsevier, vol. 51(1).
  6. Bayindir, Esra Eren & Gurdal, Mehmet Yigit & Saglam, Ismail, 2019. "A Game Theoretic Approach to Peer Review of Grant Proposals," Journal of Informetrics, Elsevier, vol. 13(4).
  7. Albert Banal-Estañol & Qianshuo Liu & Inés Macho-Stadler & David Pérez-Castrillo, 2021. "Similar-to-me Effects in the Grant Application Process: Applicants, Panelists, and the Likelihood of Obtaining Funds," Working Papers 1289, Barcelona School of Economics.
  8. Manuel Bagues & Mauro Sylos-Labini & Natalia Zinovyeva, 2017. "Does the Gender Composition of Scientific Committees Matter?," American Economic Review, American Economic Association, vol. 107(4), pages 1207-1238, April.
  9. Bornmann, Lutz & Mutz, Rüdiger & Daniel, Hans-Dieter, 2008. "Latent Markov modeling applied to grant peer review," Journal of Informetrics, Elsevier, vol. 2(3), pages 217-228.
  10. Jens Jirschitzka & Aileen Oeberst & Richard Göllner & Ulrike Cress, 2017. "Inter-rater reliability and validity of peer reviews in an interdisciplinary field," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(2), pages 1059-1092, November.
  11. Stephen A Gallo & Joanne H Sullivan & Scott R Glisson, 2016. "The Influence of Peer Reviewer Expertise on the Evaluation of Research Funding Applications," PLOS ONE, Public Library of Science, vol. 11(10), pages 1-18, October.
  12. Bornmann, Lutz & Mutz, Rüdiger & Hug, Sven E. & Daniel, Hans-Dieter, 2011. "A multilevel meta-analysis of studies reporting correlations between the h index and 37 different h index variants," Journal of Informetrics, Elsevier, vol. 5(3), pages 346-359.
  13. Gaëlle Vallée-Tourangeau & Ana Wheelock & Tushna Vandrevala & Priscilla Harries, 2022. "Peer reviewers’ dilemmas: a qualitative exploration of decisional conflict in the evaluation of grant applications in the medical humanities and social sciences," Palgrave Communications, Palgrave Macmillan, vol. 9(1), pages 1-11, December.
  14. Elena A. Erosheva & Patrícia Martinková & Carole J. Lee, 2021. "When zero may not be zero: A cautionary note on the use of inter‐rater reliability in evaluating grant peer review," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(3), pages 904-919, July.
  15. Marco Seeber & Jef Vlegels & Mattia Cattaneo, 2022. "Conditions that do or do not disadvantage interdisciplinary research proposals in project evaluation," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 73(8), pages 1106-1126, August.
  16. Linton, Jonathan D., 2016. "Improving the Peer review process: Capturing more information and enabling high-risk/high-return research," Research Policy, Elsevier, vol. 45(9), pages 1936-1938.
  17. Kevin J. Boudreau & Eva C. Guinan & Karim R. Lakhani & Christoph Riedl, 2016. "Looking Across and Looking Beyond the Knowledge Frontier: Intellectual Distance, Novelty, and Resource Allocation in Science," Management Science, INFORMS, vol. 62(10), pages 2765-2783, October.
  18. Cassidy R. Sugimoto & Blaise Cronin, 2013. "Citation gamesmanship: testing for evidence of ego bias in peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 95(3), pages 851-862, June.
  19. Mario Paolucci & Francisco Grimaldo, 2014. "Mechanism change in a simulation of peer review: from junk support to elitism," Scientometrics, Springer;Akadémiai Kiadó, vol. 99(3), pages 663-688, June.
  20. Wiltrud Kuhlisch & Magnus Roos & Jörg Rothe & Joachim Rudolph & Björn Scheuermann & Dietrich Stoyan, 2016. "A statistical approach to calibrating the scores of biased reviewers of scientific papers," Metrika: International Journal for Theoretical and Applied Statistics, Springer, vol. 79(1), pages 37-57, January.
  21. Wen Luo & Oi-man Kwok, 2012. "The Consequences of Ignoring Individuals' Mobility in Multilevel Growth Models," Journal of Educational and Behavioral Statistics, vol. 37(1), pages 31-56, February.
  22. Patrícia Martinková & Dan Goldhaber & Elena Erosheva, 2018. "Disparities in ratings of internal and external applicants: A case for model-based inter-rater reliability," PLOS ONE, Public Library of Science, vol. 13(10), pages 1-17, October.
  23. Marsh, Herbert W. & Jayasinghe, Upali W. & Bond, Nigel W., 2011. "Gender differences in peer reviews of grant applications: A substantive-methodological synergy in support of the null hypothesis model," Journal of Informetrics, Elsevier, vol. 5(1), pages 167-180.
  24. Bornmann, Lutz & Mutz, Rüdiger & Daniel, Hans-Dieter, 2008. "How to detect indications of potential sources of bias in peer review: A generalized latent variable modeling approach exemplified by a gender study," Journal of Informetrics, Elsevier, vol. 2(4), pages 280-287.
  25. Vieira, Elizabeth S. & Cabral, José A.S. & Gomes, José A.N.F., 2014. "How good is a model based on bibliometric indicators in predicting the final decisions made by peers?," Journal of Informetrics, Elsevier, vol. 8(2), pages 390-405.
  26. Miguel Navascués & Costantino Budroni, 2019. "Theoretical research without projects," PLOS ONE, Public Library of Science, vol. 14(3), pages 1-35, March.
  27. Rüdiger Mutz & Lutz Bornmann & Hans-Dieter Daniel, 2015. "Testing for the fairness and predictive validity of research funding decisions: A multilevel multiple imputation for missing data approach using ex-ante and ex-post peer evaluation data from the Austr," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(11), pages 2321-2339, November.
  28. Wen Luo & Oi-Man Kwok, 2010. "Proportional Reduction of Prediction Error in Cross-Classified Random Effects Models," Sociological Methods & Research, vol. 39(2), pages 188-205, November.
  29. Carole J. Lee & Cassidy R. Sugimoto & Guo Zhang & Blaise Cronin, 2013. "Bias in peer review," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(1), pages 2-17, January.
IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.