Printed from https://ideas.repec.org/a/pal/palcom/v9y2022i1d10.1057_s41599-022-01050-6.html

Peer reviewers’ dilemmas: a qualitative exploration of decisional conflict in the evaluation of grant applications in the medical humanities and social sciences

Author

Listed:
  • Gaëlle Vallée-Tourangeau

    (Kingston Business School, Kingston University)

  • Ana Wheelock

    (Kingston Business School, Kingston University)

  • Tushna Vandrevala

    (University of London)

  • Priscilla Harries

    (University of London)

Abstract

Independent evaluations of grant applications by subject experts are an important part of the peer-review system. However, little is known about the real-time experiences of peer reviewers, the experts who review a grant application independently. This study sought to gain insight into this stage of the grant evaluation process by observing how experts conduct an independent review in near real time. Using the think-aloud approach and the Critical Decision Method of interviewing, in-depth interviews were conducted with 16 peer reviewers from a range of roles and disciplines within the medical humanities and social sciences. Participants were asked to think aloud while reviewing applications to different grant schemes from a single prestigious funder. The analysis shows reviewers encountered five dilemmas during the evaluation process. These dilemmas concerned whether or not one should (1) accept an invitation to review, (2) rely exclusively on the information presented in the application, (3) pay attention to institutional prestige, (4) offer comments on aspects outside one’s own area of expertise, and (5) take risks and overlook shortcomings rather than err on the side of caution. To decide on the appropriate course of action, reviewers often engaged in a series of deliberations and trade-offs, varying in length and complexity. Their interpretation of what was ‘right’ was influenced by their values, preferences and experiences, but also by relevant norms and their understanding of the funder’s guidelines and priorities. As a result, the way reviewers approached the identified dilemmas was idiosyncratic and sometimes diametrically opposed to other reviewers’ views, which could lead to variation in peer-review outcomes. The dilemmas we have uncovered suggest that peer reviewers engage in thoughtful consideration during the peer-review process. We should, therefore, be wary of reducing the absence of consensus to the product of biased, instinctive thinking. Rather, these findings highlight the diversity of values, priorities, habits and ways of working each reviewer brings to the fore when reviewing applicants and their project proposals, and they call for further reflection on, and study of, this “invisible work” to better understand and continue to improve the peer-review process.

Suggested Citation

  • Gaëlle Vallée-Tourangeau & Ana Wheelock & Tushna Vandrevala & Priscilla Harries, 2022. "Peer reviewers’ dilemmas: a qualitative exploration of decisional conflict in the evaluation of grant applications in the medical humanities and social sciences," Palgrave Communications, Palgrave Macmillan, vol. 9(1), pages 1-11, December.
  • Handle: RePEc:pal:palcom:v:9:y:2022:i:1:d:10.1057_s41599-022-01050-6
    DOI: 10.1057/s41599-022-01050-6

    Download full text from publisher

    File URL: http://link.springer.com/10.1057/s41599-022-01050-6
    File Function: Abstract
    Download Restriction: Access to full text is restricted to subscribers.

    File URL: https://libkey.io/10.1057/s41599-022-01050-6?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Viner, Neil & Powell, Philip & Green, Rod, 2004. "Institutionalized biases in the award of research grants: a preliminary analysis revisiting the principle of accumulative advantage," Research Policy, Elsevier, vol. 33(3), pages 443-454, April.
    2. Meike Olbrecht & Lutz Bornmann, 2010. "Panel peer review of grant applications: what do we know from research in social psychology on judgment and decision-making in groups?," Research Evaluation, Oxford University Press, vol. 19(4), pages 293-304, October.
    3. Upali W. Jayasinghe & Herbert W. Marsh & Nigel Bond, 2003. "A multilevel cross‐classified modelling approach to peer review of grant proposals: the effects of assessor and researcher attributes on assessor ratings," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 166(3), pages 279-300, October.
    4. Irwin Feller, 2006. "Multiple actors, multiple settings, multiple criteria: issues in assessing interdisciplinary research," Research Evaluation, Oxford University Press, vol. 15(1), pages 5-15, April.
    5. Albert N. Link & Nicholas S. Vonortas (ed.), 2013. "Handbook on the Theory and Practice of Program Evaluation," Books, Edward Elgar Publishing, number 14384.
    6. Carole J. Lee & Cassidy R. Sugimoto & Guo Zhang & Blaise Cronin, 2013. "Bias in peer review," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 64(1), pages 2-17, January.
    7. Liv Langfeldt, 2006. "The policy challenges of peer review: managing bias, conflict of interests and interdisciplinary assessments," Research Evaluation, Oxford University Press, vol. 15(1), pages 31-41, April.
    8. Stephen A Gallo & Joanne H Sullivan & Scott R Glisson, 2016. "The Influence of Peer Reviewer Expertise on the Evaluation of Research Funding Applications," PLOS ONE, Public Library of Science, vol. 11(10), pages 1-18, October.
    9. Esser, James K., 1998. "Alive and Well after 25 Years: A Review of Groupthink Research," Organizational Behavior and Human Decision Processes, Elsevier, vol. 73(2-3), pages 116-141, February.
    10. van den Besselaar, Peter & Sandström, Ulf, 2015. "Early career grants, performance, and careers: A study on predictive validity of grant decisions," Journal of Informetrics, Elsevier, vol. 9(4), pages 826-838.
    11. Hendy Abdoul & Christophe Perrey & Florence Tubach & Philippe Amiel & Isabelle Durand-Zaleski & Corinne Alberti, 2012. "Non-Financial Conflicts of Interest in Academic Grant Evaluation: A Qualitative Study of Multiple Stakeholders in France," PLOS ONE, Public Library of Science, vol. 7(4), pages 1-10, April.
    12. Messick, David M., 1999. "Alternative logics for decision making in social settings," Journal of Economic Behavior & Organization, Elsevier, vol. 39(1), pages 11-28, May.
    13. Jennifer Dykema & John Stevenson & Lisa Klein & Yujin Kim & Brendan Day, "undated". "Effects of E-Mailed Versus Mailed Invitations and Incentives on Response Rates, Data Quality, and Costs in a Web Survey of University Faculty," Mathematica Policy Research Reports d3e2eb6640e040ee943fd8b80, Mathematica Policy Research.
    14. Gary Charness & Matthias Sutter, 2012. "Groups Make Better Self-Interested Decisions," Journal of Economic Perspectives, American Economic Association, vol. 26(3), pages 157-176, Summer.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Kevin J. Boudreau & Eva C. Guinan & Karim R. Lakhani & Christoph Riedl, 2016. "Looking Across and Looking Beyond the Knowledge Frontier: Intellectual Distance, Novelty, and Resource Allocation in Science," Management Science, INFORMS, vol. 62(10), pages 2765-2783, October.
    2. Stephen A Gallo & Joanne H Sullivan & Scott R Glisson, 2016. "The Influence of Peer Reviewer Expertise on the Evaluation of Research Funding Applications," PLOS ONE, Public Library of Science, vol. 11(10), pages 1-18, October.
    3. Lawson, Cornelia & Salter, Ammon, 2023. "Exploring the effect of overlapping institutional applications on panel decision-making," Research Policy, Elsevier, vol. 52(9).
    4. Yuetong Chen & Hao Wang & Baolong Zhang & Wei Zhang, 2022. "A method of measuring the article discriminative capacity and its distribution," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(6), pages 3317-3341, June.
    5. Wiltrud Kuhlisch & Magnus Roos & Jörg Rothe & Joachim Rudolph & Björn Scheuermann & Dietrich Stoyan, 2016. "A statistical approach to calibrating the scores of biased reviewers of scientific papers," Metrika: International Journal for Theoretical and Applied Statistics, Springer, vol. 79(1), pages 37-57, January.
    6. Rüdiger Mutz & Lutz Bornmann & Hans-Dieter Daniel, 2015. "Testing for the fairness and predictive validity of research funding decisions: A multilevel multiple imputation for missing data approach using ex-ante and ex-post peer evaluation data from the Austr," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(11), pages 2321-2339, November.
    7. Albert Banal-Estañol & Qianshuo Liu & Inés Macho-Stadler & David Pérez-Castrillo, 2021. "Similar-to-me Effects in the Grant Application Process: Applicants, Panelists, and the Likelihood of Obtaining Funds," Working Papers 1289, Barcelona School of Economics.
    8. Rodríguez Sánchez, Isabel & Makkonen, Teemu & Williams, Allan M., 2019. "Peer review assessment of originality in tourism journals: critical perspective of key gatekeepers," Annals of Tourism Research, Elsevier, vol. 77(C), pages 1-11.
    9. Jens Jirschitzka & Aileen Oeberst & Richard Göllner & Ulrike Cress, 2017. "Inter-rater reliability and validity of peer reviews in an interdisciplinary field," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(2), pages 1059-1092, November.
    10. Oviedo-García, M. Ángeles, 2016. "Tourism research quality: Reviewing and assessing interdisciplinarity," Tourism Management, Elsevier, vol. 52(C), pages 586-592.
    11. Richard R Snell, 2015. "Menage a Quoi? Optimal Number of Peer Reviewers," PLOS ONE, Public Library of Science, vol. 10(4), pages 1-14, April.
    12. Teplitskiy, Misha & Acuna, Daniel & Elamrani-Raoult, Aïda & Körding, Konrad & Evans, James, 2018. "The sociology of scientific validity: How professional networks shape judgement in peer review," Research Policy, Elsevier, vol. 47(9), pages 1825-1841.
    13. Wang, Qi & Sandström, Ulf, 2014. "Defining the Role of Cognitive Distance in the Peer Review Process: Explorative Study of a Grant Scheme in Infection Biology," INDEK Working Paper Series 2014/10, Royal Institute of Technology, Department of Industrial Economics and Management.
    14. Jürgen Janger & Nicole Schmidt & Anna Strauss, 2019. "International Differences in Basic Research Grant Funding. A Systematic Comparison," WIFO Studies, WIFO, number 61664, April.
    15. Meyer, Matthias & Waldkirch, Rüdiger W. & Duscher, Irina & Just, Alexander, 2018. "Drivers of citations: An analysis of publications in “top” accounting journals," CRITICAL PERSPECTIVES ON ACCOUNTING, Elsevier, vol. 51(C), pages 24-46.
    16. Feliciani, Thomas & Morreau, Michael & Luo, Junwen & Lucas, Pablo & Shankar, Kalpana, 2022. "Designing grant-review panels for better funding decisions: Lessons from an empirically calibrated simulation model," Research Policy, Elsevier, vol. 51(4).
    17. Andrada Elena Urda-Cîmpean & Sorana D. Bolboacă & Andrei Achimaş-Cadariu & Tudor Cătălin Drugan, 2016. "Knowledge Production in Two Types of Medical PhD Routes—What’s to Gain?," Publications, MDPI, vol. 4(2), pages 1-16, June.
    18. Randa Alsabahi, 2022. "English Medium Publications: Opening or Closing Doors to Authors with Non-English Language Backgrounds," English Language Teaching, Canadian Center of Science and Education, vol. 15(10), pages 1-18, October.
    19. Qianjin Zong & Yafen Xie & Jiechun Liang, 2020. "Does open peer review improve citation count? Evidence from a propensity score matching analysis of PeerJ," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 607-623, October.
    20. Thomas Feliciani & Junwen Luo & Lai Ma & Pablo Lucas & Flaminio Squazzoni & Ana Marušić & Kalpana Shankar, 2019. "A scoping review of simulation models of peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 555-594, October.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:pal:palcom:v:9:y:2022:i:1:d:10.1057_s41599-022-01050-6. See general information about how to correct material in RePEc.


    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing. General contact details of provider: https://www.nature.com/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.