
The Corona-Eye: Exploring the risks of COVID-19 on fair assessments of impact for REF2021

Authors

Listed:
  • Gemma E Derrick
  • Julie Bayley

Abstract

This article assesses the risks of two COVID-19-related changes to the expert review of the REF2021 Impact criterion: the move from face-to-face (F2F) to virtual deliberation, and the changed research landscape caused by the COVID-19 crisis, which required an extension of deadlines and accommodation of COVID-19-related mitigation. Peer review in its basic form requires expert debate, in which dissenting opinions and non-verbal cues are absorbed into a group deliberative practice and thereby inform outcomes. With a move to deliberation in virtual settings, the most likely arrangement for REF2021 evaluations, the article questions the extent to which the negotiation dynamics necessary in F2F evaluations are diminished, and how this limits panellists’ ability to sensitively assess COVID-19 mitigation statements. It explores the nature of, and the capabilities needed to undertake, complex decision-making in virtual settings around the Impact criterion, as well as the consequences of COVID-19 for normal Impact trajectories. It examines the risks these changes present for evaluation of the Impact criterion and offers recommendations to offset those risks, enhance discussion, and safeguard the legitimacy of evaluation outcomes. The article is also relevant to evaluation processes for academic criteria that require a shift to virtual deliberation and/or guidance on how to sensitively assess the effect of COVID-19 on narratives of individual, group, or organizational performance.

Suggested Citation

  • Gemma E Derrick & Julie Bayley, 2022. "The Corona-Eye: Exploring the risks of COVID-19 on fair assessments of impact for REF2021," Research Evaluation, Oxford University Press, vol. 31(1), pages 93-103.
  • Handle: RePEc:oup:rseval:v:31:y:2022:i:1:p:93-103.

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1093/reseval/rvab033
    Download Restriction: Access to full text is restricted to subscribers.

As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Dalmeet Singh Chawla, 2021. "Zoom fatigue saps grant reviewers’ attention," Nature, Nature, vol. 590(7844), pages 172-172, February.
    2. Alan R. Dennis & Joseph S. Valacich & Terry Connolly & Bayard E. Wynne, 1996. "Process Structuring in Electronic Brainstorming," Information Systems Research, INFORMS, vol. 7(2), pages 268-277, June.
    3. Stephen A Gallo & Afton S Carpenter & Scott R Glisson, 2013. "Teleconference versus Face-to-Face Scientific Peer Review of Grant Application: Effects on Review Outcomes," PLOS ONE, Public Library of Science, vol. 8(8), pages 1-9, August.
    4. Samantha Cruz Rivera & Derek G Kyte & Olalekan Lee Aiyegbusi & Thomas J Keeley & Melanie J Calvert, 2017. "Assessing the impact of healthcare research: A systematic review of methodological frameworks," PLOS Medicine, Public Library of Science, vol. 14(8), pages 1-24, August.
    5. Gary Baker, 2002. "The Effects of Synchronous Collaborative Technologies on Decision Making: A Study of Virtual Teams," Information Resources Management Journal (IRMJ), IGI Global, vol. 15(4), pages 79-93, October.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. David G Pina & Darko Hren & Ana Marušić, 2015. "Peer Review Evaluation Process of Marie Curie Actions under EU’s Seventh Framework Programme for Research," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-15, June.
    2. Walczuch, R.M. & Hofmaier, K., 2000. "Measuring customer satisfaction on the Internet," Research Memorandum 051, Maastricht University, Maastricht Research School of Economics of Technology and Organization (METEOR).
    3. Sergey R. Yagolkovskiy & Anatoliy V. Kharkhurin, 2015. "The Roles of Novelty and the Organization of Stimulus Material in Divergent Thinking," HSE Working papers WP BRP 41/PSY/2015, National Research University Higher School of Economics.
    4. Antonio Ferreira & Pedro Antunes & Valeria Herskovic, 2011. "Improving Group Attention: An Experiment with Synchronous Brainstorming," Group Decision and Negotiation, Springer, vol. 20(5), pages 643-666, September.
5. Walczuch, R.M. & Hofmaier, K., 2000. "Measuring Customer Satisfaction on the Internet," Research Memorandum 019, Maastricht University, Maastricht Research School of Economics of Technology and Organization (METEOR).
    6. Hambley, Laura A. & O'Neill, Thomas A. & Kline, Theresa J.B., 2007. "Virtual team leadership: The effects of leadership style and communication medium on team interaction styles and outcomes," Organizational Behavior and Human Decision Processes, Elsevier, vol. 103(1), pages 1-20, May.
    7. J. H. Jung & Christoph Schneider & Joseph Valacich, 2010. "Enhancing the Motivational Affordance of Information Systems: The Effects of Real-Time Performance Feedback and Goal Setting in Group Collaboration Environments," Management Science, INFORMS, vol. 56(4), pages 724-742, April.
    8. Helka Kalliomäki & Sampo Ruoppila & Jenni Airaksinen, 2021. "It takes two to tango: Examining productive interactions in urban research collaboration [Generating Research Questions through Problematization]," Research Evaluation, Oxford University Press, vol. 30(4), pages 529-539.
    9. Kieran Mathieson, 2007. "Towards a Design Science of Ethical Decision Support," Journal of Business Ethics, Springer, vol. 76(3), pages 269-292, December.
    10. Heyeres, Marion & Tsey, Komla & Yang, Yinghong & Yan, Li & Jiang, Hua, 2019. "The characteristics and reporting quality of research impact case studies: A systematic review," Evaluation and Program Planning, Elsevier, vol. 73(C), pages 10-23.
    11. Deanna House, 2012. "Factors that Inhibit Globally Distributed Software Development Teams," International Journal of Business and Social Research, MIR Center for Socio-Economic Research, vol. 2(6), pages 135-153, November.
    12. Stephen A Gallo & Afton S Carpenter & David Irwin & Caitlin D McPartland & Joseph Travis & Sofie Reynders & Lisa A Thompson & Scott R Glisson, 2014. "The Validation of Peer Review through Research Impact Measures and the Implications for Funding Strategies," PLOS ONE, Public Library of Science, vol. 9(9), pages 1-9, September.
    13. Joel B E Smith & Keith Channon & Vasiliki Kiparoglou & John F Forbes & Alastair M Gray, 2019. "A macroeconomic assessment of the impact of medical research expenditure: A case study of NIHR Biomedical Research Centres," PLOS ONE, Public Library of Science, vol. 14(4), pages 1-10, April.
    14. Gillier, Thomas & Chaffois, Cédric & Belkhouja, Mustapha & Roth, Yannig & Bayus, Barry L., 2018. "The effects of task instructions in crowdsourcing innovative ideas," Technological Forecasting and Social Change, Elsevier, vol. 134(C), pages 35-44.
    15. Sajda Qureshi & Min Liu & Doug Vogel, 2006. "The Effects of Electronic Collaboration in Distributed Project Management," Group Decision and Negotiation, Springer, vol. 15(1), pages 55-75, January.
    16. Miriam L E Steiner Davis & Tiffani R Conner & Kate Miller-Bains & Leslie Shapard, 2020. "What makes an effective grants peer reviewer? An exploratory study of the necessary skills," PLOS ONE, Public Library of Science, vol. 15(5), pages 1-22, May.
    17. Katie Meadmore & Kathryn Fackrell & Alejandra Recio-Saucedo & Abby Bull & Simon D S Fraser & Amanda Blatch-Jones, 2020. "Decision-making approaches used by UK and international health funding organisations for allocating research funds: A survey of current practice," PLOS ONE, Public Library of Science, vol. 15(11), pages 1-17, November.
    18. D Shaw & M Westcombe & J Hodgkin & G Montibeller, 2004. "Problem structuring methods for large group interventions," Journal of the Operational Research Society, Palgrave Macmillan;The OR Society, vol. 55(5), pages 453-463, May.
    19. D Shaw, 2003. "Evaluating electronic workshops through analysing the ‘brainstormed’ ideas," Journal of the Operational Research Society, Palgrave Macmillan;The OR Society, vol. 54(7), pages 692-705, July.
    20. Deanna House, 2012. "Factors that Inhibit Globally Distributed Software Development Teams," International Journal of Business and Social Research, LAR Center Press, vol. 2(6), pages 135-153, November.
