
The future of research evaluation rests with an intelligent combination of advanced metrics and transparent peer review

Author

Listed:
  • Henk F Moed

Abstract

The paper discusses the strengths and limitations of ‘metrics’ and peer review in large-scale evaluations of scholarly research performance. A real challenge is to combine the two methodologies in such a way that the strength of the first compensates for the limitations of the second, and vice versa. It underlines the need to systematically take into account the unintended effects of the use of metrics. It proposes a set of general criteria for the proper use of bibliometric indicators within peer-review processes, and applies these to a particular case: the UK Research Assessment Exercise (RAE). Copyright 2007, Beech Tree Publishing.
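As a purely illustrative sketch, not taken from the paper, the following Python snippet shows the general kind of field-normalized citation indicator that might be reported alongside, rather than instead of, a peer-review grade. The field baselines, publication records, and RAE-style grade are all hypothetical values chosen for the example.

    # Illustrative sketch only: a toy field-normalized citation indicator of the
    # general kind discussed in the bibliometrics literature. The baselines, the
    # paper data, and the idea of reporting the metric next to (not instead of)
    # a peer-review grade are assumptions for illustration, not Moed's method.

    from statistics import mean

    # Hypothetical field baselines: average citations per paper in each field.
    FIELD_BASELINE = {"economics": 8.0, "physics": 15.0}

    # Hypothetical publication records for one research unit.
    papers = [
        {"field": "economics", "citations": 12},
        {"field": "economics", "citations": 3},
        {"field": "physics", "citations": 30},
    ]

    def normalized_citation_score(paper):
        """Citations relative to the field average (1.0 = field average)."""
        return paper["citations"] / FIELD_BASELINE[paper["field"]]

    # Unit-level indicator: mean of the per-paper normalized scores.
    unit_indicator = mean(normalized_citation_score(p) for p in papers)

    # A hypothetical peer-review grade for the same unit, kept separate so that
    # reviewers weigh the metric rather than being replaced by it.
    peer_grade = "3a"  # e.g. an RAE-style rating supplied by a panel

    print(f"Field-normalized citation indicator: {unit_indicator:.2f}")
    print(f"Peer-review grade (independent judgement): {peer_grade}")

Keeping the indicator and the peer grade as separate outputs, rather than collapsing them into a single score, reflects the complementary use of the two methodologies that the abstract describes.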

Suggested Citation

  • Henk F Moed, 2007. "The future of research evaluation rests with an intelligent combination of advanced metrics and transparent peer review," Science and Public Policy, Oxford University Press, vol. 34(8), pages 575-583, October.
  • Handle: RePEc:oup:scippl:v:34:y:2007:i:8:p:575-583

    Download full text from publisher

    File URL: http://hdl.handle.net/10.3152/030234207X255179
    Download Restriction: Access to full text is restricted to subscribers.

As access to this document is restricted, you may want to search for a different version of it.

    Citations

Citations are extracted by the CitEc project; subscribe to its RSS feed for this item.


    Cited by:

    1. Rabishankar Giri & Sabuj Kumar Chaudhuri, 2021. "Ranking journals through the lens of active visibility," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(3), pages 2189-2208, March.
    2. Jonas Lindahl & Rickard Danell, 2016. "The information value of early career productivity in mathematics: a ROC analysis of prediction errors in bibliometricly informed decision making," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2241-2262, December.
    3. Negin Salimi, 2017. "Quality assessment of scientific outputs using the BWM," Scientometrics, Springer;Akadémiai Kiadó, vol. 112(1), pages 195-213, July.
    4. Margit Osterloh & Bruno S. Frey, 2009. "Research governance in academia: are there alternatives to academic rankings?," IEW - Working Papers 423, Institute for Empirical Research in Economics - University of Zurich.
    5. Simoes, Nadia & Crespo, Nuno, 2020. "Self-Citations and scientific evaluation: Leadership, influence, and performance," Journal of Informetrics, Elsevier, vol. 14(1).
    6. Bertocchi, Graziella & Gambardella, Alfonso & Jappelli, Tullio & Nappi, Carmela A. & Peracchi, Franco, 2015. "Bibliometric evaluation vs. informed peer review: Evidence from Italy," Research Policy, Elsevier, vol. 44(2), pages 451-466.
    7. Pablo D’Este & Puay Tang & Surya Mahdi & Andy Neely & Mabel Sánchez-Barrioluengo, 2013. "The pursuit of academic excellence and business engagement: is it irreconcilable?," Scientometrics, Springer;Akadémiai Kiadó, vol. 95(2), pages 481-502, May.
    8. Mingers, John & Leydesdorff, Loet, 2015. "A review of theory and practice in scientometrics," European Journal of Operational Research, Elsevier, vol. 246(1), pages 1-19.
    9. Haddawy, Peter & Hassan, Saeed-Ul & Asghar, Awais & Amin, Sarah, 2016. "A comprehensive examination of the relation of three citation-based journal metrics to expert judgment of journal quality," Journal of Informetrics, Elsevier, vol. 10(1), pages 162-173.
    10. David A. Pendlebury, 2019. "Charting a path between the simple and the false and the complex and unusable: Review of Henk F. Moed, Applied Evaluative Informetrics [in the series Qualitative and Quantitative Analysis of Scientifi," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(1), pages 549-560, April.
    11. Erich Battistin & Marco Ovidi, 2022. "Rising Stars: Expert Reviews and Reputational Yardsticks in the Research Excellence Framework," Economica, London School of Economics and Political Science, vol. 89(356), pages 830-848, October.
    12. Osterloh, Margit & Frey, Bruno S., 2020. "How to avoid borrowed plumes in academia," Research Policy, Elsevier, vol. 49(1).
    13. Alessandro Margherita & Gianluca Elia & Claudio Petti, 2022. "What Is Quality in Research? Building a Framework of Design, Process and Impact Attributes and Evaluation Perspectives," Sustainability, MDPI, vol. 14(5), pages 1-18, March.
    14. Nadia Simoes & Nuno Crespo, 2020. "A flexible approach for measuring author-level publishing performance," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(1), pages 331-355, January.
    15. Mingers, John & Yang, Liying, 2017. "Evaluating journal quality: A review of journal citation indicators and ranking in business and management," European Journal of Operational Research, Elsevier, vol. 257(1), pages 323-337.
    16. Gennaro Guida, 2018. "An Analysis of Scientific Research Performance in Italy: Evaluation Criteria and Public Funding," International Journal of Economics and Finance, Canadian Center of Science and Education, vol. 10(7), pages 1-45, July.
    17. Benda, Wim G.G. & Engels, Tim C.E., 2011. "The predictive validity of peer review: A selective review of the judgmental forecasting qualities of peers, and implications for innovation in science," International Journal of Forecasting, Elsevier, vol. 27(1), pages 166-182.
    18. Gabriel-Alexandru Vîiu & Mihai Păunescu, 2021. "The citation impact of articles from which authors gained monetary rewards based on journal metrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(6), pages 4941-4974, June.
    19. Franceschini, Fiorenzo & Maisano, Domenico, 2017. "Critical remarks on the Italian research assessment exercise VQR 2011–2014," Journal of Informetrics, Elsevier, vol. 11(2), pages 337-357.
    20. Zohreh Zahedi & Rodrigo Costas & Paul Wouters, 2014. "How well developed are altmetrics? A cross-disciplinary analysis of the presence of ‘alternative metrics’ in scientific publications," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(2), pages 1491-1513, November.
    21. Brooks, Chris & Fenton, Evelyn M. & Walker, James T., 2014. "Gender and the evaluation of research," Research Policy, Elsevier, vol. 43(6), pages 990-1001.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:oup:scippl:v:34:y:2007:i:8:p:575-583. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows us to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact Oxford University Press. General contact details of provider: https://academic.oup.com/spp.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.