
What is meaningful research and how should we measure it?

Authors

  • Sven Helmer (University of Zurich)
  • David B. Blumenthal (Technical University of Munich)
  • Kathrin Paschen (Nephometrics GmbH)

Abstract

We discuss the trend towards using quantitative metrics for evaluating research. We claim that, rather than promoting meaningful research, purely metric-based research evaluation schemes potentially lead to a dystopian academic reality, leaving no space for creativity and intellectual initiative. After sketching what the future could look like if quantitative metrics are allowed to proliferate, we provide a more detailed discussion on why research is so difficult to evaluate and outline approaches for avoiding such a situation. In particular, we characterize meaningful research as an essentially contested concept and argue that quantitative metrics should always be accompanied by operationalized instructions for their proper use and continuously evaluated via feedback loops. Additionally, we analyze a dataset containing information about computer science publications and their citation history and indicate how quantitative metrics could potentially be calibrated via alternative evaluation methods such as test of time awards. Finally, we argue that, instead of over-relying on indicators, research environments should primarily be based on trust and personal responsibility.
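
The calibration idea mentioned in the abstract can be made concrete. The snippet below is a minimal sketch, not the authors' actual analysis: it assumes a hypothetical CSV of computer science papers with a per-paper citation count and a flag marking test of time award winners (all column names are made up for illustration), and checks where award-winning papers fall in the citation distribution of their publication-year cohort, i.e. whether a purely citation-based ranking would have singled them out.

```python
# Minimal sketch under assumed data layout (papers.csv is hypothetical):
#   year          - publication year
#   citations_10y - citations accumulated within 10 years of publication
#   tot_award     - 1 if the paper later won a test of time award, else 0
import pandas as pd

papers = pd.read_csv("papers.csv")

# Citation percentile of each paper within its publication-year cohort,
# so papers from different years are comparable.
papers["cit_percentile"] = papers.groupby("year")["citations_10y"].rank(pct=True)

# Where do test of time award winners sit in that distribution?
winners = papers[papers["tot_award"] == 1]
print("Median citation percentile of award winners:",
      winners["cit_percentile"].median())

# Share of winners that a simple "top 10% by citations" rule would have caught.
print("Winners in the citation top decile:",
      (winners["cit_percentile"] >= 0.9).mean())
```

If many award winners were to sit well below the top decile, that would suggest a citation-based indicator misses work the community later judges meaningful, which is the kind of feedback loop the abstract argues for.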

Suggested Citation

  • Sven Helmer & David B. Blumenthal & Kathrin Paschen, 2020. "What is meaningful research and how should we measure it?," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 153-169, October.
  • Handle: RePEc:spr:scient:v:125:y:2020:i:1:d:10.1007_s11192-020-03649-5
    DOI: 10.1007/s11192-020-03649-5

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-020-03649-5
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.


    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Amber Dance, 2017. "Flexible working: Solo scientist," Nature, Nature, vol. 543(7647), pages 747-749, March.
    2. John P. A. Ioannidis & Richard Klavans & Kevin W. Boyack, 2018. "Thousands of scientists publish a paper every five days," Nature, Nature, vol. 561(7722), pages 167-169, September.
    3. Kaare Aagaard & Carter Bloch & Jesper W. Schneider, 2015. "Impacts of performance-based research funding systems: The case of the Norwegian Publication Indicator," Research Evaluation, Oxford University Press, vol. 24(2), pages 106-117.
    4. Virginia Gewin, 2012. "Research: Uncovering misconduct," Nature, Nature, vol. 485(7396), pages 137-139, May.
    5. Jeffrey Beall, 2012. "Predatory publishers are corrupting open access," Nature, Nature, vol. 489(7415), pages 179-179, September.
    6. Aksnes, Dag W. & Rip, Arie, 2009. "Researchers' perceptions of citations," Research Policy, Elsevier, vol. 38(6), pages 895-905, July.
    7. Dag W. Aksnes & Liv Langfeldt & Paul Wouters, 2019. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories," SAGE Open, vol. 9(1), pages 21582440198, February.
    8. R Grant Steen & Arturo Casadevall & Ferric C Fang, 2013. "Why Has the Number of Scientific Retractions Increased?," PLOS ONE, Public Library of Science, vol. 8(7), pages 1-9, July.
    9. Carolin Michels & Ulrich Schmoch, 2014. "Impact of bibliometric studies on the publication behaviour of authors," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(1), pages 369-385, January.
    10. Sven E. Hug & Michael Ochsner & Hans-Dieter Daniel, 2013. "Criteria for assessing research quality in the humanities: a Delphi study among scholars of English literature, German literature and art history," Research Evaluation, Oxford University Press, vol. 22(5), pages 369-383, August.
    11. Fredrik Niclas Piro & Dag W. Aksnes & Kristoffer Rørstad, 2013. "A macro analysis of productivity differences across fields: Challenges in the measurement of scientific publishing," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 64(2), pages 307-320, February.
    12. Richard Van Noorden, 2011. "Science publishing: The trouble with retractions," Nature, Nature, vol. 478(7367), pages 26-28, October.
    13. Mårtensson, Pär & Fors, Uno & Wallin, Sven-Bertil & Zander, Udo & Nilsson, Gunnar H, 2016. "Evaluating research: A multidisciplinary approach to assessing research practice and quality," Research Policy, Elsevier, vol. 45(3), pages 593-603.
    14. Elise S. Brezis & Aliaksandr Birukou, 2020. "Arbitrariness in the peer review process," Scientometrics, Springer;Akadémiai Kiadó, vol. 123(1), pages 393-411, April.
    15. Peter Weingart, 2005. "Impact of bibliometrics upon the science system: Inadvertent consequences?," Scientometrics, Springer;Akadémiai Kiadó, vol. 62(1), pages 117-131, January.
    16. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
    17. Björn Hammarfelt & Alexander D. Rushforth, 2017. "Indicators as judgment devices: An empirical study of citizen bibliometrics in research evaluation," Research Evaluation, Oxford University Press, vol. 26(3), pages 169-180.
    18. Diana Hicks & Paul Wouters & Ludo Waltman & Sarah de Rijcke & Ismael Rafols, 2015. "Bibliometrics: The Leiden Manifesto for research metrics," Nature, Nature, vol. 520(7548), pages 429-431, April.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Carolina Llorente & Gema Revuelta, 2023. "Models of Teaching Science Communication," Sustainability, MDPI, vol. 15(6), pages 1-22, March.
    2. Antonio Fernandez-Cano, 2021. "Letter to the Editor: publish, publish … cursed!," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(4), pages 3673-3682, April.
    3. Edré Moreira & Wagner Meira & Marcos André Gonçalves & Alberto H. F. Laender, 2023. "The rise of hyperprolific authors in computer science: characterization and implications," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(5), pages 2945-2974, May.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eugenio Petrovich, 2022. "Bibliometrics in Press. Representations and uses of bibliometric indicators in the Italian daily newspapers," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2195-2233, May.
    2. Alessandro Margherita & Gianluca Elia & Claudio Petti, 2022. "What Is Quality in Research? Building a Framework of Design, Process and Impact Attributes and Evaluation Perspectives," Sustainability, MDPI, vol. 14(5), pages 1-18, March.
    3. Dag W. Aksnes & Liv Langfeldt & Paul Wouters, 2019. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories," SAGE Open, vol. 9(1), pages 21582440198, February.
    4. Ramón A. Feenstra & Emilio Delgado López-Cózar, 2022. "Philosophers’ appraisals of bibliometric indicators and their use in evaluation: from recognition to knee-jerk rejection," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(4), pages 2085-2103, April.
    5. Anne K. Krüger, 2020. "Quantification 2.0? Bibliometric Infrastructures in Academic Evaluation," Politics and Governance, Cogitatio Press, vol. 8(2), pages 58-67.
    6. Dorte Henriksen, 2016. "The rise in co-authorship in the social sciences (1980–2013)," Scientometrics, Springer;Akadémiai Kiadó, vol. 107(2), pages 455-476, May.
    7. Lutz Bornmann & Julian N. Marewski, 2019. "Heuristics as conceptual lens for understanding and studying the usage of bibliometrics in research evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 419-459, August.
    8. Torger Möller & Marion Schmidt & Stefan Hornbostel, 2016. "Assessing the effects of the German Excellence Initiative with bibliometric methods," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2217-2239, December.
    9. Raminta Pranckutė, 2021. "Web of Science (WoS) and Scopus: The Titans of Bibliographic Information in Today’s Academic World," Publications, MDPI, vol. 9(1), pages 1-59, March.
    10. Sven E. Hug & Mirjam Aeschbach, 2020. "Criteria for assessing grant applications: a systematic review," Palgrave Communications, Palgrave Macmillan, vol. 6(1), pages 1-15, December.
    11. Korytkowski, Przemyslaw & Kulczycki, Emanuel, 2019. "Publication counting methods for a national research evaluation exercise," Journal of Informetrics, Elsevier, vol. 13(3), pages 804-816.
    12. Zharova, Alona & Härdle, Wolfgang Karl & Lessmann, Stefan, 2023. "Data-driven support for policy and decision-making in university research management: A case study from Germany," European Journal of Operational Research, Elsevier, vol. 308(1), pages 353-368.
    13. Gad Yair & Keith Goldstein & Nir Rotem & Anthony J. Olejniczak, 2022. "The three cultures in American science: publication productivity in physics, history and economics," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(6), pages 2967-2980, June.
    14. Hyeonchae Yang & Woo-Sung Jung, 2015. "A strategic management approach for Korean public research institutes based on bibliometric investigation," Quality & Quantity: International Journal of Methodology, Springer, vol. 49(4), pages 1437-1464, July.
    15. Alexander Kalgin & Olga Kalgina & Anna Lebedeva, 2019. "Publication Metrics as a Tool for Measuring Research Productivity and Their Relation to Motivation," Voprosy obrazovaniya / Educational Studies Moscow, National Research University Higher School of Economics, issue 1, pages 44-86.
    16. Gregorio González-Alcaide, 2021. "Bibliometric studies outside the information science and library science field: uncontainable or uncontrollable?," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(8), pages 6837-6870, August.
    17. Mehdi Rhaiem & Nabil Amara, 2020. "Determinants of research efficiency in Canadian business schools: evidence from scholar-level data," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 53-99, October.
    18. Abramo, Giovanni & Aksnes, Dag W. & D’Angelo, Ciriaco Andrea, 2020. "Comparison of research performance of Italian and Norwegian professors and universities," Journal of Informetrics, Elsevier, vol. 14(2).
    19. Fabian Scheidegger & Andre Briviba & Bruno S. Frey, 2023. "Behind the curtains of academic publishing: strategic responses of economists and business scholars," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(8), pages 4765-4790, August.
    20. Elizabeth S. Vieira & Jorge Cerdeira, 2022. "The integration of African countries in international research networks," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(4), pages 1995-2021, April.
