
Heuristics as conceptual lens for understanding and studying the usage of bibliometrics in research evaluation

Author

Listed:
  • Lutz Bornmann (Administrative Headquarters of the Max Planck Society)
  • Julian N. Marewski (Université de Lausanne)

Abstract

While bibliometrics are widely used for research evaluation purposes, a common theoretical framework for conceptually understanding, empirically studying, and effectively teaching their usage is lacking. In this paper, we outline such a framework: the fast-and-frugal heuristics research program, originally proposed in the cognitive and decision sciences, lends itself particularly well to understanding and investigating the usage of bibliometrics in research evaluations. Such evaluations represent judgments under uncertainty, in which not all possible options, their consequences, and the probabilities of those consequences are typically known. In these situations of incomplete information, heuristics are candidate descriptive and prescriptive models of human behavior. Heuristics are simple strategies that, by exploiting the structure of environments, can help people make smart decisions. Relying on heuristics does not mean trading accuracy for effort: while reducing complexity, heuristics can yield better decisions than more information-greedy procedures in many decision environments. The prescriptive power of heuristics is documented in a cross-disciplinary literature spanning medicine, crime, business, sports, and other domains. We outline the fast-and-frugal heuristics research program, provide examples of past empirical work on heuristics outside the field of bibliometrics, explain why heuristics may be especially suitable for studying the usage of bibliometrics, and propose a corresponding conceptual framework.
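
To make the abstract's notion of a heuristic concrete, the sketch below illustrates a fast-and-frugal decision tree of the kind cited in the reference list (e.g., Phillips et al. 2017; Luan & Reb 2017) applied to a hypothetical bibliometric shortlisting task. It is a minimal illustration, not the authors' model: the CandidateRecord fields, the fft_shortlist function, the cue order, and the thresholds are invented for demonstration purposes.

```python
# Minimal sketch of a fast-and-frugal decision tree (FFT) for a hypothetical
# bibliometric screening decision. All cues, thresholds, and names below are
# illustrative assumptions, not taken from the paper.

from dataclasses import dataclass


@dataclass
class CandidateRecord:
    """Hypothetical bibliometric profile of an applicant's publication record."""
    top10_percent_papers: int    # papers in the field's top 10% by citations
    citation_percentile: float   # field-normalized citation percentile (0-100)
    first_author_papers: int     # number of first-authored papers


def fft_shortlist(record: CandidateRecord) -> bool:
    """Inspect one cue at a time; exit as soon as a cue decides.

    Every node offers an exit, so a decision can be reached without
    consulting all cues -- the defining property of a fast-and-frugal tree.
    """
    # Cue 1: any highly cited work? If yes, shortlist immediately (positive exit).
    if record.top10_percent_papers >= 1:
        return True
    # Cue 2: clearly below-average citation impact? Reject (negative exit).
    if record.citation_percentile < 50.0:
        return False
    # Cue 3: the final cue forces a decision either way.
    return record.first_author_papers >= 3


if __name__ == "__main__":
    # Example use with made-up numbers.
    applicant = CandidateRecord(top10_percent_papers=0,
                                citation_percentile=72.5,
                                first_author_papers=4)
    print(fft_shortlist(applicant))  # True under these illustrative thresholds
```

The point illustrated is the exit structure: each cue can end the decision on its own, so the tree is noncompensatory and typically consults far fewer pieces of information than a full weighting-and-adding of all variables.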

Suggested Citation

  • Lutz Bornmann & Julian N. Marewski, 2019. "Heuristics as conceptual lens for understanding and studying the usage of bibliometrics in research evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 419-459, August.
  • Handle: RePEc:spr:scient:v:120:y:2019:i:2:d:10.1007_s11192-019-03018-x
    DOI: 10.1007/s11192-019-03018-x

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-019-03018-x
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-019-03018-x?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Gigerenzer, Gerd & Todd, Peter M. & ABC Research Group, 2000. "Simple Heuristics That Make Us Smart," OUP Catalogue, Oxford University Press, number 9780195143812.
    2. Gad Saad, 2006. "Exploring the h-index at the author and journal levels using bibliometric data of productive consumer scholars and business-related journals respectively," Scientometrics, Springer;Akadémiai Kiadó, vol. 69(1), pages 117-120, October.
    3. Scheibehenne, Benjamin & Bröder, Arndt, 2007. "Predicting Wimbledon 2005 tennis results by mere player name recognition," International Journal of Forecasting, Elsevier, vol. 23(3), pages 415-426.
    4. Moed, H. F. & Burger, W. J. M. & Frankfort, J. G. & Van Raan, A. F. J., 1985. "The use of bibliometric data for the measurement of university research performance," Research Policy, Elsevier, vol. 14(3), pages 131-149, June.
    5. Lutz Bornmann, 2015. "Complex tasks and simple solutions: The use of heuristics in the evaluation of research," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(8), pages 1738-1739, August.
    6. Bornmann, Lutz & Williams, Richard, 2017. "Can the journal impact factor be used as a criterion for the selection of junior researchers? A large-scale empirical study based on ResearcherID data," Journal of Informetrics, Elsevier, vol. 11(3), pages 788-799.
    7. Pinheiro, Diogo & Melkers, Julia & Youtie, Jan, 2014. "Learning to play the game: Student publishing as an indicator of future scholarly success," Technological Forecasting and Social Change, Elsevier, vol. 81(C), pages 56-66.
    8. Phillips, Nathaniel D. & Neth, Hansjörg & Woike, Jan K. & Gaissmaier, Wolfgang, 2017. "FFTrees: A toolbox to create, visualize, and evaluate fast-and-frugal decision trees," EconStor Open Access Articles and Book Chapters, ZBW - Leibniz Information Centre for Economics, vol. 12(4), pages 344-368.
    9. Ken Binmore, 2007. "Rational Decisions in Large Worlds," Annals of Economics and Statistics, GENES, issue 86, pages 25-41.
    10. Martin, Ben R. & Irvine, John, 1993. "Assessing basic research: Some partial indicators of scientific progress in radio astronomy," Research Policy, Elsevier, vol. 22(2), pages 106-106, April.
    11. Herbert A. Simon, 1955. "A Behavioral Model of Rational Choice," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 69(1), pages 99-118.
    12. Andreas Glöckner & Tilmann Betsch, 2008. "Modeling Option and Strategy Choices with Connectionist Networks: Towards an Integrative Model of Automatic and Deliberate Decision Making," Discussion Paper Series of the Max Planck Institute for Research on Collective Goods 2008_02, Max Planck Institute for Research on Collective Goods.
    13. Björn Hammarfelt & Gaby Haddow, 2018. "Conflicting measures and values: How humanities scholars in Australia and Sweden use and react to bibliometric indicators," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 69(7), pages 924-935, July.
    14. repec:cup:judgdm:v:6:y:2011:i:1:p:1-6 is not listed on IDEAS
    15. Florian M. Artinger & Sabrina Artinger & Gerd Gigerenzer, 2019. "C. Y. A.: frequency and causes of defensive decisions in public administration," Business Research, Springer;German Academic Association for Business Research, vol. 12(1), pages 9-25, April.
    16. Luan, Shenghua & Reb, Jochen, 2017. "Fast-and-frugal trees as noncompensatory models of performance-based personnel decisions," Organizational Behavior and Human Decision Processes, Elsevier, vol. 141(C), pages 29-42.
    17. Colin Macilwain, 2013. "Halt the avalanche of performance metrics," Nature, Nature, vol. 500(7462), pages 255-255, August.
    18. Shabnam Mousavi & Gerd Gigerenzer, 2017. "Heuristics are Tools for Uncertainty," Homo Oeconomicus: Journal of Behavioral and Institutional Economics, Springer, vol. 34(4), pages 361-379, December.
    19. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    20. repec:cup:judgdm:v:12:y:2017:i:4:p:344-368 is not listed on IDEAS
    21. repec:cup:judgdm:v:3:y:2008:i::p:215-228 is not listed on IDEAS
    22. repec:cup:judgdm:v:6:y:2011:i:5:p:359-380 is not listed on IDEAS
    23. Gangan Prathap, 2014. "Single parameter indices and bibliometric outliers," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(3), pages 1781-1787, December.
    24. Bornmann, Lutz & Marx, Werner, 2018. "Critical rationalism and the search for standard (field-normalized) indicators in bibliometrics," Journal of Informetrics, Elsevier, vol. 12(3), pages 598-604.
    25. Sarah de Rijcke & Paul F. Wouters & Alex D. Rushforth & Thomas P. Franssen & Björn Hammarfelt, 2016. "Evaluation practices and effects of indicator use—a literature review," Research Evaluation, Oxford University Press, vol. 25(2), pages 161-169.
    26. Lutz Bornmann & Werner Marx, 2014. "How to evaluate individual researchers working in the natural and life sciences meaningfully? A proposal of methods based on percentiles of citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(1), pages 487-509, January.
    27. Peter Weingart, 2005. "Impact of bibliometrics upon the science system: Inadvertent consequences?," Scientometrics, Springer;Akadémiai Kiadó, vol. 62(1), pages 117-131, January.
    28. Konstantinos V. Katsikopoulos, 2011. "Psychological Heuristics for Making Inferences: Definition, Performance, and the Emerging Theory and Practice," Decision Analysis, INFORMS, vol. 8(1), pages 10-29, March.
    29. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
    30. Thomas Heinze, 2013. "Creative accomplishments in science: definition, theoretical considerations, examples from science history, and bibliometric findings," Scientometrics, Springer;Akadémiai Kiadó, vol. 95(3), pages 927-940, June.
    31. V. A. Traag & L. Waltman, 2019. "Systematic analysis of agreement between metrics and peer review in the UK REF," Palgrave Communications, Palgrave Macmillan, vol. 5(1), pages 1-12, December.
    32. Salil Gunashekar & Steven Wooding & Susan Guthrie, 2017. "How do NIHR peer review panels use bibliometric information to support their decisions?," Scientometrics, Springer;Akadémiai Kiadó, vol. 112(3), pages 1813-1835, September.
    33. Frenken, Koen & Hardeman, Sjoerd & Hoekman, Jarno, 2009. "Spatial scientometrics: Towards a cumulative research program," Journal of Informetrics, Elsevier, vol. 3(3), pages 222-232.
    34. repec:adr:anecst:y:2007:i:86:p:04 is not listed on IDEAS
    35. repec:cup:judgdm:v:6:y:2011:i:1:p:89-99 is not listed on IDEAS
    36. Hauser, John R & Wernerfelt, Birger, 1990. "An Evaluation Cost Model of Consideration Sets," Journal of Consumer Research, Journal of Consumer Research Inc., vol. 16(4), pages 393-408, March.
    37. Mousavi, Shabnam & Gigerenzer, Gerd, 2014. "Risk, uncertainty, and heuristics," Journal of Business Research, Elsevier, vol. 67(8), pages 1671-1678.
    38. Diana Hicks & Paul Wouters & Ludo Waltman & Sarah de Rijcke & Ismael Rafols, 2015. "Bibliometrics: The Leiden Manifesto for research metrics," Nature, Nature, vol. 520(7548), pages 429-431, April.
    39. Goldstein, Daniel G. & Gigerenzer, Gerd, 2009. "Fast and frugal forecasting," International Journal of Forecasting, Elsevier, vol. 25(4), pages 760-772, October.
    40. repec:cup:judgdm:v:6:y:2011:i:1:p:73-88 is not listed on IDEAS
    41. repec:cup:judgdm:v:6:y:2011:i:6:p:439-519 is not listed on IDEAS
    42. Lutz Bornmann & Hans-Dieter Daniel, 2005. "Selection of research fellowship recipients by committee peer review. Reliability, fairness and predictive validity of Board of Trustees' decisions," Scientometrics, Springer;Akadémiai Kiadó, vol. 63(2), pages 297-320, April.
    43. Iman Tahamtan & Askar Safipour Afshar & Khadijeh Ahamdzadeh, 2016. "Factors affecting number of citations: a comprehensive review of the literature," Scientometrics, Springer;Akadémiai Kiadó, vol. 107(3), pages 1195-1225, June.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Gangan Prathap, 2019. "Letter to the editor: Revisiting the h-index and the p-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(3), pages 1829-1833, December.
    2. Bornmann, Lutz & Ganser, Christian & Tekles, Alexander, 2022. "Simulation of the h index use at university departments within the bibliometrics-based heuristics framework: Can the indicator be used to compare individual researchers?," Journal of Informetrics, Elsevier, vol. 16(1).
    3. Brito, Ricardo & Navarro, Alonso Rodríguez, 2021. "The inconsistency of h-index: A mathematical analysis," Journal of Informetrics, Elsevier, vol. 15(1).
    4. Jan Ellinger, 2023. "Don't put the greatest pressure on the weakest," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(10), pages 5853-5857, October.
    5. Lutz Bornmann & Richard Williams, 2020. "An evaluation of percentile measures of citation impact, and a proposal for making them better," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(2), pages 1457-1478, August.
    6. Lutz Bornmann, 2020. "Bibliometrics-based decision tree (BBDT) for deciding whether two universities in the Leiden ranking differ substantially in their performance," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(2), pages 1255-1258, February.
    7. Amrollah Shamsi & Rafaela Carolina Silva & Ting Wang & N. Vasantha Raju & Karen Santos-d’Amorim, 2022. "A grey zone for bibliometrics: publications indexed in Web of Science as anonymous," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(10), pages 5989-6009, October.
    8. Bornmann, Lutz & Tekles, Alexander & Zhang, Helena H. & Ye, Fred Y., 2019. "Do we measure novelty when we analyze unusual combinations of cited references? A validation study of bibliometric novelty indicators based on F1000Prime data," Journal of Informetrics, Elsevier, vol. 13(4).
    9. Gössling, Stefan & Moyle, Brent D. & Weaver, David, 2021. "Academic entrepreneurship: A bibliometric engagement model," Annals of Tourism Research, Elsevier, vol. 90(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Bornmann, Lutz & Ganser, Christian & Tekles, Alexander, 2022. "Simulation of the h index use at university departments within the bibliometrics-based heuristics framework: Can the indicator be used to compare individual researchers?," Journal of Informetrics, Elsevier, vol. 16(1).
    2. Eugenio Petrovich, 2022. "Bibliometrics in Press. Representations and uses of bibliometric indicators in the Italian daily newspapers," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2195-2233, May.
    3. Raminta Pranckutė, 2021. "Web of Science (WoS) and Scopus: The Titans of Bibliographic Information in Today’s Academic World," Publications, MDPI, vol. 9(1), pages 1-59, March.
    4. Dag W. Aksnes & Liv Langfeldt & Paul Wouters, 2019. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories," SAGE Open, , vol. 9(1), pages 21582440198, February.
    5. Bornmann, Lutz & Marx, Werner, 2018. "Critical rationalism and the search for standard (field-normalized) indicators in bibliometrics," Journal of Informetrics, Elsevier, vol. 12(3), pages 598-604.
    6. Sabrina Petersohn & Thomas Heinze, 2018. "Professionalization of bibliometric research assessment. Insights from the history of the Leiden Centre for Science and Technology Studies (CWTS)," Science and Public Policy, Oxford University Press, vol. 45(4), pages 565-578.
    7. Yurij L. Katchanov & Yulia V. Markova, 2017. "The “space of physics journals”: topological structure and the Journal Impact Factor," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(1), pages 313-333, October.
    8. Bar-Ilan, Judit, 2008. "Informetrics at the beginning of the 21st century—A review," Journal of Informetrics, Elsevier, vol. 2(1), pages 1-52.
    9. Brito, Ricardo & Rodríguez-Navarro, Alonso, 2019. "Evaluating research and researchers by the journal impact factor: Is it better than coin flipping?," Journal of Informetrics, Elsevier, vol. 13(1), pages 314-324.
    10. David A. Pendlebury, 2019. "Charting a path between the simple and the false and the complex and unusable: Review of Henk F. Moed, Applied Evaluative Informetrics [in the series Qualitative and Quantitative Analysis of Scientifi," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(1), pages 549-560, April.
    11. Bornmann, Lutz & Haunschild, Robin & Mutz, Rüdiger, 2020. "Should citations be field-normalized in evaluative bibliometrics? An empirical analysis based on propensity score matching," Journal of Informetrics, Elsevier, vol. 14(4).
    12. Alessandro Margherita & Gianluca Elia & Claudio Petti, 2022. "What Is Quality in Research? Building a Framework of Design, Process and Impact Attributes and Evaluation Perspectives," Sustainability, MDPI, vol. 14(5), pages 1-18, March.
    13. Tatiana Marina & Ivan Sterligov, 2021. "Prevalence of potentially predatory publishing in Scopus on the country level," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(6), pages 5019-5077, June.
    14. Lutz Bornmann & Klaus Wohlrabe, 2019. "Normalisation of citation impact in economics," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 841-884, August.
    15. Lutz Bornmann & Loet Leydesdorff, 2018. "Count highly-cited papers instead of papers with h citations: use normalized citation counts and compare “like with like”!," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(2), pages 1119-1123, May.
    16. Cruz-Castro, Laura & Sanz-Menendez, Luis, 2021. "What should be rewarded? Gender and evaluation criteria for tenure and promotion," Journal of Informetrics, Elsevier, vol. 15(3).
    17. Bornmann, Lutz & Williams, Richard, 2017. "Can the journal impact factor be used as a criterion for the selection of junior researchers? A large-scale empirical study based on ResearcherID data," Journal of Informetrics, Elsevier, vol. 11(3), pages 788-799.
    18. Domingo Docampo & Lawrence Cram, 2019. "Highly cited researchers: a moving target," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(3), pages 1011-1025, March.
    19. Gregorio González-Alcaide, 2021. "Bibliometric studies outside the information science and library science field: uncontainable or uncontrollable?," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(8), pages 6837-6870, August.
    20. Mehdi Rhaiem & Nabil Amara, 2020. "Determinants of research efficiency in Canadian business schools: evidence from scholar-level data," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 53-99, October.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:120:y:2019:i:2:d:10.1007_s11192-019-03018-x. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing. General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.