
Most researchers would receive more recognition if assessed by article-level metrics than by journal-level metrics

Author

Listed:
  • Salsabil Arabi
  • Chaoqun Ni
  • B Ian Hutchins

Abstract

During career advancement and funding allocation decisions in biomedicine, reviewers have traditionally depended on journal-level measures of scientific influence like the impact factor. Prestigious journals reject large quantities of papers, many of which may be meritorious. It is possible that this process could create a system whereby some influential articles are prospectively identified and recognized by journal brands, but most influential articles are overlooked. Here, we measure the degree to which journal prestige hierarchies capture or overlook influential science. We quantify the fraction of scientists’ articles that would receive recognition because (a) they are published in journals above a chosen impact factor threshold, or (b) they are at least as well-cited as articles appearing in such journals. We find that the number of papers cited at least as well as those appearing in high-impact-factor journals vastly exceeds the number of papers published in such venues. At the investigator level, this phenomenon extends across gender, racial, and career-stage groupings of scientists. We also find that approximately half of researchers never publish in a venue with an impact factor above 15, which, under journal-level evaluation regimes, may exclude them from consideration for opportunities. Many of these researchers, however, publish equally influential work, raising the possibility that the journal-level measures routinely considered under decision-making norms, policy, or law may recognize as little as 10%–20% of this influential work.

Are authors fairly judged by assessment of the prestige of the journals in which their work is published? This study compares article-level metrics with journal-level metrics, finding that the vast majority of influential papers are published in lower-tier journals, and that more authors, regardless of demographics, would be better recognized with article-level data.
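The comparison described in the abstract can be illustrated with a short, hypothetical sketch. The Python code below is not the authors' pipeline: the Paper fields, the impact factor threshold of 15, and the externally supplied citation benchmark are illustrative assumptions. It only shows, for a single researcher's portfolio, how a journal-level count (papers in venues above the threshold) can diverge from an article-level count (papers cited at least as well as a typical article in such venues).

# Minimal sketch under assumed inputs; not the study's actual method.
from dataclasses import dataclass

@dataclass
class Paper:
    citations_per_year: float      # article-level citation rate
    journal_impact_factor: float   # journal-level metric of the publishing venue

IF_THRESHOLD = 15.0  # impact factor cutoff mentioned in the abstract

def recognition_counts(papers, benchmark_rate):
    """Return (journal-level count, article-level count) for one researcher.

    benchmark_rate is the citation rate assumed typical of articles in journals
    above IF_THRESHOLD; here it is supplied by the caller, whereas the study
    derives its benchmark from citation data.
    """
    by_journal = sum(p.journal_impact_factor > IF_THRESHOLD for p in papers)
    by_article = sum(p.citations_per_year >= benchmark_rate for p in papers)
    return by_journal, by_article

# Toy portfolio: well-cited work published mostly in lower-impact-factor venues.
portfolio = [
    Paper(citations_per_year=12.0, journal_impact_factor=4.2),
    Paper(citations_per_year=9.5,  journal_impact_factor=6.1),
    Paper(citations_per_year=2.0,  journal_impact_factor=3.0),
    Paper(citations_per_year=15.0, journal_impact_factor=18.0),
]

by_journal, by_article = recognition_counts(portfolio, benchmark_rate=8.0)
print(f"Recognized via journal prestige:        {by_journal} of {len(portfolio)}")
print(f"Recognized via article-level citations: {by_article} of {len(portfolio)}")

In this toy example, only one of four papers clears the journal-level bar, while three are cited at least as well as the assumed benchmark for high-impact-factor venues; this is the kind of gap the article quantifies at scale.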

Suggested Citation

  • Salsabil Arabi & Chaoqun Ni & B Ian Hutchins, 2025. "Most researchers would receive more recognition if assessed by article-level metrics than by journal-level metrics," PLOS Biology, Public Library of Science, vol. 23(12), pages 1-18, December.
  • Handle: RePEc:plo:pbio00:3003532
    DOI: 10.1371/journal.pbio.3003532

    Download full text from publisher

    File URL: https://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.3003532
    Download Restriction: no

    File URL: https://journals.plos.org/plosbiology/article/file?id=10.1371/journal.pbio.3003532&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pbio.3003532?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

1. Travis A. Hoppe & Salsabil Arabi & B. Ian Hutchins, 2023. "Predicting substantive biomedical citations without full text," Proceedings of the National Academy of Sciences, vol. 120(30), article e2213697120, July.
    2. B Ian Hutchins & Xin Yuan & James M Anderson & George M Santangelo, 2016. "Relative Citation Ratio (RCR): A New Metric That Uses Citation Rates to Measure Influence at the Article Level," PLOS Biology, Public Library of Science, vol. 14(9), pages 1-25, September.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Dunaiski, Marcel & Geldenhuys, Jaco & Visser, Willem, 2019. "On the interplay between normalisation, bias, and performance of paper impact metrics," Journal of Informetrics, Elsevier, vol. 13(1), pages 270-290.
    2. A Cecile J W Janssens & Michael Goodman & Kimberly R Powell & Marta Gwinn, 2017. "A critical evaluation of the algorithm behind the Relative Citation Ratio (RCR)," PLOS Biology, Public Library of Science, vol. 15(10), pages 1-5, October.
    3. Adrian G Barnett & Pauline Zardo & Nicholas Graves, 2018. "Randomly auditing research labs could be an affordable way to improve research quality: A simulation study," PLOS ONE, Public Library of Science, vol. 13(4), pages 1-17, April.
    4. Mohammed S. Alqahtani & Mohamed Abbas & Mohammed Abdul Muqeet & Hussain M. Almohiy, 2022. "Research Productivity in Terms of Output, Impact, and Collaboration for University Researchers in Saudi Arabia: SciVal Analytics and t-Tests Statistical Based Approach," Sustainability, MDPI, vol. 14(23), pages 1-21, December.
    5. Thelwall, Mike, 2018. "Dimensions: A competitor to Scopus and the Web of Science?," Journal of Informetrics, Elsevier, vol. 12(2), pages 430-435.
    6. Li, Heyang & Wu, Meijun & Wang, Yougui & Zeng, An, 2022. "Bibliographic coupling networks reveal the advantage of diversification in scientific projects," Journal of Informetrics, Elsevier, vol. 16(3).
    7. Yang, Alex Jie & Wu, Linwei & Zhang, Qi & Wang, Hao & Deng, Sanhong, 2023. "The k-step h-index in citation networks at the paper, author, and institution levels," Journal of Informetrics, Elsevier, vol. 17(4).
    8. Corrêa Jr., Edilson A. & Silva, Filipi N. & da F. Costa, Luciano & Amancio, Diego R., 2017. "Patterns of authors contribution in scientific manuscripts," Journal of Informetrics, Elsevier, vol. 11(2), pages 498-510.
    9. Torres-Salinas, Daniel & Valderrama-Baca, Pilar & Arroyo-Machado, Wenceslao, 2022. "Is there a need for a new journal metric? Correlations between JCR Impact Factor metrics and the Journal Citation Indicator—JCI," Journal of Informetrics, Elsevier, vol. 16(3).
    10. repec:plo:pone00:0195321 is not listed on IDEAS
    11. Joseph Staudt & Huifeng Yu & Robert P Light & Gerald Marschke & Katy Börner & Bruce A Weinberg, 2018. "High-impact and transformative science (HITS) metrics: Definition, exemplification, and comparison," PLOS ONE, Public Library of Science, vol. 13(7), pages 1-23, July.
    12. Chun-Kai Huang & Cameron Neylon & Lucy Montgomery & Richard Hosking & James P. Diprose & Rebecca N. Handcock & Katie Wilson, 2024. "Open access research outputs receive more diverse citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(2), pages 825-845, February.
    13. Li, Xin & Tang, Xuli & Lu, Wei, 2024. "Investigating clinical links in edge-labeled citation networks of biomedical research: A translational science perspective," Journal of Informetrics, Elsevier, vol. 18(3).
    14. Heng Huang & Donghua Zhu & Xuefeng Wang, 2022. "Evaluating scientific impact of publications: combining citation polarity and purpose," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(9), pages 5257-5281, September.
    15. Lutz Bornmann & Alexander Tekles & Loet Leydesdorff, 2019. "How well does I3 perform for impact measurement compared to other bibliometric indicators? The convergent validity of several (field-normalized) indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(2), pages 1187-1205, May.
    16. Jay Bhattacharya & Mikko Packalen, 2020. "Stagnation and Scientific Incentives," NBER Working Papers 26752, National Bureau of Economic Research, Inc.
    17. Latefa Ali Dardas & Malik Sallam & Amanda Woodward & Nadia Sweis & Narjes Sweis & Faleh A. Sawair, 2023. "Evaluating Research Impact Based on Semantic Scholar Highly Influential Citations, Total Citations, and Altmetric Attention Scores: The Quest for Refined Measures Remains Illusive," Publications, MDPI, vol. 11(1), pages 1-16, January.
    18. Loet Leydesdorff & Jordan A. Comins & Aaron A. Sorensen & Lutz Bornmann & Iina Hellsten, 2016. "Cited references and Medical Subject Headings (MeSH) as two different knowledge representations: clustering and mappings at the paper level," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2077-2091, December.
    19. John P A Ioannidis & Kevin Boyack & Paul F Wouters, 2016. "Citation Metrics: A Primer on How (Not) to Normalize," PLOS Biology, Public Library of Science, vol. 14(9), pages 1-7, September.
    20. Rodríguez-Navarro, Alonso & Brito, Ricardo, 2024. "Rank analysis of most cited publications, a new approach for research assessments," Journal of Informetrics, Elsevier, vol. 18(2).
    21. Živan Živković & Marija Panić, 2020. "Development of science and education in the Western Balkan countries: competitiveness with the EU," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(3), pages 2319-2339, September.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pbio00:3003532. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosbiology (email available below). General contact details of provider: https://journals.plos.org/plosbiology/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.