
Scientific credit diffusion: Researcher level or paper level?

Author

Listed:
  • Hao Wang

    (Chinese Academy of Sciences)

  • Hua-Wei Shen

    (Chinese Academy of Sciences)

  • Xue-Qi Cheng

    (Chinese Academy of Sciences)

Abstract

Scientific impact evaluation is a long-standing problem in scientometrics. Graph-ranking methods are often employed to account for the collective diffusion process of scientific credit among researchers or their publications. One key issue, however, remains unresolved: what is the appropriate level for scientific credit diffusion, researcher level or paper level? In this paper, we tackle this problem via an anatomy of the credit diffusion mechanism underlying both researcher level and paper level graph-ranking methods. We find that researcher level and paper level credit diffusions are actually two aggregations of a fine-grained authorship level credit diffusion. We further find that researcher level graph-ranking methods may cause misallocation of scientific credit, but paper level graph-ranking methods do not. Consequently, researcher level methods often fail to identify researchers with high quality but low productivity. This finding indicates that scientific credit is fundamentally derived from "paper citing paper" rather than "researcher citing researcher". We empirically verify our findings using the American Physical Review publication dataset, which spans more than a century.
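The contrast in the abstract between the two diffusion levels can be made concrete with a toy example. The sketch below is not the authors' implementation; the paper identifiers, author names, and citation data are all hypothetical. It runs plain power-iteration PageRank twice: once on a paper citation graph, aggregating the resulting scores per researcher (paper-level diffusion), and once on an author citation graph built by collapsing paper citations onto their authors (researcher-level diffusion). It only illustrates how the two constructions differ; it does not reproduce the paper's misallocation result.

```python
def pagerank(nodes, out_links, d=0.85, iters=100):
    """Plain power-iteration PageRank; dangling nodes spread mass uniformly."""
    n = len(nodes)
    score = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1.0 - d) / n for v in nodes}
        for u in nodes:
            targets = out_links.get(u, [])
            if targets:
                share = d * score[u] / len(targets)
                for v in targets:
                    new[v] += share
            else:  # dangling node: distribute its mass evenly
                for v in nodes:
                    new[v] += d * score[u] / n
        score = new
    return score

# Hypothetical data: papers, their (single) authors, and paper citations.
authors_of = {"P1": "alice", "P2": "bob", "P3": "alice", "P4": "carol"}
cites = {"P2": ["P1"], "P3": ["P1"], "P4": ["P2", "P3"]}

# Paper-level diffusion: rank papers, then aggregate credit per researcher.
paper_scores = pagerank(list(authors_of), cites)
author_credit = {}
for p, a in authors_of.items():
    author_credit[a] = author_credit.get(a, 0.0) + paper_scores[p]

# Researcher-level diffusion: collapse citations into an author graph first
# (self-citations such as alice's P3 -> P1 drop out), then rank the authors.
author_links = {}
for p, targets in cites.items():
    for q in targets:
        a, b = authors_of[p], authors_of[q]
        if a != b:
            author_links.setdefault(a, []).append(b)
authors = sorted(set(authors_of.values()))
author_scores = pagerank(authors, author_links)
```

Note how the researcher-level graph discards information: the self-citation P3 -> P1 and the multiplicity of papers per author disappear once citations are collapsed onto authors, which hints at the granularity loss the abstract describes.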

Suggested Citation

  • Hao Wang & Hua-Wei Shen & Xue-Qi Cheng, 2016. "Scientific credit diffusion: Researcher level or paper level?," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(2), pages 827-837, November.
  • Handle: RePEc:spr:scient:v:109:y:2016:i:2:d:10.1007_s11192-016-2057-4
    DOI: 10.1007/s11192-016-2057-4

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-016-2057-4
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-016-2057-4?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Upul Senanayake & Mahendra Piraveenan & Albert Zomaya, 2015. "The Pagerank-Index: Going beyond Citation Counts in Quantifying Scientific Impact of Researchers," PLOS ONE, Public Library of Science, vol. 10(8), pages 1-34, August.
    2. Ying Ding, 2011. "Applying weighted PageRank to author citation networks," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 62(2), pages 236-245, February.
    3. Xiaorui Jiang & Xiaoping Sun & Hai Zhuge, 2013. "Graph-based algorithms for ranking researchers: not all swans are white!," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(3), pages 743-759, September.
    4. Cameron Neylon & Shirley Wu, 2009. "Article-Level Metrics and the Evolution of Scientific Impact," PLOS Biology, Public Library of Science, vol. 7(11), pages 1-6, November.
    5. Chen, P. & Xie, H. & Maslov, S. & Redner, S., 2007. "Finding scientific gems with Google's PageRank algorithm," Journal of Informetrics, Elsevier, vol. 1(1), pages 8-15.
    6. Leo Egghe, 2006. "Theory and practise of the g-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 69(1), pages 131-152, October.
    7. Andreas Strotmann & Dangzhi Zhao, 2012. "Author name disambiguation: What difference does it make in author-based citation analysis?," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 63(9), pages 1820-1833, September.
    8. David F Klosik & Stefan Bornholdt, 2014. "The Citation Wake of Publications Detects Nobel Laureates' Papers," PLOS ONE, Public Library of Science, vol. 9(12), pages 1-9, December.
    9. Kim, Jinseok & Kim, Jinmo, 2015. "Rethinking the comparison of coauthorship credit allocation schemes," Journal of Informetrics, Elsevier, vol. 9(3), pages 667-673.
    10. Qiang Wu, 2010. "The w-index: A measure to assess scientific impact by focusing on widely cited papers," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 61(3), pages 609-614, March.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Antonin Macé, 2023. "The Limits of Citation Counts," Working Papers halshs-01630095, HAL.
    2. Ruijie Wang & Yuhao Zhou & An Zeng, 2023. "Evaluating scientists by citation and disruption of their representative works," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(3), pages 1689-1710, March.
    3. Antonin Macé, 2017. "The Limits of Citation Counts," Papers 1711.02695, arXiv.org, revised Sep 2023.
    4. Persson, Rasmus A.X., 2017. "Bibliometric author evaluation through linear regression on the coauthor network," Journal of Informetrics, Elsevier, vol. 11(1), pages 299-306.
    5. Siying Li & Huawei Shen & Peng Bao & Xueqi Cheng, 2021. "h_u-index: a unified index to quantify individuals across disciplines," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(4), pages 3209-3226, April.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Dinesh Pradhan & Partha Sarathi Paul & Umesh Maheswari & Subrata Nandi & Tanmoy Chakraborty, 2017. "C^3-index: a PageRank based multi-faceted metric for authors' performance measurement," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(1), pages 253-273, January.
    2. Xie, Qing & Zhang, Xinyuan & Song, Min, 2021. "A network embedding-based scholar assessment indicator considering four facets: Research topic, author credit allocation, field-normalized journal impact, and published time," Journal of Informetrics, Elsevier, vol. 15(4).
    3. Nisar Ali & Zahid Halim & Syed Fawad Hussain, 2023. "An artificial intelligence-based framework for data-driven categorization of computer scientists: a case study of world’s Top 10 computing departments," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(3), pages 1513-1545, March.
    4. Ruijie Wang & Yuhao Zhou & An Zeng, 2023. "Evaluating scientists by citation and disruption of their representative works," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(3), pages 1689-1710, March.
    5. Nadia Simoes & Nuno Crespo, 2020. "A flexible approach for measuring author-level publishing performance," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(1), pages 331-355, January.
    6. Deming Lin & Tianhui Gong & Wenbin Liu & Martin Meyer, 2020. "An entropy-based measure for the evolution of h index research," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(3), pages 2283-2298, December.
    7. Dejian Yu & Wanru Wang & Shuai Zhang & Wenyu Zhang & Rongyu Liu, 2017. "A multiple-link, mutually reinforced journal-ranking model to measure the prestige of journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 111(1), pages 521-542, April.
    8. Chen, Ying & Koch, Thorsten & Zakiyeva, Nazgul & Liu, Kailiang & Xu, Zhitong & Chen, Chun-houh & Nakano, Junji & Honda, Keisuke, 2023. "Article’s scientific prestige: Measuring the impact of individual articles in the web of science," Journal of Informetrics, Elsevier, vol. 17(1).
    9. Fiala, Dalibor, 2012. "Time-aware PageRank for bibliographic networks," Journal of Informetrics, Elsevier, vol. 6(3), pages 370-388.
    10. Lutz Bornmann & Werner Marx, 2014. "How to evaluate individual researchers working in the natural and life sciences meaningfully? A proposal of methods based on percentiles of citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(1), pages 487-509, January.
    11. Yubing Nie & Yifan Zhu & Qika Lin & Sifan Zhang & Pengfei Shi & Zhendong Niu, 2019. "Academic rising star prediction via scholar’s evaluation model and machine learning techniques," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 461-476, August.
    12. Zhang, Fang & Wu, Shengli, 2020. "Predicting future influence of papers, researchers, and venues in a dynamic academic network," Journal of Informetrics, Elsevier, vol. 14(2).
    13. Yanan Wang & An Zeng & Ying Fan & Zengru Di, 2019. "Ranking scientific publications considering the aging characteristics of citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(1), pages 155-166, July.
    14. Bai, Xiaomei & Zhang, Fuli & Liu, Jiaying & Xia, Feng, 2023. "Quantifying the impact of scientific collaboration and papers via motif-based heterogeneous networks," Journal of Informetrics, Elsevier, vol. 17(2).
    15. Eleni Fragkiadaki & Georgios Evangelidis, 2016. "Three novel indirect indicators for the assessment of papers and authors based on generations of citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 106(2), pages 657-694, February.
    16. Peng Bao & Chengxiang Zhai, 2017. "Dynamic credit allocation in scientific literature," Scientometrics, Springer;Akadémiai Kiadó, vol. 112(1), pages 595-606, July.
    17. Jianlin Zhou & An Zeng & Ying Fan & Zengru Di, 2016. "Ranking scientific publications with similarity-preferential mechanism," Scientometrics, Springer;Akadémiai Kiadó, vol. 106(2), pages 805-816, February.
    18. Zhi Li & Qinke Peng & Che Liu, 2016. "Two citation-based indicators to measure latent referential value of papers," Scientometrics, Springer;Akadémiai Kiadó, vol. 108(3), pages 1299-1313, September.
    19. Nykl, Michal & Campr, Michal & Ježek, Karel, 2015. "Author ranking based on personalized PageRank," Journal of Informetrics, Elsevier, vol. 9(4), pages 777-799.
    20. Xiaomei Bai & Fuli Zhang & Jinzhou Li & Zhong Xu & Zeeshan Patoli & Ivan Lee, 2021. "Quantifying scientific collaboration impact by exploiting collaboration-citation network," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(9), pages 7993-8008, September.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:109:y:2016:i:2:d:10.1007_s11192-016-2057-4. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.