
Do citations and readership identify seminal publications?

Author

Listed:
  • Drahomira Herrmannova

    (The Open University
    Oak Ridge National Laboratory)

  • Robert M. Patton

    (Oak Ridge National Laboratory)

  • Petr Knoth

    (The Open University)

  • Christopher G. Stahl

    (Oak Ridge National Laboratory)

Abstract

This work presents a new approach for analysing the ability of existing research metrics to identify research which has strongly influenced future developments. More specifically, we focus on the ability of citation counts and Mendeley reader counts to distinguish between publications regarded as seminal and publications regarded as literature reviews by field experts. The main motivation behind our research is to gain a better understanding of whether and how well the existing research metrics relate to research quality. For this experiment we have created a new dataset which we call TrueImpactDataset and which contains two types of publications, seminal papers and literature reviews. Using the dataset, we conduct a set of experiments to study how citation and reader counts perform in distinguishing these publication types, following the intuition that causing a change in a field signifies research quality. Our research shows that citation counts work better than a random baseline (by a margin of 10%) in distinguishing important seminal research papers from literature reviews while Mendeley reader counts do not work better than the baseline.
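The comparison described in the abstract can be illustrated with a small sketch. The code below is not the authors' published method and does not use their data: the toy records, field names, and the pairwise-accuracy measure are assumptions chosen only to show how a metric such as citation count can be scored against a random baseline at separating seminal papers from literature reviews (a random baseline scores about 0.5 on this measure; the paper reports citation counts beating their baseline by roughly 10%, while Mendeley reader counts do not).

```python
import random

# Hypothetical toy records standing in for TrueImpactDataset entries.
# Field names and values are illustrative only, not the published data.
papers = [
    {"type": "seminal", "citations": 520, "readers": 310},
    {"type": "seminal", "citations": 180, "readers": 95},
    {"type": "seminal", "citations": 75,  "readers": 40},
    {"type": "review",  "citations": 300, "readers": 260},
    {"type": "review",  "citations": 60,  "readers": 120},
    {"type": "review",  "citations": 45,  "readers": 30},
]

def pairwise_accuracy(records, metric):
    """Fraction of (seminal, review) pairs in which the seminal paper
    has the higher value of `metric`; ties count as half correct."""
    seminal = [p for p in records if p["type"] == "seminal"]
    reviews = [p for p in records if p["type"] == "review"]
    correct = 0.0
    for s in seminal:
        for r in reviews:
            if s[metric] > r[metric]:
                correct += 1.0
            elif s[metric] == r[metric]:
                correct += 0.5
    return correct / (len(seminal) * len(reviews))

def random_baseline(records, trials=2000, seed=0):
    """Expected pairwise accuracy when the 'metric' is pure noise."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        noisy = [dict(p, noise=rng.random()) for p in records]
        total += pairwise_accuracy(noisy, "noise")
    return total / trials

print("citations:", pairwise_accuracy(papers, "citations"))
print("readers:  ", pairwise_accuracy(papers, "readers"))
print("random:   ", random_baseline(papers))
```

With real data the same comparison would simply replace the toy records with the seminal/review labels from TrueImpactDataset and the observed citation and Mendeley reader counts for each publication.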

Suggested Citation

  • Drahomira Herrmannova & Robert M. Patton & Petr Knoth & Christopher G. Stahl, 2018. "Do citations and readership identify seminal publications?," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(1), pages 239-262, April.
  • Handle: RePEc:spr:scient:v:115:y:2018:i:1:d:10.1007_s11192-018-2669-y
    DOI: 10.1007/s11192-018-2669-y

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-018-2669-y
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-018-2669-y?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access with your library subscription.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Bornmann, Lutz & Leydesdorff, Loet, 2015. "Does quality and content matter for citedness? A comparison with para-textual factors and over time," Journal of Informetrics, Elsevier, vol. 9(3), pages 419-429.
    2. M.H. MacRoberts & B.R. MacRoberts, 2010. "Problems of citation analysis: A study of uncited and seldom-cited influences," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 61(1), pages 1-12, January.
    3. Benjamin M. Althouse & Jevin D. West & Carl T. Bergstrom & Theodore Bergstrom, 2009. "Differences in impact factor across fields and over time," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 60(1), pages 27-34, January.
    4. Marc Bertin & Iana Atanassova & Yves Gingras & Vincent Larivière, 2016. "The invariant distribution of references in scientific articles," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(1), pages 164-177, January.
    5. Mike Thelwall & Pardeep Sud, 2016. "Mendeley readership counts: An investigation of temporal and disciplinary differences," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(12), pages 3036-3050, December.
    6. Xiaojun Wan & Fang Liu, 2014. "Are all literature citations equally important? Automatic citation strength estimation and its applications," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 65(9), pages 1929-1938, September.
    7. Editors The, 2008. "Content," Basic Income Studies, De Gruyter, vol. 2(2), pages 1-2, January.
    8. Anne-Wil Harzing & Satu Alakangas, 2016. "Google Scholar, Scopus and the Web of Science: a longitudinal and cross-disciplinary comparison," Scientometrics, Springer;Akadémiai Kiadó, vol. 106(2), pages 787-804, February.
    9. Anne-Wil Harzing, 2016. "Microsoft Academic (Search): a Phoenix arisen from the ashes?," Scientometrics, Springer;Akadémiai Kiadó, vol. 108(3), pages 1637-1647, September.
    10. Lutz Bornmann & Irina Nast & Hans-Dieter Daniel, 2008. "Do editors and referees look for signs of scientific misconduct when reviewing manuscripts? A quantitative content analysis of studies that examined review criteria and reasons for accepting and rejec," Scientometrics, Springer;Akadémiai Kiadó, vol. 77(3), pages 415-432, December.
    11. Richard Van Noorden & Brendan Maher & Regina Nuzzo, 2014. "The top 100 papers," Nature, Nature, vol. 514(7524), pages 550-553, October.
    12. Editors The, 2008. "Content," Basic Income Studies, De Gruyter, vol. 3(3), pages 1-1, December.
    13. Stefanie Haustein & Isabella Peters & Judit Bar-Ilan & Jason Priem & Hadas Shema & Jens Terliesner, 2014. "Coverage and adoption of altmetrics sources in the bibliometric community," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(2), pages 1145-1163, November.
    14. Nabeil Maflahi & Mike Thelwall, 2016. "When are readership counts as useful as citation counts? Scopus versus Mendeley for LIS journals," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(1), pages 191-199, January.
    15. Xiaodan Zhu & Peter Turney & Daniel Lemire & André Vellino, 2015. "Measuring academic influence: Not all citations are equal," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(2), pages 408-427, February.
    16. Bornmann, Lutz & Haunschild, Robin, 2015. "Which people use which scientific papers? An evaluation of data from F1000 and Mendeley," Journal of Informetrics, Elsevier, vol. 9(3), pages 477-487.
    17. Olgica Nedić & Aleksandar Dekanski, 2016. "Priority criteria in peer review of scientific articles," Scientometrics, Springer;Akadémiai Kiadó, vol. 107(1), pages 15-26, April.
    18. Bornmann, Lutz, 2014. "Do altmetrics point to the broader impact of research? An overview of benefits and disadvantages of altmetrics," Journal of Informetrics, Elsevier, vol. 8(4), pages 895-903.
    19. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    20. Editors The, 2008. "Content," Basic Income Studies, De Gruyter, vol. 3(1), pages 1-1, July.
    21. Martin Ricker, 2017. "Letter to the Editor: About the quality and impact of scientific articles," Scientometrics, Springer;Akadémiai Kiadó, vol. 111(3), pages 1851-1855, June.
    22. Ehsan Mohammadi & Mike Thelwall & Kayvan Kousha, 2016. "Can Mendeley bookmarks reflect readership? A survey of user motivations," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(5), pages 1198-1209, May.
    23. Natsuo Onodera & Fuyuki Yoshikane, 2015. "Factors affecting citation rates of research articles," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(4), pages 739-764, April.
    24. Lutz Bornmann & Robin Haunschild, 2017. "Does evaluative scientometrics lose its main focus on scientific quality by the new orientation towards societal impact?," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(2), pages 937-943, February.
    25. Giovanni Abramo & Ciriaco Andrea D’Angelo & Flavia Di Costa, 2010. "Citations versus journal impact factor as proxy of quality: could the latter ever be preferable?," Scientometrics, Springer;Akadémiai Kiadó, vol. 84(3), pages 821-833, September.
    26. Ehsan Mohammadi & Mike Thelwall & Stefanie Haustein & Vincent Larivière, 2015. "Who reads research articles? An altmetrics analysis of Mendeley user categories," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(9), pages 1832-1846, September.
    27. Lutz Bornmann & Hans-Dieter Daniel, 2005. "Does the h-index for ranking of scientists really work?," Scientometrics, Springer;Akadémiai Kiadó, vol. 65(3), pages 391-392, December.
    28. Editors The, 2008. "Content," Basic Income Studies, De Gruyter, vol. 3(2), pages 1-1, November.
    29. Dag W Aksnes, 2003. "Characteristics of highly cited papers," Research Evaluation, Oxford University Press, vol. 12(3), pages 159-170, December.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Mingyang Wang & Zhenyu Wang & Guangsheng Chen, 2019. "Which can better predict the future success of articles? Bibliometric indices or alternative metrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(3), pages 1575-1595, June.
    2. Giovanni Abramo & Ciriaco Andrea D’Angelo & Emanuela Reale, 2019. "Peer review versus bibliometrics: Which method better predicts the scholarly impact of publications?," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 537-554, October.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Jerome K. Vanclay, 2012. "Impact factor: outdated artefact or stepping-stone to journal certification?," Scientometrics, Springer;Akadémiai Kiadó, vol. 92(2), pages 211-238, August.
    2. Louis Mesnard, 2010. "On Hochberg et al.’s “The tragedy of the reviewer commons”," Scientometrics, Springer;Akadémiai Kiadó, vol. 84(3), pages 903-917, September.
    3. Mojisola Erdt & Aarthy Nagarajan & Sei-Ching Joanna Sin & Yin-Leng Theng, 2016. "Altmetrics: an analysis of the state-of-the-art in measuring research impact on social media," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(2), pages 1117-1166, November.
    4. Lutz Bornmann & Christophe Weymuth & Hans-Dieter Daniel, 2010. "A content analysis of referees’ comments: how do comments on manuscripts rejected by a high-impact journal and later published in either a low- or high-impact journal differ?," Scientometrics, Springer;Akadémiai Kiadó, vol. 83(2), pages 493-506, May.
    5. Zhiqi Wang & Wolfgang Glänzel & Yue Chen, 2020. "The impact of preprints in Library and Information Science: an analysis of citations, usage and social attention indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 1403-1423, November.
    6. McAleer, M.J. & Oláh, J. & Popp, J., 2018. "Pros and Cons of the Impact Factor in a Rapidly Changing Digital World," Econometric Institute Research Papers EI2018-11, Erasmus University Rotterdam, Erasmus School of Economics (ESE), Econometric Institute.
    7. Lutz Bornmann & Markus Wolf & Hans-Dieter Daniel, 2012. "Closed versus open reviewing of journal manuscripts: how far do comments differ in language use?," Scientometrics, Springer;Akadémiai Kiadó, vol. 91(3), pages 843-856, June.
    8. Olgica Nedić & Aleksandar Dekanski, 2016. "Priority criteria in peer review of scientific articles," Scientometrics, Springer;Akadémiai Kiadó, vol. 107(1), pages 15-26, April.
    9. Lutz Bornmann, 2013. "Research Misconduct—Definitions, Manifestations and Extent," Publications, MDPI, vol. 1(3), pages 1-12, October.
    10. Pardeep Sud & Mike Thelwall, 2014. "Evaluating altmetrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(2), pages 1131-1143, February.
    11. Embiya Celik & Nuray Gedik & Güler Karaman & Turgay Demirel & Yuksel Goktas, 2014. "Mistakes encountered in manuscripts on education and their effects on journal rejections," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(3), pages 1837-1853, March.
    12. Mario Paolucci & Francisco Grimaldo, 2014. "Mechanism change in a simulation of peer review: from junk support to elitism," Scientometrics, Springer;Akadémiai Kiadó, vol. 99(3), pages 663-688, June.
    13. Mike Thelwall, 2017. "Are Mendeley reader counts useful impact indicators in all fields?," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(3), pages 1721-1731, December.
    14. Mike Thelwall, 2018. "Differences between journals and years in the proportions of students, researchers and faculty registering Mendeley articles," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(2), pages 717-729, May.
    15. Ortega, José Luis, 2018. "The life cycle of altmetric impact: A longitudinal study of six metrics from PlumX," Journal of Informetrics, Elsevier, vol. 12(3), pages 579-589.
    16. Dag W. Aksnes & Liv Langfeldt & Paul Wouters, 2019. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories," SAGE Open, , vol. 9(1), pages 21582440198, February.
    17. Chetty, Krish & Aneja, Urvashi & Mishra, Vidisha & Gcora, Nozibele & Josie, Jaya, 2018. "Bridging the digital divide in the G20: Skills for the new age," Economics - The Open-Access, Open-Assessment E-Journal (2007-2020), Kiel Institute for the World Economy (IfW Kiel), vol. 12, pages 1-20.
    18. SeungGwan Lee & DaeHo Lee, 2018. "A personalized channel recommendation and scheduling system considering both section video clips and full video clips," PLOS ONE, Public Library of Science, vol. 13(7), pages 1-14, July.
    19. Caroline M. Hoxby, 2018. "The Productivity of US Postsecondary Institutions," NBER Chapters, in: Productivity in Higher Education, pages 31-66, National Bureau of Economic Research, Inc.
    20. Xiomara S. Q. Chacon & Thiago C. Silva & Diego R. Amancio, 2020. "Comparing the impact of subfields in scientific journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 625-639, October.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:115:y:2018:i:1:d:10.1007_s11192-018-2669-y. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.