
How accurate are policy document mentions? A first look at the role of altmetrics database

Author

Listed:
  • Houqiang Yu

    (Nanjing University of Science and Technology)

  • Xueting Cao

    (Nanjing University of Science and Technology)

  • Tingting Xiao

    (Nanjing Library)

  • Zhenyi Yang

    (Nanjing University of Science and Technology)

Abstract

Policy document mentions are considered to indicate the significance and societal impact of scientific products. However, the accuracy of policy document altmetrics data needs to be evaluated to fully understand its strengths and limitations. An in-depth coding analysis was conducted on sample policy document records from the Altmetric.com database. The sample consists of 2079 records from all 79 distinct policy document source platforms tracked by the database. Errors concerning the mentioned publications within the policy documents themselves (type A errors) are found in 8% of the records, while errors concerning either the recorded policy documents or the mentioned publications in the altmetrics database (type B errors) are found in 70% of the records. Within type B, policy document link errors (5% of the records) can be attributed to the policy document websites, and transcription errors (52% of the records) can be attributed to the third-party bibliographic data provider. These two categories of error are relatively minor and may have limited influence on altmetrics research and practice. False positive policy document mentions (13% of the records), however, can be attributed to the Altmetric database itself and may diminish the validity of research based on policy document altmetrics data. The underlying reasons remain to be investigated. Considering the high complexity of extracting mentions of publications from the varied sources and formats of policy documents, as well as its short history, the Altmetric database has achieved excellent performance.
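The error typology above reduces to a simple tally over manually coded records. The following Python sketch is illustrative only: the record structure, field names, and toy data are assumptions rather than the authors' actual coding scheme or dataset; it merely shows how shares such as 8% (type A) or 70% (type B overall) would be computed from a coded sample.

    from collections import Counter

    # Hypothetical coded records: each entry stands for one sampled policy
    # document mention tracked by Altmetric.com, labelled during manual coding.
    # Field names and values are illustrative assumptions, not the study's data.
    sample_records = [
        {"platform": "Platform 1", "error": "none"},
        {"platform": "Platform 2", "error": "type_A"},                 # error inside the policy document itself
        {"platform": "Platform 3", "error": "type_B_link"},            # wrong or dead policy document link
        {"platform": "Platform 4", "error": "type_B_transcription"},   # bibliographic transcription error
        {"platform": "Platform 5", "error": "type_B_false_positive"},  # mention not actually present
    ]

    def error_shares(records):
        """Return each error label's share of all coded records."""
        counts = Counter(r["error"] for r in records)
        total = len(records)
        return {label: n / total for label, n in counts.items()}

    for label, share in sorted(error_shares(sample_records).items()):
        print(f"{label}: {share:.0%}")

Applied to the authors' sample of 2079 records, this kind of tally yields the shares reported in the abstract: 8% type A errors and 70% type B errors, the latter decomposing into link errors (5%), transcription errors (52%), and false positive mentions (13%).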

Suggested Citation

  • Houqiang Yu & Xueting Cao & Tingting Xiao & Zhenyi Yang, 2020. "How accurate are policy document mentions? A first look at the role of altmetrics database," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 1517-1540, November.
  • Handle: RePEc:spr:scient:v:125:y:2020:i:2:d:10.1007_s11192-020-03558-7
    DOI: 10.1007/s11192-020-03558-7

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-020-03558-7
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-020-03558-7?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Houqiang Yu, 2017. "Context of altmetrics data matters: an investigation of count type and user category," Scientometrics, Springer;Akadémiai Kiadó, vol. 111(1), pages 267-283, April.
    2. Wang, Qi & Waltman, Ludo, 2016. "Large-scale analysis of the accuracy of the journal classification systems of Web of Science and Scopus," Journal of Informetrics, Elsevier, vol. 10(2), pages 347-364.
    3. José Luis Ortega, 2018. "Reliability and accuracy of altmetric providers: a comparison among Altmetric.com, PlumX and Crossref Event Data," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(3), pages 2123-2138, September.
    4. Anne-Wil Harzing & Satu Alakangas, 2016. "Google Scholar, Scopus and the Web of Science: a longitudinal and cross-disciplinary comparison," Scientometrics, Springer;Akadémiai Kiadó, vol. 106(2), pages 787-804, February.
    5. Zhichao Fang & Rodrigo Costas, 2020. "Studying the accumulation velocity of altmetric data tracked by Altmetric.com," Scientometrics, Springer;Akadémiai Kiadó, vol. 123(2), pages 1077-1101, May.
    6. Christine Meschede & Tobias Siebenlist, 2018. "Cross-metric compatability and inconsistencies of altmetrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(1), pages 283-297, April.
    7. Éric Archambault & David Campbell & Yves Gingras & Vincent Larivière, 2009. "Comparing bibliometric statistics obtained from the Web of Science and Scopus," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 60(7), pages 1320-1326, July.
    8. Bornmann, Lutz & Haunschild, Robin, 2015. "Which people use which scientific papers? An evaluation of data from F1000 and Mendeley," Journal of Informetrics, Elsevier, vol. 9(3), pages 477-487.
    9. Lutz Bornmann & Robin Haunschild & Werner Marx, 2016. "Policy documents as sources for measuring societal impact: how often is climate change research mentioned in policy-related documents?," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 1477-1495, December.
    10. Hanan Khazragui & John Hudson, 2015. "Measuring the benefits of university research: impact and the REF in the UK," Research Evaluation, Oxford University Press, vol. 24(1), pages 51-62.
    11. Lokman I. Meho & Kiduk Yang, 2007. "Impact of data sources on citation counts and rankings of LIS faculty: Web of Science versus Scopus and Google Scholar," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 58(13), pages 2105-2125, November.
    12. Fiorenzo Franceschini & Domenico Maisano & Luca Mastrogiacomo, 2016. "Do Scopus and WoS correct “old” omitted citations?," Scientometrics, Springer;Akadémiai Kiadó, vol. 107(2), pages 321-335, May.
    13. Paul Donner, 2017. "Document type assignment accuracy in the journal citation index data of Web of Science," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(1), pages 219-236, October.
    14. Ad A.M. Prins & Rodrigo Costas & Thed N. van Leeuwen & Paul F. Wouters, 2016. "Using Google Scholar in research evaluation of humanities and social science programs: A comparison with Web of Science data," Research Evaluation, Oxford University Press, vol. 25(3), pages 264-270.
    15. Zahedi, Zohreh & Haustein, Stefanie, 2018. "On the relationships between bibliographic characteristics of scientific documents and citation and Mendeley readership counts: A large-scale analysis of Web of Science publications," Journal of Informetrics, Elsevier, vol. 12(1), pages 191-202.
    16. Robin Haunschild & Lutz Bornmann, 2017. "How many scientific papers are mentioned in policy-related documents? An empirical investigation using Web of Science and Altmetric data," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(3), pages 1209-1216, March.
    17. Franceschini, Fiorenzo & Maisano, Domenico & Mastrogiacomo, Luca, 2016. "The museum of errors/horrors in Scopus," Journal of Informetrics, Elsevier, vol. 10(1), pages 174-182.
    18. Franceschini, Fiorenzo & Maisano, Domenico & Mastrogiacomo, Luca, 2014. "Scientific journal publishers and omitted citations in bibliometric databases: Any relationship?," Journal of Informetrics, Elsevier, vol. 8(3), pages 751-765.
    19. Kuku Joseph Aduku & Mike Thelwall & Kayvan Kousha, 2017. "Do Mendeley reader counts reflect the scholarly impact of conference papers? An investigation of computer science and engineering," Scientometrics, Springer;Akadémiai Kiadó, vol. 112(1), pages 573-581, July.
    20. Franceschini, Fiorenzo & Maisano, Domenico & Mastrogiacomo, Luca, 2016. "Empirical analysis and classification of database errors in Scopus and Web of Science," Journal of Informetrics, Elsevier, vol. 10(4), pages 933-953.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Houqiang Yu & Biegzat Murat & Longfei Li & Tingting Xiao, 2021. "How accurate are Twitter and Facebook altmetrics data? A comparative content analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(5), pages 4437-4463, May.
    2. Houqiang Yu & Xinyun Yu & Xueting Cao, 2022. "How accurate are news mentions of scholarly output? A content analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(7), pages 4075-4096, July.
    3. Zhihong Huang & Qianjin Zong & Xuerui Ji, 2022. "The associations between scientific collaborations of LIS research and its policy impact," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(11), pages 6453-6470, November.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Shirley Ainsworth & Jane M. Russell, 2018. "Has hosting on science direct improved the visibility of Latin American scholarly journals? A preliminary analysis of data quality," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(3), pages 1463-1484, June.
    2. Franceschini, Fiorenzo & Maisano, Domenico & Mastrogiacomo, Luca, 2016. "Empirical analysis and classification of database errors in Scopus and Web of Science," Journal of Informetrics, Elsevier, vol. 10(4), pages 933-953.
    3. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    4. Raminta Pranckutė, 2021. "Web of Science (WoS) and Scopus: The Titans of Bibliographic Information in Today’s Academic World," Publications, MDPI, vol. 9(1), pages 1-59, March.
    5. Thelwall, Mike, 2018. "Microsoft Academic automatic document searches: Accuracy for journal articles and suitability for citation analysis," Journal of Informetrics, Elsevier, vol. 12(1), pages 1-9.
    6. Sergio Copiello, 2019. "The open access citation premium may depend on the openness and inclusiveness of the indexing database, but the relationship is controversial because it is ambiguous where the open access boundary lie," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(2), pages 995-1018, November.
    7. Bornmann, Lutz & Haunschild, Robin & Adams, Jonathan, 2019. "Do altmetrics assess societal impact in a comparable way to case studies? An empirical test of the convergent validity of altmetrics based on data from the UK research excellence framework (REF)," Journal of Informetrics, Elsevier, vol. 13(1), pages 325-340.
    8. Michael Gusenbauer, 2022. "Search where you will find most: Comparing the disciplinary coverage of 56 bibliographic databases," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2683-2745, May.
    9. Zhichao Fang & Rodrigo Costas & Wencan Tian & Xianwen Wang & Paul Wouters, 2020. "An extensive analysis of the presence of altmetric data for Web of Science publications across subject fields and research topics," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(3), pages 2519-2549, September.
    10. Houqiang Yu & Xinyun Yu & Xueting Cao, 2022. "How accurate are news mentions of scholarly output? A content analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(7), pages 4075-4096, July.
    11. Martín-Martín, Alberto & Orduna-Malea, Enrique & Thelwall, Mike & Delgado López-Cózar, Emilio, 2018. "Google Scholar, Web of Science, and Scopus: A systematic comparison of citations in 252 subject categories," Journal of Informetrics, Elsevier, vol. 12(4), pages 1160-1177.
    12. Martin-Martin, Alberto & Orduna-Malea, Enrique & Harzing, Anne-Wil & Delgado López-Cózar, Emilio, 2017. "Can we use Google Scholar to identify highly-cited documents?," Journal of Informetrics, Elsevier, vol. 11(1), pages 152-163.
    13. Abdelghani Maddi & Lesya Baudoin, 2022. "The quality of the web of science data: a longitudinal study on the completeness of authors-addresses links," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(11), pages 6279-6292, November.
    14. Gerson Pech & Catarina Delgado, 2020. "Percentile and stochastic-based approach to the comparison of the number of citations of articles indexed in different bibliographic databases," Scientometrics, Springer;Akadémiai Kiadó, vol. 123(1), pages 223-252, April.
    15. Maor Weinberger & Maayan Zhitomirsky-Geffet, 2021. "Diversity of success: measuring the scholarly performance diversity of tenured professors in the Israeli academia," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(4), pages 2931-2970, April.
    16. Tanya Araújo & Elsa Fontainha, 2018. "Are scientific memes inherited differently from gendered authorship?," Scientometrics, Springer;Akadémiai Kiadó, vol. 117(2), pages 953-972, November.
    17. Gerson Pech & Catarina Delgado, 2020. "Assessing the publication impact using citation data from both Scopus and WoS databases: an approach validated in 15 research fields," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 909-924, November.
    18. António Correia & Hugo Paredes & Benjamim Fonseca, 2018. "Scientometric analysis of scientific publications in CSCW," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(1), pages 31-89, January.
    19. Shir Aviv-Reuven & Ariel Rosenfeld, 2023. "A logical set theory approach to journal subject classification analysis: intra-system irregularities and inter-system discrepancies in Web of Science and Scopus," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(1), pages 157-175, January.
    20. Saïd Echchakoui, 2020. "Why and how to merge Scopus and Web of Science during bibliometric analysis: the case of sales force literature from 1912 to 2019," Journal of Marketing Analytics, Palgrave Macmillan, vol. 8(3), pages 165-184, September.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:125:y:2020:i:2:d:10.1007_s11192-020-03558-7. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form .

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.