Printed from https://ideas.repec.org/a/gam/jpubli/v8y2020i2p18-d339970.html

Three Commonly Utilized Scholarly Databases and a Social Network Site Provide Different, But Related, Metrics of Pharmacy Faculty Publication

Author

Listed:
  • Kyle J. Burghardt

    (Wayne State University Eugene Applebaum College of Pharmacy and Health Sciences, 259 Mack Avenue, Suite 2190, Detroit, MI 48202, USA)

  • Bradley H. Howlett

    (Wayne State University Eugene Applebaum College of Pharmacy and Health Sciences, 259 Mack Avenue, Suite 2190, Detroit, MI 48202, USA)

  • Audrey S. Khoury

    (Wayne State University Eugene Applebaum College of Pharmacy and Health Sciences, 259 Mack Avenue, Suite 2190, Detroit, MI 48202, USA)

  • Stephanie M. Fern

    (Wayne State University Eugene Applebaum College of Pharmacy and Health Sciences, 259 Mack Avenue, Suite 2190, Detroit, MI 48202, USA)

  • Paul R. Burghardt

    (Wayne State University Nutrition and Food Sciences, Science Hall, 5045 Cass Ave., Detroit, MI 48202, USA)

Abstract

Scholarly productivity is a critical component of pharmacy faculty effort and is used for promotion and tenure decisions. Several databases are available to measure scholarly productivity; however, comparisons amongst these databases are lacking for pharmacy faculty. The objective of this work was to compare scholarly metrics from three commonly utilized databases and a social networking site, focusing on data from research-intensive colleges of pharmacy, and to identify factors associated with database differences. Scholarly metrics were obtained from Scopus, Web of Science, Google Scholar, and ResearchGate for faculty from research-intensive (Carnegie Rated R1, R2, or special focus) United States pharmacy schools with at least two million USD in funding from the National Institutes of Health. Metrics were compared and correlations were performed. Regression analyses were utilized to identify factors associated with database differences. Significant differences in scholarly metric values were observed between databases despite the high correlations, suggesting systematic variation in database reporting. Time since first publication was the factor most commonly associated with database differences. Google Scholar tended to have higher metrics than all other databases, while Web of Science had lower metrics relative to other databases. Differences in reported metrics between databases are apparent, which may be attributable to the time since first publication and database coverage of pharmacy-specific journals. These differences should be considered by faculty, reviewers, and administrative staff when evaluating scholarly performance.
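
The abstract describes a three-step comparison: collect per-faculty metrics from each source, test whether the sources agree in ranking (correlation) yet differ in level (paired comparison), and regress between-database gaps on candidate explanatory factors such as time since first publication. The sketch below illustrates that kind of workflow in Python with pandas and SciPy; the column names, the five-row dataset, and the specific choice of Spearman correlation, Wilcoxon signed-rank tests, and simple linear regression are illustrative assumptions, not the authors' actual data or analysis code.

```python
# Minimal sketch (not the authors' code) of the comparison described above,
# using hypothetical per-faculty h-index values from each source.
import pandas as pd
from scipy.stats import spearmanr, wilcoxon, linregress

# Hypothetical data: one row per faculty member, h-index as reported by each
# database, plus years since first publication (all values illustrative only).
df = pd.DataFrame({
    "scopus":                [12, 25,  8, 30, 17],
    "web_of_science":        [10, 22,  7, 27, 15],
    "google_scholar":        [15, 31, 11, 36, 21],
    "researchgate":          [13, 27,  9, 32, 18],
    "years_since_first_pub": [ 8, 20,  5, 25, 12],
})

sources = ["scopus", "web_of_science", "google_scholar", "researchgate"]

# Pairwise comparison: rank correlation (are the databases related?) and a
# paired Wilcoxon signed-rank test (do their reported values differ?).
for i, a in enumerate(sources):
    for b in sources[i + 1:]:
        rho, rho_p = spearmanr(df[a], df[b])
        stat, diff_p = wilcoxon(df[a], df[b])
        print(f"{a} vs {b}: rho={rho:.2f} (p={rho_p:.3f}), "
              f"Wilcoxon p={diff_p:.3f}")

# Regression: does time since first publication predict the size of the
# between-database gap (here, Google Scholar minus Web of Science)?
gap = df["google_scholar"] - df["web_of_science"]
fit = linregress(df["years_since_first_pub"], gap)
print(f"slope={fit.slope:.2f}, r^2={fit.rvalue**2:.2f}, p={fit.pvalue:.3f}")
```

A non-parametric paired test is used in the sketch because citation-based metrics are typically skewed, and the quantity of interest is the within-faculty difference between databases rather than the absolute level of either one.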

Suggested Citation

  • Kyle J. Burghardt & Bradley H. Howlett & Audrey S. Khoury & Stephanie M. Fern & Paul R. Burghardt, 2020. "Three Commonly Utilized Scholarly Databases and a Social Network Site Provide Different, But Related, Metrics of Pharmacy Faculty Publication," Publications, MDPI, vol. 8(2), pages 1-10, April.
  • Handle: RePEc:gam:jpubli:v:8:y:2020:i:2:p:18-:d:339970

    Download full text from publisher

    File URL: https://www.mdpi.com/2304-6775/8/2/18/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2304-6775/8/2/18/
    Download Restriction: no

    References listed on IDEAS

    1. Hamid R. Jamali, 2017. "Copyright compliance and infringement in ResearchGate full-text journal articles," Scientometrics, Springer;Akadémiai Kiadó, vol. 112(1), pages 241-254, July.
    2. Christian Pieter Hoffmann & Christoph Lutz & Miriam Meckel, 2016. "A relational altmetric? Network centrality on ResearchGate as an indicator of scientific impact," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(4), pages 765-775, April.
    3. Anne-Wil Harzing & Satu Alakangas, 2016. "Google Scholar, Scopus and the Web of Science: a longitudinal and cross-disciplinary comparison," Scientometrics, Springer;Akadémiai Kiadó, vol. 106(2), pages 787-804, February.
    4. Pedro Albarrán & Juan A. Crespo & Ignacio Ortuño & Javier Ruiz-Castillo, 2011. "The skewness of science in 219 sub-fields and a number of aggregates," Scientometrics, Springer;Akadémiai Kiadó, vol. 88(2), pages 385-397, August.
    5. Michael Gusenbauer, 2019. "Google Scholar to overshadow them all? Comparing the sizes of 12 academic search engines and bibliographic databases," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(1), pages 177-214, January.
    6. Anne-Wil Harzing, 2019. "Two new kids on the block: How do Crossref and Dimensions compare with Google Scholar, Microsoft Academic, Scopus and the Web of Science?," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(1), pages 341-349, July.
    7. Martín-Martín, Alberto & Orduna-Malea, Enrique & Thelwall, Mike & Delgado López-Cózar, Emilio, 2018. "Google Scholar, Web of Science, and Scopus: A systematic comparison of citations in 252 subject categories," Journal of Informetrics, Elsevier, vol. 12(4), pages 1160-1177.
    8. Katrin Weller, 2015. "Social Media and Altmetrics: An Overview of Current Alternative Approaches to Measuring Scholarly Impact," Springer Books, in: Isabell M. Welpe & Jutta Wollersheim & Stefanie Ringelhan & Margit Osterloh (ed.), Incentives and Performance, edition 127, pages 261-276, Springer.
    9. Adam Dinsmore & Liz Allen & Kevin Dolby, 2014. "Alternative Perspectives on Impact: The Potential of ALMs and Altmetrics to Inform Funders about Research Impact," PLOS Biology, Public Library of Science, vol. 12(11), pages 1-4, November.
    10. Björn Hammarfelt & Alexander D. Rushforth, 2017. "Indicators as judgment devices: An empirical study of citizen bibliometrics in research evaluation," Research Evaluation, Oxford University Press, vol. 26(3), pages 169-180.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Raminta Pranckutė, 2021. "Web of Science (WoS) and Scopus: The Titans of Bibliographic Information in Today’s Academic World," Publications, MDPI, vol. 9(1), pages 1-59, March.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Michael Gusenbauer, 2022. "Search where you will find most: Comparing the disciplinary coverage of 56 bibliographic databases," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2683-2745, May.
    2. Vivek Kumar Singh & Satya Swarup Srichandan & Hiran H. Lathabai, 2022. "ResearchGate and Google Scholar: how much do they differ in publications, citations and different metrics and why?," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(3), pages 1515-1542, March.
    3. Raminta Pranckutė, 2021. "Web of Science (WoS) and Scopus: The Titans of Bibliographic Information in Today’s Academic World," Publications, MDPI, vol. 9(1), pages 1-59, March.
    4. Vivek Kumar Singh & Prashasti Singh & Mousumi Karmakar & Jacqueline Leta & Philipp Mayr, 2021. "The journal coverage of Web of Science, Scopus and Dimensions: A comparative analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(6), pages 5113-5142, June.
    5. Gerson Pech & Catarina Delgado, 2020. "Assessing the publication impact using citation data from both Scopus and WoS databases: an approach validated in 15 research fields," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 909-924, November.
    6. Steve J. Bickley & Ho Fai Chan & Benno Torgler, 2022. "Artificial intelligence in the field of economics," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(4), pages 2055-2084, April.
    7. Alberto Martín-Martín & Mike Thelwall & Enrique Orduna-Malea & Emilio Delgado López-Cózar, 2021. "Google Scholar, Microsoft Academic, Scopus, Dimensions, Web of Science, and OpenCitations’ COCI: a multidisciplinary comparison of coverage via citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 871-906, January.
    8. Cristina López-Duarte & Jane F. Maley & Marta M. Vidal-Suárez, 2021. "Main challenges to international student mobility in the European arena," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(11), pages 8957-8980, November.
    9. Zhentao Liang & Jin Mao & Kun Lu & Gang Li, 2021. "Finding citations for PubMed: a large-scale comparison between five freely available bibliographic data sources," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(12), pages 9519-9542, December.
    10. Weisheng Chiu & Thomas Chun Man Fan & Sang-Back Nam & Ping-Hung Sun, 2021. "Knowledge Mapping and Sustainable Development of eSports Research: A Bibliometric and Visualized Analysis," Sustainability, MDPI, vol. 13(18), pages 1-17, September.
    11. Sergio Copiello, 2019. "Research Interest: another undisclosed (and redundant) algorithm by ResearchGate," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(1), pages 351-360, July.
    12. Sergio Copiello, 2019. "The open access citation premium may depend on the openness and inclusiveness of the indexing database, but the relationship is controversial because it is ambiguous where the open access boundary lies," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(2), pages 995-1018, November.
    13. Paúl Carrión-Mero & Néstor Montalván-Burbano & Fernando Morante-Carballo & Adolfo Quesada-Román & Boris Apolo-Masache, 2021. "Worldwide Research Trends in Landslide Science," IJERPH, MDPI, vol. 18(18), pages 1-24, September.
    14. Cristina López-Duarte & Marta M. Vidal-Suárez & Belén González-Díaz, 2018. "The early adulthood of the Asia Pacific Journal of Management: A literature review 2005–2014," Asia Pacific Journal of Management, Springer, vol. 35(2), pages 313-345, June.
    15. Anne-Wil Harzing, 2019. "Two new kids on the block: How do Crossref and Dimensions compare with Google Scholar, Microsoft Academic, Scopus and the Web of Science?," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(1), pages 341-349, July.
    16. Lina M. Cortés & Andrés Mora-Valencia & Javier Perote, 2016. "The productivity of top researchers: a semi-nonparametric approach," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(2), pages 891-915, November.
    17. Łukasz Wiechetek & Zbigniew Pastuszak, 2022. "Academic social networks metrics: an effective indicator for university performance?," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(3), pages 1381-1401, March.
    18. Sergio Copiello & Pietro Bonifaci, 2019. "ResearchGate Score, full-text research items, and full-text reads: a follow-up study," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(2), pages 1255-1262, May.
    19. Gerson Pech & Catarina Delgado, 2020. "Percentile and stochastic-based approach to the comparison of the number of citations of articles indexed in different bibliographic databases," Scientometrics, Springer;Akadémiai Kiadó, vol. 123(1), pages 223-252, April.
    20. Gordana Budimir & Sophia Rahimeh & Sameh Tamimi & Primož Južnič, 2021. "Comparison of self-citation patterns in WoS and Scopus databases based on national scientific production in Slovenia (1996–2020)," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(3), pages 2249-2267, March.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jpubli:v:8:y:2020:i:2:p:18-:d:339970. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.