Printed from https://ideas.repec.org/a/spr/reihed/v64y2023i7d10.1007_s11162-023-09735-w.html

Use of Latent Profile Analysis to Model the Translation of University Research into Health Practice and Policy: Exploration of Proposed Metrics

Author

Listed:
  • Marlo M. Vernon

    (Georgia Cancer Center, Augusta University)

  • Frances M. Yang

    (University of Kansas Medical Center)

Abstract

The aim of this study is to profile academic institutions (n = 127) based on publications, citations in the top 10% of journals, patent citations in Food and Drug Administration (FDA) approvals, clinical trials with uploaded results, contributions to clinical practice guidelines, awarded patents, start-ups, and licenses generating income, as reported in response to the Association of University Technology Managers (AUTM) Licensing Activity Survey: Fiscal Years 2011–2015. Latent variable modeling (LVM) was conducted in Mplus v.8.1; specifically, latent profile analysis (LPA) was used to identify institutional profiles of research, which were compared with the 2015 Carnegie Classification System ranks. Multivariate regression of profile assignment on research expenditure and income generated by licensure was used to establish concurrent validity. The LPA yielded a three-profile solution as the most parsimonious model. A Mantel-Haenszel test of trend against the Carnegie Classification found a positive and significant association among institution rankings (r = 0.492, χ²(1) = 26.69, p
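
This record does not include the authors' Mplus syntax. As a rough, illustrative analogue only, the sketch below approximates the two key analytic steps in Python: latent profile analysis via a diagonal-covariance Gaussian mixture (scikit-learn's GaussianMixture), with the number of profiles chosen by BIC, and a Mantel-Haenszel (linear-by-linear) test of trend computed as chi-square = (N - 1) * r² against an ordinal Carnegie rank. The indicator set, the synthetic data, and the 1-4 profile search range are assumptions made for illustration, not the study's data or exact procedure.

    # Illustrative sketch only: the paper's analysis was run in Mplus v.8.1; this
    # approximates LPA with a diagonal-covariance Gaussian mixture and a
    # hand-computed Mantel-Haenszel trend statistic.
    import numpy as np
    from sklearn.mixture import GaussianMixture
    from scipy.stats import chi2

    rng = np.random.default_rng(0)

    # Placeholder institution-level indicators (n = 127); in practice these would be
    # the standardized translation metrics listed in the abstract.
    n = 127
    X = rng.normal(size=(n, 8))  # e.g., publications, top-10% citations, FDA patent citations, ...

    # Fit 1-4 profile models; diagonal covariances mirror LPA's local-independence assumption.
    fits = {k: GaussianMixture(n_components=k, covariance_type="diag",
                               n_init=20, random_state=0).fit(X)
            for k in range(1, 5)}
    bic = {k: m.bic(X) for k, m in fits.items()}
    best_k = min(bic, key=bic.get)           # most parsimonious model by BIC
    profiles = fits[best_k].predict(X) + 1   # modal profile assignment, 1..best_k

    # Mantel-Haenszel (linear-by-linear) test of trend between ordinal profile
    # assignment and an ordinal Carnegie Classification rank (placeholder ranks here).
    carnegie_rank = rng.integers(1, 4, size=n)
    r = np.corrcoef(profiles, carnegie_rank)[0, 1]
    chi2_trend = (n - 1) * r ** 2            # 1 degree of freedom
    p_value = chi2.sf(chi2_trend, df=1)
    print(f"best k = {best_k}; trend: r = {r:.3f}, chi2(1) = {chi2_trend:.2f}, p = {p_value:.4g}")

A concurrent-validity check along the lines described in the abstract could then regress profile assignment (or profile-specific indicator means) on research expenditure and licensing income, for example with statsmodels; that step is omitted from this sketch.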

Suggested Citation

  • Marlo M. Vernon & Frances M. Yang, 2023. "Use of Latent Profile Analysis to Model the Translation of University Research into Health Practice and Policy: Exploration of Proposed Metrics," Research in Higher Education, Springer;Association for Institutional Research, vol. 64(7), pages 1058-1070, November.
  • Handle: RePEc:spr:reihed:v:64:y:2023:i:7:d:10.1007_s11162-023-09735-w
    DOI: 10.1007/s11162-023-09735-w

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11162-023-09735-w
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11162-023-09735-w?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Lutz Bornmann & Rüdiger Mutz & Hans‐Dieter Daniel, 2008. "Are there better indices for evaluation purposes than the h index? A comparison of nine different variants of the h index using data from biomedicine," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 59(5), pages 830-837, March.
    2. Lutz Bornmann, 2013. "What is societal impact of research and how can it be assessed? a literature survey," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 64(2), pages 217-233, February.
    3. Marlo M Vernon & E Andrew Balas & Shaher Momani, 2018. "Are university rankings useful to improve research? A systematic review," PLOS ONE, Public Library of Science, vol. 13(3), pages 1-15, March.
    4. Carolyn J. Heinrich & Gerald Marschke, 2010. "Incentives and their dynamics in public sector performance management systems," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 29(1), pages 183-208.
    5. Isidro F. Aguillo & Judit Bar-Ilan & Mark Levene & José Luis Ortega, 2010. "Comparing university rankings," Scientometrics, Springer;Akadémiai Kiadó, vol. 85(1), pages 243-256, October.
    6. Hicks, Diana, 2012. "Performance-based university research funding systems," Research Policy, Elsevier, vol. 41(2), pages 251-261.
    7. Ludo Waltman & Michael Schreiber, 2013. "On the calculation of percentile-based bibliometric indicators," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(2), pages 372-379, February.
    8. Diana Hicks & Paul Wouters & Ludo Waltman & Sarah de Rijcke & Ismael Rafols, 2015. "Bibliometrics: The Leiden Manifesto for research metrics," Nature, Nature, vol. 520(7548), pages 429-431, April.
    9. Ruiz-Castillo, Javier & Waltman, Ludo, 2015. "Field-normalized citation impact indicators using algorithmically constructed classification systems of science," Journal of Informetrics, Elsevier, vol. 9(1), pages 102-117.
    10. Stanley Sclove, 1987. "Application of model-selection criteria to some problems in multivariate analysis," Psychometrika, Springer;The Psychometric Society, vol. 52(3), pages 333-343, September.
    11. Jean O. Lanjouw & Mark Schankerman, 2004. "Patent Quality and Research Productivity: Measuring Innovation with Multiple Indicators," Economic Journal, Royal Economic Society, vol. 114(495), pages 441-465, April.
    12. Janet Bercovitz & Maryann Feldman, 2006. "Entpreprenerial Universities and Technology Transfer: A Conceptual Framework for Understanding Knowledge-Based Economic Development," The Journal of Technology Transfer, Springer, vol. 31(1), pages 175-188, January.
    13. Rasmussen, Einar & Borch, Odd Jarl, 2010. "University capabilities in facilitating entrepreneurship: A longitudinal study of spin-off ventures at mid-range universities," Research Policy, Elsevier, vol. 39(5), pages 602-612, June.
    14. Francis S. Collins & Lawrence A. Tabak, 2014. "Policy: NIH plans to enhance reproducibility," Nature, Nature, vol. 505(7485), pages 612-613, January.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Marlo M. Vernon & C. Makenzie Danley & Frances M. Yang, 2021. "Developing a measure of innovation from research in higher education data," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(5), pages 3919-3928, May.
    2. Lutz Bornmann & Alexander Tekles & Loet Leydesdorff, 2019. "How well does I3 perform for impact measurement compared to other bibliometric indicators? The convergent validity of several (field-normalized) indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(2), pages 1187-1205, May.
    3. Bornmann, Lutz & Ganser, Christian & Tekles, Alexander, 2022. "Simulation of the h index use at university departments within the bibliometrics-based heuristics framework: Can the indicator be used to compare individual researchers?," Journal of Informetrics, Elsevier, vol. 16(1).
    4. Dag W. Aksnes & Liv Langfeldt & Paul Wouters, 2019. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories," SAGE Open, vol. 9(1), pages 21582440198, February.
    5. Loet Leydesdorff & Lutz Bornmann & Jonathan Adams, 2019. "The integrated impact indicator revisited (I3*): a non-parametric alternative to the journal impact factor," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(3), pages 1669-1694, June.
    6. Yves Fassin, 2020. "The HF-rating as a universal complement to the h-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 965-990, November.
    7. Mike Thelwall, 2019. "The influence of highly cited papers on field normalised indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(2), pages 519-537, February.
    8. Benedetto Lepori & Aldo Geuna & Antonietta Mira, 2019. "Scientific output scales with resources. A comparison of US and European universities," PLOS ONE, Public Library of Science, vol. 14(10), pages 1-18, October.
    9. Lutz Bornmann & Klaus Wohlrabe, 2019. "Normalisation of citation impact in economics," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 841-884, August.
    10. Pech, Gerson & Delgado, Catarina, 2021. "Screening the most highly cited papers in longitudinal bibliometric studies and systematic literature reviews of a research field or journal: Widespread used metrics vs a percentile citation-based app," Journal of Informetrics, Elsevier, vol. 15(3).
    11. Lutz Bornmann & Richard Williams, 2020. "An evaluation of percentile measures of citation impact, and a proposal for making them better," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(2), pages 1457-1478, August.
    12. Qi Wang & Tobias Jeppsson, 2022. "Identifying benchmark units for research management and evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(12), pages 7557-7574, December.
    13. Alonso Rodríguez-Navarro & Ricardo Brito, 2019. "Probability and expected frequency of breakthroughs: basis and use of a robust method of research assessment," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(1), pages 213-235, April.
    14. Dunaiski, Marcel & Geldenhuys, Jaco & Visser, Willem, 2019. "Globalised vs averaged: Bias and ranking performance on the author level," Journal of Informetrics, Elsevier, vol. 13(1), pages 299-313.
    15. Sten F Odenwald, 2020. "A citation study of earth science projects in citizen science," PLOS ONE, Public Library of Science, vol. 15(7), pages 1-26, July.
    16. Catalina Martínez & Valerio Sterzi, 2021. "The impact of the abolishment of the professor’s privilege on European university-owned patents," Industry and Innovation, Taylor & Francis Journals, vol. 28(3), pages 247-282, March.
    17. Stephen, Dimity & Stahlschmidt, Stephan, 2021. "Performance and structures of the German science system 2021," Studien zum deutschen Innovationssystem 5-2021, Expertenkommission Forschung und Innovation (EFI) - Commission of Experts for Research and Innovation, Berlin.
    18. Manta Eduard Mihai & Davidescu Adriana Ana Maria & Geambasu Maria Cristina & Florescu Margareta Stela, 2023. "Exploring the research area of direct taxation. An empirical analysis based on bibliometric analysis results," Management & Marketing, Sciendo, vol. 18(s1), pages 355-383, December.
    19. Wildgaard, Lorna, 2016. "A critical cluster analysis of 44 indicators of author-level performance," Journal of Informetrics, Elsevier, vol. 10(4), pages 1055-1078.
    20. Corrêa Jr., Edilson A. & Silva, Filipi N. & da F. Costa, Luciano & Amancio, Diego R., 2017. "Patterns of authors contribution in scientific manuscripts," Journal of Informetrics, Elsevier, vol. 11(2), pages 498-510.


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.