
What Is Quality in Research? Building a Framework of Design, Process and Impact Attributes and Evaluation Perspectives

Author

Listed:
  • Alessandro Margherita (Department of Engineering for Innovation, University of Salento, 73100 Lecce, Italy)
  • Gianluca Elia (Department of Engineering for Innovation, University of Salento, 73100 Lecce, Italy)
  • Claudio Petti (Department of Engineering for Innovation, University of Salento, 73100 Lecce, Italy)

Abstract

The strategic relevance of innovation and scientific research has amplified attention to the definition of quality in research practice. However, despite the proliferation of evaluation metrics and procedures, there is a need to go beyond bibliometric approaches and to identify more explicitly what constitutes good research and what its driving factors or determinants are. This article reviews specialized research policy, science policy and scientometrics literature to extract critical dimensions associated with research quality, as presented in a vast though fragmented body of theory. A literature-derived framework of research quality attributes is thus obtained and then subjected to an expert feedback process involving scholars and practitioners in the fields of research policy and evaluation. The result is a structured taxonomy of 66 quality attributes providing a systemic definition of research quality. The attributes are aggregated into a three-dimensional framework encompassing research design (ex ante), research process (in-process) and research impact (ex post) perspectives. The main value of the study is its proposal of a literature-derived and comprehensive inventory of quality attributes and evaluation perspectives. The findings can support further theoretical developments and research policy discussions on the ultimate drivers of quality and impact of scientific research. The framework can also be useful for designing new research evaluation exercises or procedures based on a multidimensional view of quality.
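
As a rough illustration only: the framework described above is essentially a taxonomy of quality attributes grouped under three evaluation perspectives (ex ante design, in-process execution, ex post impact). The Python sketch below shows one possible way such a structure could be represented in code; the dimension labels follow the abstract, while the attribute names and the grouping helper are hypothetical placeholders, not the article's actual 66 attributes.

from dataclasses import dataclass, field
from typing import Dict, List

# The three evaluation perspectives named in the abstract.
DIMENSIONS = ("research_design",   # ex ante
              "research_process",  # in-process
              "research_impact")   # ex post

@dataclass
class QualityAttribute:
    """A single quality attribute assigned to one evaluation perspective."""
    name: str
    dimension: str  # one of DIMENSIONS

@dataclass
class QualityTaxonomy:
    """Aggregates attributes by dimension (the article defines 66 in total)."""
    attributes: List[QualityAttribute] = field(default_factory=list)

    def by_dimension(self) -> Dict[str, List[str]]:
        grouped: Dict[str, List[str]] = {d: [] for d in DIMENSIONS}
        for attr in self.attributes:
            grouped[attr.dimension].append(attr.name)
        return grouped

# Placeholder attributes for illustration only.
taxonomy = QualityTaxonomy(attributes=[
    QualityAttribute("clarity of the research question", "research_design"),
    QualityAttribute("methodological rigour", "research_process"),
    QualityAttribute("societal relevance of outcomes", "research_impact"),
])
print(taxonomy.by_dimension())

Grouping attributes by perspective in this way mirrors the ex ante / in-process / ex post split and would make it straightforward to derive checklists or scoring sheets for a multidimensional evaluation exercise.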

Suggested Citation

  • Alessandro Margherita & Gianluca Elia & Claudio Petti, 2022. "What Is Quality in Research? Building a Framework of Design, Process and Impact Attributes and Evaluation Perspectives," Sustainability, MDPI, vol. 14(5), pages 1-18, March.
  • Handle: RePEc:gam:jsusta:v:14:y:2022:i:5:p:3034-:d:764528

    Download full text from publisher

    File URL: https://www.mdpi.com/2071-1050/14/5/3034/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2071-1050/14/5/3034/
    Download Restriction: no

    References listed on IDEAS

    1. Bras-Amorós, Maria & Domingo-Ferrer, Josep & Torra, Vicenç, 2011. "A bibliometric index based on the collaboration distance between cited and citing authors," Journal of Informetrics, Elsevier, vol. 5(2), pages 248-264.
    2. Lucien Karpik, 2010. "Valuing the Unique: The Economics of Singularities," Economics Books, Princeton University Press, edition 1, number 9215.
    3. Michael Ochsner & Sven E. Hug & Hans-Dieter Daniel, 2012. "Four types of research in the humanities: Setting the stage for research quality criteria in the humanities," Research Evaluation, Oxford University Press, vol. 22(2), pages 79-92, November.
    4. McNie, Elizabeth C. & Parris, Adam & Sarewitz, Daniel, 2016. "Improving the public value of science: A typology to inform discussion, design and implementation of research," Research Policy, Elsevier, vol. 45(4), pages 884-895.
    5. Lutz Bornmann, 2013. "What is societal impact of research and how can it be assessed? a literature survey," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 64(2), pages 217-233, February.
    6. Henk F Moed, 2007. "The future of research evaluation rests with an intelligent combination of advanced metrics and transparent peer review," Science and Public Policy, Oxford University Press, vol. 34(8), pages 575-583, October.
    7. Claire Donovan, 2007. "The qualitative future of research evaluation," Science and Public Policy, Oxford University Press, vol. 34(8), pages 585-597, October.
    8. Amin, Ash & Roberts, Joanne, 2008. "Knowing in action: Beyond communities of practice," Research Policy, Elsevier, vol. 37(2), pages 353-369, March.
    9. Barend van der Meulen & Arie Rip, 2000. "Evaluation of societal quality of public sector research in the Netherlands," Research Evaluation, Oxford University Press, vol. 9(1), pages 11-25, April.
    10. Rinze Benedictus & Frank Miedema & Mark W. J. Ferguson, 2016. "Fewer numbers, better science," Nature, Nature, vol. 538(7626), pages 453-455, October.
    11. Linda Butler, 2007. "Assessing university research: A plea for a balanced approach," Science and Public Policy, Oxford University Press, vol. 34(8), pages 565-574, October.
    12. Lorna Wildgaard & Jesper W. Schneider & Birger Larsen, 2014. "A review of the characteristics of 108 author-level bibliometric indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(1), pages 125-158, October.
    13. Sven E. Hug & Michael Ochsner & Hans-Dieter Daniel, 2013. "Criteria for assessing research quality in the humanities: a Delphi study among scholars of English literature, German literature and art history," Research Evaluation, Oxford University Press, vol. 22(5), pages 369-383, August.
    14. S. Alonso & F. J. Cabrerizo & E. Herrera-Viedma & F. Herrera, 2010. "hg-index: a new index to characterize the scientific output of researchers based on the h- and g-indices," Scientometrics, Springer;Akadémiai Kiadó, vol. 82(2), pages 391-400, February.
    15. Gabrielle N. Samuel & Gemma E. Derrick, 2015. "Societal impact evaluation: Exploring evaluator perceptions of the characterization of impact under the REF2014," Research Evaluation, Oxford University Press, vol. 24(3), pages 229-241.
    16. Bornmann, Lutz, 2014. "Do altmetrics point to the broader impact of research? An overview of benefits and disadvantages of altmetrics," Journal of Informetrics, Elsevier, vol. 8(4), pages 895-903.
    17. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    18. J Britt Holbrook & Robert Frodeman, 2011. "Peer review and the ex ante assessment of societal impacts," Research Evaluation, Oxford University Press, vol. 20(3), pages 239-246, September.
    19. Mårtensson, Pär & Fors, Uno & Wallin, Sven-Bertil & Zander, Udo & Nilsson, Gunnar H, 2016. "Evaluating research: A multidisciplinary approach to assessing research practice and quality," Research Policy, Elsevier, vol. 45(3), pages 593-603.
    20. Sarah de Rijcke & Paul F. Wouters & Alex D. Rushforth & Thomas P. Franssen & Björn Hammarfelt, 2016. "Evaluation practices and effects of indicator use—a literature review," Research Evaluation, Oxford University Press, vol. 25(2), pages 161-169.
    21. Ad A.M. Prins & Rodrigo Costas & Thed N. van Leeuwen & Paul F. Wouters, 2016. "Using Google Scholar in research evaluation of humanities and social science programs: A comparison with Web of Science data," Research Evaluation, Oxford University Press, vol. 25(3), pages 264-270.
    22. Emanuela Reale & Dragana Avramov & Kubra Canhial & Claire Donovan & Ramon Flecha & Poul Holm & Charles Larkin & Benedetto Lepori & Judith Mosoni-Fried & Esther Oliver & Emilia Primeri & Lidia Puigvert, 2018. "A review of literature on evaluating the scientific, social and political impact of social sciences and humanities research," Research Evaluation, Oxford University Press, vol. 27(4), pages 298-308.
    23. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
    24. Björn Hammarfelt & Alexander D. Rushforth, 2017. "Indicators as judgment devices: An empirical study of citizen bibliometrics in research evaluation," Research Evaluation, Oxford University Press, vol. 26(3), pages 169-180.
    25. Sven Hemlin, 1998. "Utility evaluation of academic research: six basic propositions," Research Evaluation, Oxford University Press, vol. 7(3), pages 159-165, December.
    26. J. Britt Holbrook & Kelli R. Barr & Keith Wayne Brown, 2013. "We need negative metrics too," Nature, Nature, vol. 497(7450), pages 439-439, May.
    27. Diana Hicks & Paul Wouters & Ludo Waltman & Sarah de Rijcke & Ismael Rafols, 2015. "Bibliometrics: The Leiden Manifesto for research metrics," Nature, Nature, vol. 520(7548), pages 429-431, April.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Dag W. Aksnes & Liv Langfeldt & Paul Wouters, 2019. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories," SAGE Open, vol. 9(1), pages 21582440198, February.
    2. Sven Helmer & David B. Blumenthal & Kathrin Paschen, 2020. "What is meaningful research and how should we measure it?," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 153-169, October.
    3. Eugenio Petrovich, 2022. "Bibliometrics in Press. Representations and uses of bibliometric indicators in the Italian daily newspapers," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2195-2233, May.
    4. David A. Pendlebury, 2019. "Charting a path between the simple and the false and the complex and unusable: Review of Henk F. Moed, Applied Evaluative Informetrics [in the series Qualitative and Quantitative Analysis of Scientifi," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(1), pages 549-560, April.
    5. Anne K. Krüger, 2020. "Quantification 2.0? Bibliometric Infrastructures in Academic Evaluation," Politics and Governance, Cogitatio Press, vol. 8(2), pages 58-67.
    6. Andrea Bonaccorsi & Filippo Chiarello & Gualtiero Fantoni, 2021. "Impact for whom? Mapping the users of public research with lexicon-based text mining," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(2), pages 1745-1774, February.
    7. Nadia Simoes & Nuno Crespo, 2020. "A flexible approach for measuring author-level publishing performance," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(1), pages 331-355, January.
    8. Dotti, Nicola Francesco & Walczyk, Julia, 2022. "What is the societal impact of university research? A policy-oriented review to map approaches, identify monitoring methods and success factors," Evaluation and Program Planning, Elsevier, vol. 95(C).
    9. Lutz Bornmann & Julian N. Marewski, 2019. "Heuristics as conceptual lens for understanding and studying the usage of bibliometrics in research evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 419-459, August.
    10. Matteo Pedrini & Valentina Langella & Mario Alberto Battaglia & Paola Zaratin, 2018. "Assessing the health research’s social impact: a systematic review," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(3), pages 1227-1250, March.
    11. Raminta Pranckutė, 2021. "Web of Science (WoS) and Scopus: The Titans of Bibliographic Information in Today’s Academic World," Publications, MDPI, vol. 9(1), pages 1-59, March.
    12. Bornmann, Lutz & Marx, Werner, 2018. "Critical rationalism and the search for standard (field-normalized) indicators in bibliometrics," Journal of Informetrics, Elsevier, vol. 12(3), pages 598-604.
    13. Jorrit P Smit & Laurens K Hessels, 2021. "The production of scientific and societal value in research evaluation: a review of societal impact assessment methods [Systems Thinking, Knowledge and Action: Towards Better Models and Methods]," Research Evaluation, Oxford University Press, vol. 30(3), pages 323-335.
    14. Osterloh, Margit & Frey, Bruno S., 2020. "How to avoid borrowed plumes in academia," Research Policy, Elsevier, vol. 49(1).
    15. Ramón A. Feenstra & Emilio Delgado López-Cózar, 2022. "Philosophers’ appraisals of bibliometric indicators and their use in evaluation: from recognition to knee-jerk rejection," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(4), pages 2085-2103, April.
    16. Zheng Yan & Wenqian Robertson & Yaosheng Lou & Tom W. Robertson & Sung Yong Park, 2021. "Finding leading scholars in mobile phone behavior: a mixed-method analysis of an emerging interdisciplinary field," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(12), pages 9499-9517, December.
    17. Zhenbin Yan & Qiang Wu & Xingchen Li, 2016. "Do Hirsch-type indices behave the same in assessing single publications? An empirical study of 29 bibliometric indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 1815-1833, December.
    18. Lutz Bornmann, 2013. "What is societal impact of research and how can it be assessed? a literature survey," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(2), pages 217-233, February.
    19. Yurij L. Katchanov & Yulia V. Markova, 2017. "The “space of physics journals”: topological structure and the Journal Impact Factor," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(1), pages 313-333, October.
    20. Shahzad, Murtuza & Alhoori, Hamed & Freedman, Reva & Rahman, Shaikh Abdul, 2022. "Quantifying the online long-term interest in research," Journal of Informetrics, Elsevier, vol. 16(2).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jsusta:v:14:y:2022:i:5:p:3034-:d:764528. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.