Printed from https://ideas.repec.org/a/oup/rseval/v9y2000i2p125-132.html

Quantitative and qualitative methods and measures in the evaluation of research

Author

Listed:
  • David Roessner

Abstract

This paper proposes that the choice of quantitative versus qualitative measures in research evaluation is a false one, especially for evaluators isolated from the real world. This choice, sometimes substantially client-driven, should be tempered by professional judgment. It is sometimes easier to develop quantitative ‘indicators’ of performance than to work out what the program has to accomplish. As a result, legislators and others increasingly ask public agencies for quantitative measures of research performance, and in so doing generate all kinds of mischief. Unfortunately, the fallacy of misplaced concreteness in research evaluation is still alive, if not necessarily well. Copyright Beech Tree Publishing.

Suggested Citation

  • David Roessner, 2000. "Quantitative and qualitative methods and measures in the evaluation of research," Research Evaluation, Oxford University Press, vol. 9(2), pages 125-132, August.
  • Handle: RePEc:oup:rseval:v:9:y:2000:i:2:p:125-132

    Download full text from publisher

    File URL: http://hdl.handle.net/10.3152/147154400781777296
    Download Restriction: Access to full text is restricted to subscribers.

    As the access to this document is restricted, you may want to search for a different version of it.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Sarah Seus & Susanne Bührer, 2022. "The evaluation of the Austrian START programme: an impact analysis of a research funding programme using a multi-method approach," The Journal of Technology Transfer, Springer, vol. 47(3), pages 673-698, June.
    2. Denis O. Gray & Harm-Jan Steenhuis, 2003. "Quantifying the benefits of participating in an industry university research center: An examination of research cost avoidance," Scientometrics, Springer;Akadémiai Kiadó, vol. 58(2), pages 281-300, October.
    3. Sebastian Grauwin & Pablo Jensen, 2011. "Mapping scientific institutions," Scientometrics, Springer;Akadémiai Kiadó, vol. 89(3), pages 943-954, December.
    4. Matteo Pedrini & Valentina Langella & Mario Alberto Battaglia & Paola Zaratin, 2018. "Assessing the health research’s social impact: a systematic review," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(3), pages 1227-1250, March.
    5. Mathew Manimala & K. Thomas, 2013. "Learning Needs of Technology Transfer: Coping with Discontinuities and Disruptions," Journal of the Knowledge Economy, Springer;Portland International Center for Management of Engineering and Technology (PICMET), vol. 4(4), pages 511-539, December.
    6. Rafols, Ismael & Leydesdorff, Loet & O’Hare, Alice & Nightingale, Paul & Stirling, Andy, 2012. "How journal rankings can suppress interdisciplinary research: A comparison between Innovation Studies and Business & Management," Research Policy, Elsevier, vol. 41(7), pages 1262-1282.
    7. Fu, Junying & Frietsch, Rainer & Tagscherer, Ulrike, 2013. "Publication activity in the Science Citation Index Expanded (SCIE) database in the context of Chinese science and technology policy from 1977 to 2012," Discussion Papers "Innovation Systems and Policy Analysis" 35, Fraunhofer Institute for Systems and Innovation Research (ISI).
    8. Martin Ricker, 2015. "A numerical algorithm with preference statements to evaluate the performance of scientists," Scientometrics, Springer;Akadémiai Kiadó, vol. 103(1), pages 191-212, April.
    9. Peter A. Schulz & Edmilson J. T. Manganote, 2012. "Revisiting country research profiles: learning about the scientific cultures," Scientometrics, Springer;Akadémiai Kiadó, vol. 93(2), pages 517-531, November.
    10. Ismael Rafols & Alan Porter & Loet Leydesdorff, 2009. "Overlay Maps of Science: a New Tool for Research Policy," SPRU Working Paper Series 179, SPRU - Science Policy Research Unit, University of Sussex Business School.
    11. Rafols, Ismael & Stirling, Andy, 2020. "Designing indicators for opening up evaluation. Insights from research assessment," SocArXiv h2fxp, Center for Open Science.
    12. Bar-Ilan, Judit, 2008. "Informetrics at the beginning of the 21st century—A review," Journal of Informetrics, Elsevier, vol. 2(1), pages 1-52.

    More about this item


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:oup:rseval:v:9:y:2000:i:2:p:125-132. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Oxford University Press (email available below). General contact details of provider: https://academic.oup.com/rev.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.