
Peer review and expert panels as techniques for evaluating the quality of academic research

In: Handbook on the Theory and Practice of Program Evaluation

Author

Listed:
  • Irwin Feller

Abstract

As this volume demonstrates, a wide variety of methodologies exist to evaluate the objectives and outcomes of research and development programs. These include surveys, statistical and econometric estimation, patent analysis, bibliometrics, scientometrics, network analysis, case studies, and historical tracing. The contributors divide these and other methods and applications into four categories – economic, non-economic, hybrid and data-driven – in order to discuss the many factors that affect the utility of each technique and how those factors shape the technological, economic and societal forecasts of the programs in question.

Suggested Citation

  • Irwin Feller, 2013. "Peer review and expert panels as techniques for evaluating the quality of academic research," Chapters, in: Albert N. Link & Nicholas S. Vonortas (ed.), Handbook on the Theory and Practice of Program Evaluation, chapter 5, pages 115-142, Edward Elgar Publishing.
  • Handle: RePEc:elg:eechap:14384_5

    Download full text from publisher

    File URL: https://www.elgaronline.com/view/9780857932396.00011.xml
    Download Restriction: no

    References listed on IDEAS

    1. Lutz Bornmann & Gerlind Wallon & Anna Ledin, 2008. "Does the Committee Peer Review Select the Best Applicants for Funding? An Investigation of the Selection Process for Two European Molecular Biology Organization Programmes," PLOS ONE, Public Library of Science, vol. 3(10), pages 1-11, October.
    2. Irwin Feller, 2006. "Multiple actors, multiple settings, multiple criteria: issues in assessing interdisciplinary research," Research Evaluation, Oxford University Press, vol. 15(1), pages 5-15, April.
    3. Finn Hansson, 2010. "Dialogue in or with the peer review? Evaluating research organizations in order to promote organizational learning," Science and Public Policy, Oxford University Press, vol. 37(4), pages 239-251, May.
    4. Thomas Heinze, 2008. "How to sponsor ground-breaking research: A comparison of funding schemes," Science and Public Policy, Oxford University Press, vol. 35(5), pages 302-318, June.
    5. Smith, Simon & Ward, Vicky & House, Allan, 2011. "‘Impact’ in the proposals for the UK's Research Excellence Framework: Shifting the boundaries of academic autonomy," Research Policy, Elsevier, vol. 40(10), pages 1369-1379.
    6. Grit Laudel, 2006. "Conclave in the Tower of Babel: how peers review interdisciplinary research proposals," Research Evaluation, Oxford University Press, vol. 15(1), pages 57-68, April.
    7. Veronica Boix Mansilla & Irwin Feller & Howard Gardner, 2006. "Quality assessment in interdisciplinary research and education," Research Evaluation, Oxford University Press, vol. 15(1), pages 69-74, April.
    8. Martin Reinhart, 2010. "Peer review practices: a content analysis of external reviews in science funding," Research Evaluation, Oxford University Press, vol. 19(5), pages 317-331, December.
    9. Adam B. Jaffe, 2002. "Building Programme Evaluation into the Design of Public Research-Support Programmes," Oxford Review of Economic Policy, Oxford University Press, vol. 18(1), pages 22-34, Spring.
    10. Paul A. David, 2008. "The Historical Origins of 'Open Science': An Essay on Patronage, Reputation and Common Agency Contracting in the Scientific Revolution," Capitalism and Society, De Gruyter, vol. 3(2), pages 1-106, October.
    11. William Bonvillian & Richard Van Atta, 2011. "ARPA-E and DARPA: Applying the DARPA model to energy innovation," The Journal of Technology Transfer, Springer, vol. 36(5), pages 469-513, October.
    12. J Britt Holbrook & Robert Frodeman, 2011. "Peer review and the ex ante assessment of societal impacts," Research Evaluation, Oxford University Press, vol. 20(3), pages 239-246, September.
    13. Bornmann, Lutz & Mutz, Rüdiger & Daniel, Hans-Dieter, 2007. "Gender differences in grant peer review: A meta-analysis," Journal of Informetrics, Elsevier, vol. 1(3), pages 226-238.
    14. Peter van den Besselaar & Loet Leydesdorff, 2009. "Past performance, peer review and project selection: a case study in the social and behavioral sciences," Research Evaluation, Oxford University Press, vol. 18(4), pages 273-288, October.
    15. Ben R Martin, 2011. "The Research Excellence Framework and the ‘impact agenda’: are we creating a Frankenstein monster?," Research Evaluation, Oxford University Press, vol. 20(3), pages 247-254, September.
    16. Harvey Goldstein & David J. Spiegelhalter, 1996. "League Tables and Their Limitations: Statistical Issues in Comparisons of Institutional Performance," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 159(3), pages 385-409, May.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Jonathan Linton & Nicholas Vonortas, 2015. "From Research Project to Research Portfolio: Meeting Scale and Complexity," Foresight and STI Governance (Foresight-Russia till No. 3/2015), National Research University Higher School of Economics, vol. 9(2), pages 38-43.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Froese, Anna & Woiwode, Hendrik & Suckow, Silvio, 2019. "Mission Impossible? Neue Wege zu Interdisziplinarität: Empfehlungen für Wissenschaft, Wissenschaftspolitik und Praxis," Discussion Papers, Research Group Science Policy Studies SP III 2019-601, WZB Berlin Social Science Center.
    2. Kok, Holmer & Faems, Dries & de Faria, Pedro, 2022. "Pork Barrel or Barrel of Gold? Examining the performance implications of earmarking in public R&D grants," Research Policy, Elsevier, vol. 51(7).
    3. Matteo Pedrini & Valentina Langella & Mario Alberto Battaglia & Paola Zaratin, 2018. "Assessing the health research’s social impact: a systematic review," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(3), pages 1227-1250, March.
    4. Lutz Bornmann & Werner Marx, 2014. "How should the societal impact of research be generated and measured? A proposal for a simple and practicable approach to allow interdisciplinary comparisons," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(1), pages 211-219, January.
    5. Degl’Innocenti, Marta & Matousek, Roman & Tzeremes, Nickolaos G., 2019. "The interconnections of academic research and universities’ “third mission”: Evidence from the UK," Research Policy, Elsevier, vol. 48(9), pages 1-1.
    6. Kevin W. Boyack & Caleb Smith & Richard Klavans, 2018. "Toward predicting research proposal success," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(2), pages 449-461, February.
    7. Bornmann, Lutz & Leydesdorff, Loet & Van den Besselaar, Peter, 2010. "A meta-evaluation of scientific research proposals: Different ways of comparing rejected to awarded applications," Journal of Informetrics, Elsevier, vol. 4(3), pages 211-220.
    8. Michael C. Calver & Maggie Lilith & Christopher R. Dickman, 2013. "A ‘perverse incentive’ from bibliometrics: could National Research Assessment Exercises (NRAEs) restrict literature availability for nature conservation?," Scientometrics, Springer;Akadémiai Kiadó, vol. 95(1), pages 243-255, April.
    9. Richard McManus & Karen Mumford & Cristina Sechel, 2017. "The Selection of Economics Lecturers into the 2014 UK Research Excellence Framework Exercise: Outputs and Gender," Discussion Papers 17/16, Department of Economics, University of York.
    10. Sandström, Ulf & Van den Besselaar, Peter, 2018. "Funding, evaluation, and the performance of national research systems," Journal of Informetrics, Elsevier, vol. 12(1), pages 365-384.
    11. Irwin Feller, 2022. "Assessing the societal impact of publicly funded research," The Journal of Technology Transfer, Springer, vol. 47(3), pages 632-650, June.
    12. Marco Seeber & Jef Vlegels & Mattia Cattaneo, 2022. "Conditions that do or do not disadvantage interdisciplinary research proposals in project evaluation," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 73(8), pages 1106-1126, August.
    13. Andrea Bonaccorsi & Filippo Chiarello & Gualtiero Fantoni, 2021. "Impact for whom? Mapping the users of public research with lexicon-based text mining," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(2), pages 1745-1774, February.
    14. Richard McManus & Karen Mumford & Cristina Sechel, 2022. "Measuring research excellence amongst economics lecturers in the UK," Bulletin of Economic Research, Wiley Blackwell, vol. 74(2), pages 386-404, April.
    15. van den Besselaar, Peter & Sandström, Ulf, 2015. "Early career grants, performance, and careers: A study on predictive validity of grant decisions," Journal of Informetrics, Elsevier, vol. 9(4), pages 826-838.
    16. Lutz Bornmann, 2013. "What is societal impact of research and how can it be assessed? a literature survey," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(2), pages 217-233, February.
    17. Salter, Ammon & Salandra, Rossella & Walker, James, 2017. "Exploring preferences for impact versus publications among UK business and management academics," Research Policy, Elsevier, vol. 46(10), pages 1769-1782.
    18. Benda, Wim G.G. & Engels, Tim C.E., 2011. "The predictive validity of peer review: A selective review of the judgmental forecasting qualities of peers, and implications for innovation in science," International Journal of Forecasting, Elsevier, vol. 27(1), pages 166-182.
    19. Andrea Bonaccorsi & Nicola Melluso & Francesco Alessandro Massucci, 2022. "Exploring the antecedents of interdisciplinarity at the European Research Council: a topic modeling approach," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(12), pages 6961-6991, December.
    20. Carole J. Lee & Cassidy R. Sugimoto & Guo Zhang & Blaise Cronin, 2013. "Bias in peer review," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(1), pages 2-17, January.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:elg:eechap:14384_5. See general information about how to correct material in RePEc.


    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Darrel McCalla (email available below). General contact details of provider: http://www.e-elgar.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis. RePEc uses bibliographic data supplied by the respective publishers.