
How Should Peer-Review Panels Behave?

Author

  • Sgroi, Daniel (Department of Economics, University of Warwick)
  • Oswald, Andrew J. (Department of Economics and CAGE Centre, University of Warwick, and IZA Institute, Bonn)

Abstract

Many governments wish to assess the quality of their universities. A prominent example is the UK’s new Research Excellence Framework (REF) 2014. In the REF, peer-review panels will be provided with information on publications and citations. This paper suggests a way in which panels could choose the weights to attach to these two indicators. The analysis draws in an intuitive way on the concept of Bayesian updating (where citations gradually reveal information about the initially imperfectly-observed importance of the research). Our study should not be interpreted as the argument that only mechanistic measures ought to be used in a REF.

JEL classification: I23; C11; O30
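
To make the abstract's mechanism concrete, here is a minimal sketch, assuming a textbook normal-normal updating rule: journal-based information acts as a prior on an article's unobserved importance, and the citation record acts as a noisy signal whose precision improves as time passes. This is an illustration only, not the model developed in the paper; every function name and number in it is hypothetical.

# Minimal sketch (not the paper's model): normal-normal Bayesian updating in
# which a journal-based prior on an article's importance is combined with a
# noisy citation-based signal. All names and numbers are hypothetical.

def posterior_weight_on_citations(prior_var: float, signal_var: float) -> float:
    """Weight w placed on the citation signal in the posterior mean.

    With prior theta ~ N(mu0, prior_var) and signal y ~ N(theta, signal_var),
    the posterior mean is (1 - w) * mu0 + w * y,
    where w = prior_var / (prior_var + signal_var).
    """
    return prior_var / (prior_var + signal_var)


def updated_importance(journal_prior_mean: float, prior_var: float,
                       citation_signal: float, signal_var: float) -> float:
    """Posterior mean of the article's importance after seeing citations."""
    w = posterior_weight_on_citations(prior_var, signal_var)
    return (1.0 - w) * journal_prior_mean + w * citation_signal


if __name__ == "__main__":
    # Hypothetical example: the journal-based prior suggests importance 3.0
    # (on a 0-4 scale) with variance 1.0, while the citation record points
    # to 2.0. Early on, citations are noisy (large signal_var), so the
    # journal prior dominates; as the signal variance falls over time, the
    # weight on citations rises and the citation evidence takes over.
    for years, signal_var in [(1, 4.0), (3, 1.0), (6, 0.25)]:
        w = posterior_weight_on_citations(prior_var=1.0, signal_var=signal_var)
        est = updated_importance(3.0, 1.0, 2.0, signal_var)
        print(f"after {years} years: weight on citations = {w:.2f}, "
              f"posterior importance = {est:.2f}")

Under this toy rule, the weight a panel places on citations relative to the journal-based prior is just the ratio of prior variance to total variance, so the citation weight rises for older papers and for fields in which citations accrue quickly.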

Suggested Citation

  • Sgroi, Daniel & Oswald, Andrew J., 2012. "How Should Peer-Review Panels Behave?," The Warwick Economics Research Paper Series (TWERPS) 999, University of Warwick, Department of Economics.
  • Handle: RePEc:wrk:warwec:999

    Download full text from publisher

    File URL: https://warwick.ac.uk/fac/soc/economics/research/workingpapers/2012/twerp_999.pdf
    Download Restriction: no


    References listed on IDEAS

    1. James R. Lothian & Mark P. Taylor, 2008. "Real Exchange Rates Over the Past Two Centuries: How Important is the Harrod‐Balassa‐Samuelson Effect?," Economic Journal, Royal Economic Society, vol. 118(532), pages 1742-1763, October.
    2. Sofronis Clerides & Panos Pashardes & Alexandros Polycarpou, 2006. "RAE Ratings and Research Quality: The Case of Economics Departments," University of Cyprus Working Papers in Economics 7-2006, University of Cyprus Department of Economics.
    3. Emmanuel Farhi & Josh Lerner & Jean Tirole, 2005. "Certifying New Technologies," Journal of the European Economic Association, MIT Press, vol. 3(2-3), pages 734-744, 04/05.
    4. Gill, David & Sgroi, Daniel, 2008. "Sequential decisions with tests," Games and Economic Behavior, Elsevier, vol. 63(2), pages 663-678, July.
    5. Goodall, Amanda H., 2009. "Highly cited leaders and the performance of research universities," Research Policy, Elsevier, vol. 38(7), pages 1079-1092, September.
    6. Frederic S. Lee, 2007. "The Research Assessment Exercise, the state and the dominance of mainstream economics in British universities," Cambridge Journal of Economics, Cambridge Political Economy Society, vol. 31(2), pages 309-325, March.
    7. Demange, Gabrielle, 2010. "Sharing information in Web communities," Games and Economic Behavior, Elsevier, vol. 68(2), pages 580-601, March.
    8. Jane Broadbent, 2010. "The UK Research Assessment Exercise: Performance Measurement and Resource Allocation," Australian Accounting Review, CPA Australia, vol. 20(1), pages 14-23, March.
    9. Andrew J. Oswald, 2007. "An Examination of the Reliability of Prestigious Scholarly Journals: Evidence and Implications for Decision‐Makers," Economica, London School of Economics and Political Science, vol. 74(293), pages 21-31, February.
    10. Gill, David & Sgroi, Daniel, 2012. "The optimal choice of pre-launch reviewer," Journal of Economic Theory, Elsevier, vol. 147(3), pages 1247-1260.
    11. Curtis R. Taylor, 1999. "Time-on-the-Market as a Sign of Quality," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 66(3), pages 555-578.
    12. Benjamin Chiao & Josh Lerner & Jean Tirole, 2007. "The rules of standard-setting organizations: an empirical analysis," RAND Journal of Economics, RAND Corporation, vol. 38(4), pages 905-930, December.
    13. Andrew J. Oswald, 2010. "A suggested method for the measurement of world-leading research (illustrated with data on economics)," Scientometrics, Springer;Akadémiai Kiadó, vol. 84(1), pages 99-113, July.
    14. Osterloh, Margit & Frey, Bruno S., 2009. "Are more and better indicators the solution?," Scandinavian Journal of Management, Elsevier, vol. 25(2), pages 225-227, June.
    15. Paul Dolan & Daniel Kahneman, 2008. "Interpretations Of Utility And Their Implications For The Valuation Of Health," Economic Journal, Royal Economic Society, vol. 118(525), pages 215-234, January.
    16. Paul Collier & Anke Hoeffler, 2004. "Greed and grievance in civil war," Oxford Economic Papers, Oxford University Press, vol. 56(4), pages 563-595, October.
    17. William H. Starbuck, 2005. "How Much Better Are the Most-Prestigious Journals? The Statistics of Academic Publication," Organization Science, INFORMS, vol. 16(2), pages 180-200, April.
    18. Sgroi, Daniel, 2002. "Optimizing Information in the Herd: Guinea Pigs, Profits, and Welfare," Games and Economic Behavior, Elsevier, vol. 39(1), pages 137-166, April.
    19. Franceschet, Massimo & Costantini, Antonio, 2011. "The first Italian research assessment exercise: A bibliometric perspective," Journal of Informetrics, Elsevier, vol. 5(2), pages 275-291.
    20. Lutz Bornmann & Hans-Dieter Daniel, 2005. "Does the h-index for ranking of scientists really work?," Scientometrics, Springer;Akadémiai Kiadó, vol. 65(3), pages 391-392, December.
    21. Daniel S. Hamermesh & Gerard A. Pfann, 2012. "Reputation And Earnings: The Roles Of Quality And Quantity In Academe," Economic Inquiry, Western Economic Association International, vol. 50(1), pages 1-16, January.

    Citations

    Citations are extracted by the CitEc project.


    Cited by:

    1. David L. Anderson & John Tressler, 2017. "Researcher rank stability across alternative output measurement schemes in the context of a time limited research evaluation: the New Zealand case," Applied Economics, Taylor & Francis Journals, vol. 49(45), pages 4542-4553, September.
    2. Bertocchi, Graziella & Gambardella, Alfonso & Jappelli, Tullio & Nappi, Carmela A. & Peracchi, Franco, 2015. "Bibliometric evaluation vs. informed peer review: Evidence from Italy," Research Policy, Elsevier, vol. 44(2), pages 451-466.
    3. Stephan B. Bruns & David I. Stern, 2016. "Research assessment using early citation information," Scientometrics, Springer;Akadémiai Kiadó, vol. 108(2), pages 917-935, August.
    4. David L. Anderson & John Tressler, 2016. "Citation-Capture Rates for Economics Journals: Do they Differ from Other Disciplines and Does it Matter?," Economic Papers, The Economic Society of Australia, vol. 35(1), pages 73-85, March.
    5. M. Ryan Haley, 2020. "Combining the weighted and unweighted Euclidean indices: a graphical approach," Scientometrics, Springer;Akadémiai Kiadó, vol. 123(1), pages 103-111, April.
    6. Drivas, Kyriakos & Kremmydas, Dimitris, 2020. "The Matthew effect of a journal's ranking," Research Policy, Elsevier, vol. 49(4).
    7. Gianni De Fraja & Giovanni Facchini & John Gathergood, 2016. "How Much Is That Star in the Window? Professorial Salaries and Research Performance in UK Universities," Discussion Papers 2016-13, University of Nottingham, GEP.
    8. Haley, M. Ryan & McGee, M. Kevin, 2020. "Jointly valuing journal visibility and author citation count: An axiomatic approach," Journal of Informetrics, Elsevier, vol. 14(1).
    9. David I Stern, 2014. "High-Ranked Social Science Journal Articles Can Be Identified from Early Citation Information," PLOS ONE, Public Library of Science, vol. 9(11), pages 1-11, November.
    10. David L. Anderson & John Tressler, 2014. "Citation-Capture Rates by Economic Journals: Do they Differ from Other Disciplines and Does it Matter?," Working Papers in Economics 14/10, University of Waikato.
    11. John Gibson & David L. Anderson & John Tressler, 2017. "Citations Or Journal Quality: Which Is Rewarded More In The Academic Labor Market?," Economic Inquiry, Western Economic Association International, vol. 55(4), pages 1945-1965, October.
    12. repec:esx:essedp:757 is not listed on IDEAS
    13. Vasilios D. Kosteas, 2018. "Predicting long-run citation counts for articles in top economics journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(3), pages 1395-1412, June.
    14. M. Ryan Haley & M. Kevin McGee, 2023. "A flexible functional method for jointly valuing journal visibility and author citation count," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(6), pages 3337-3346, June.
    15. Oswald, Andrew J., 2015. "The Objective Measurement of World-Leading Research," IZA Discussion Papers 8829, Institute of Labor Economics (IZA).
    16. Régibeau, P & Rockett, K, 2014. "A Tale of Two Metrics: Research Assessment vs Recognised Excellence," Economics Discussion Papers 14461, University of Essex, Department of Economics.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one; a toy sketch of this kind of overlap score follows the list.
    1. Gill, David & Sgroi, Daniel, 2008. "The Optimal Choice of Pre-launch Reviewer: How Best to Transmit Information using Tests and Conditional Pricing," The Warwick Economics Research Paper Series (TWERPS) 877, University of Warwick, Department of Economics.
    2. Gill, David & Sgroi, Daniel, 2012. "The optimal choice of pre-launch reviewer," Journal of Economic Theory, Elsevier, vol. 147(3), pages 1247-1260.
    3. Gill, David & Sgroi, Daniel, 2008. "Sequential decisions with tests," Games and Economic Behavior, Elsevier, vol. 63(2), pages 663-678, July.
    4. Oswald, Andrew J., 2015. "The Objective Measurement of World-Leading Research," IZA Discussion Papers 8829, Institute of Labor Economics (IZA).
    5. J. Atsu Amegashie, 2020. "Citations And Incentives In Academic Contests," Economic Inquiry, Western Economic Association International, vol. 58(3), pages 1233-1244, July.
    6. Margit Osterloh & Bruno S. Frey, 2010. "Academic rankings and research governance," IEW - Working Papers 482, Institute for Empirical Research in Economics - University of Zurich.
    7. Margit Osterloh & Bruno S. Frey, 2009. "Research Governance in Academia: Are there Alternatives to Academic Rankings?," CREMA Working Paper Series 2009-17, Center for Research in Economics, Management and the Arts (CREMA).
    8. Nicolás Figueroa & Carla Guadalupi, 2017. "Convincing early adopters: Price signals and Information transmission," Documentos de Trabajo 486, Instituto de Economia. Pontificia Universidad Católica de Chile.
    9. Ting Liu & Pasquale Schiraldi, 2012. "New product launch: herd seeking or herd preventing?," Economic Theory, Springer;Society for the Advancement of Economic Theory (SAET), vol. 51(3), pages 627-648, November.
    10. Osterloh, Margit & Frey, Bruno S., 2020. "How to avoid borrowed plumes in academia," Research Policy, Elsevier, vol. 49(1).
    11. Subir Bose & Gerhard Orosel & Marco Ottaviani & Lise Vesterlund, 2008. "Monopoly pricing in the binary herding model," Economic Theory, Springer;Society for the Advancement of Economic Theory (SAET), vol. 37(2), pages 203-241, November.
    12. David L. Anderson & John Tressler, 2013. "The Relevance of the “h-” and “g-” Index to Economics in the Context of A Nation-Wide Research Evaluation Scheme: The New Zealand Case," Economic Papers, The Economic Society of Australia, vol. 32(1), pages 81-94, March.
    13. Rafols, Ismael & Leydesdorff, Loet & O’Hare, Alice & Nightingale, Paul & Stirling, Andy, 2012. "How journal rankings can suppress interdisciplinary research: A comparison between Innovation Studies and Business & Management," Research Policy, Elsevier, vol. 41(7), pages 1262-1282.
    14. Rebora, Gianfranco & Turri, Matteo, 2013. "The UK and Italian research assessment exercises face to face," Research Policy, Elsevier, vol. 42(9), pages 1657-1666.
    15. John Tressler & David L. Anderson, 2012. "Citations as a Measure of the Research Outputs of New Zealand's Economics Departments: The Problem of 'Long and Variable Lags'," Agenda - A Journal of Policy Analysis and Reform, Australian National University, College of Business and Economics, School of Economics, vol. 19(1), pages 17-40.
    16. Hicks, Diana, 2012. "Performance-based university research funding systems," Research Policy, Elsevier, vol. 41(2), pages 251-261.
    17. Thelwall, Mike & Fairclough, Ruth, 2015. "The influence of time and discipline on the magnitude of correlations between citation counts and quality scores," Journal of Informetrics, Elsevier, vol. 9(3), pages 529-541.
    18. Daniel S. Hamermesh, 2018. "Citations in Economics: Measurement, Uses, and Impacts," Journal of Economic Literature, American Economic Association, vol. 56(1), pages 115-156, March.
    19. Lutz Bornmann & Robin Haunschild, 2017. "Does evaluative scientometrics lose its main focus on scientific quality by the new orientation towards societal impact?," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(2), pages 937-943, February.
    20. Goodall, Amanda H. & McDowell, John M. & Singell, Larry D., 2014. "Leadership and the Research Productivity of University Departments," IZA Discussion Papers 7903, Institute of Labor Economics (IZA).
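
As a rough illustration of the overlap criterion mentioned above (RePEc's actual matching procedure is not documented here), the sketch below scores candidate items by how many references and how many citing works they share with the target item. All function and variable names are hypothetical.

# Minimal sketch, assuming a simple bibliographic-coupling / co-citation
# score; this is an illustration only, not RePEc's actual method.
from typing import Dict, List, Set


def relatedness(references_a: Set[str], cited_by_a: Set[str],
                references_b: Set[str], cited_by_b: Set[str]) -> int:
    """Count works the two items both cite plus works that cite both of them."""
    shared_references = references_a & references_b   # bibliographic coupling
    shared_citers = cited_by_a & cited_by_b           # co-citation
    return len(shared_references) + len(shared_citers)


def most_related(target: str,
                 references: Dict[str, Set[str]],
                 cited_by: Dict[str, Set[str]],
                 top_n: int = 20) -> List[str]:
    """Rank all other items by their relatedness score against the target."""
    scores = {
        item: relatedness(references.get(target, set()), cited_by.get(target, set()),
                          refs, cited_by.get(item, set()))
        for item, refs in references.items()
        if item != target
    }
    return sorted(scores, key=scores.get, reverse=True)[:top_n]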

    More about this item

    Keywords

    University evaluation; RAE Research Assessment Exercise 2008; citations; bibliometrics; REF 2014 (Research Excellence Framework); Bayesian methods.

    JEL classification:

    • I23 - Health, Education, and Welfare - - Education - - - Higher Education; Research Institutions
    • C11 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Bayesian Analysis: General
    • O30 - Economic Development, Innovation, Technological Change, and Growth - - Innovation; Research and Development; Technological Change; Intellectual Property Rights - - - General

