Why an Unbiased External R&D Evaluation System is Important for the Progress of Social Sciences—the Case of a Small Social Science Community
This article deals with the impact of external R&D evaluations as one of the institutional factors that can encourage (or discourage) the progress of the social sciences. A critical overview is presented of the increasing use of bibliometric indicators in the external R&D evaluation procedures employed by the Slovenian Research Agency, the leading research council financing the public sector of the social sciences in Slovenia. We attempt to establish that, in order to ensure good external R&D evaluation practice for a small social science community, reliable bibliometric meta-databases alone are insufficient. It is argued that it is equally important to formulate precise criteria for ascertaining their validity.