Printed from https://ideas.repec.org/a/gam/jpubli/v11y2023i2p33-d1162628.html

The Evaluation Gap in Astronomy—Explained through a Rational Choice Framework

Author

Listed:
  • Julia Heuritsch

    (Research Group “Reflexive Metrics”, Institut für Sozialwissenschaften, Humboldt-Universität zu Berlin, Universitätsstraße 3b, 10117 Berlin, Germany)

Abstract

The concept of evaluation gaps captures potential discrepancies between what researchers value about their research, in particular research quality, and what metrics measure. The existence of evaluation gaps can give rise to questions about the relationship between intrinsic and extrinsic motivations to perform research, i.e., how field-specific notions of quality compete with notions captured via evaluation metrics, and consequently how researchers manage the balancing act between intrinsic values and the requirements of evaluation procedures. This study analyses the evaluation gap from a rational choice point of view for the case of observational astronomers, based on a literature review and 19 semi-structured interviews with international astronomers. On the basis of the institutional norms and capital at play in academic astronomy, I shed light on the workings of the balancing act and its consequences for research quality in astronomy. I find that astronomers experience an anomie: they want to follow their intrinsic motivation to pursue science in order to push knowledge forward, while at the same time following their extrinsic motivation to comply with institutional norms. The balancing act is the art of serving performance indicators in order to stay in academia while compromising research quality as little as possible. Gaming strategies are meant to give the appearance of compliance, while institutionalised means of achieving a good bibliometric record are used in innovative ways, such as salami slicing or going for easy publications. This leads to an overall decrease in research quality.

Suggested Citation

  • Julia Heuritsch, 2023. "The Evaluation Gap in Astronomy—Explained through a Rational Choice Framework," Publications, MDPI, vol. 11(2), pages 1-26, June.
  • Handle: RePEc:gam:jpubli:v:11:y:2023:i:2:p:33-:d:1162628

    Download full text from publisher

    File URL: https://www.mdpi.com/2304-6775/11/2/33/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2304-6775/11/2/33/
    Download Restriction: no
    ---><---

    References listed on IDEAS

    1. Laudel, Grit & Gläser, Jochen, 2014. "Beyond breakthrough research: Epistemic properties of research and their consequences for research funding," Research Policy, Elsevier, vol. 43(7), pages 1204-1216.
    2. Blaise Cronin, 2001. "Hyperauthorship: A postmodern perversion or evidence of a structural shift in scholarly communication practices?," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 52(7), pages 558-569.
    3. Peter Dahler-Larsen, 2014. "Constitutive Effects of Performance Indicators: Getting beyond unintended consequences," Public Management Review, Taylor & Francis Journals, vol. 16(7), pages 969-986, October.
    4. Stefan Thurner & Wenyuan Liu & Peter Klimek & Siew Ann Cheong, 2020. "The role of mainstreamness and interdisciplinarity for the relevance of scientific papers," PLOS ONE, Public Library of Science, vol. 15(4), pages 1-14, April.
    5. Michael J. Kurtz & Edwin A. Henneken, 2017. "Measuring metrics - a 40-year longitudinal cross-validation of citations, downloads, and peer review in astrophysics," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 68(3), pages 695-708, March.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Julia Heuritsch, 2021. "Reflexive Behaviour: How Publication Pressure Affects Research Quality in Astronomy," Publications, MDPI, vol. 9(4), pages 1-23, November.
    2. Alberto Baccini & Giuseppe De Nicolao & Eugenio Petrovich, 2019. "Citation gaming induced by bibliometric evaluation: A country-level comparative analysis," PLOS ONE, Public Library of Science, vol. 14(9), pages 1-16, September.
    3. Nadine Desrochers & Adèle Paul‐Hus & Jen Pecoskie, 2017. "Five decades of gratitude: A meta‐synthesis of acknowledgments research," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 68(12), pages 2821-2833, December.
    4. Olle Persson & Wolfgang Glänzel, 2014. "Discouraging honorific authorship," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(2), pages 1417-1419, February.
    5. Jo Royle & Louisa Coles & Dorothy Williams & Paul Evans, 2007. "Publishing in international journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 71(1), pages 59-86, April.
    6. Franceschini, Fiorenzo & Maisano, Domenico, 2011. "Structured evaluation of the scientific output of academic research groups by recent h-based indicators," Journal of Informetrics, Elsevier, vol. 5(1), pages 64-74.
    7. Aniruddha Maiti & Sai Shi & Slobodan Vucetic, 2023. "An ablation study on the use of publication venue quality to rank computer science departments," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(8), pages 4197-4218, August.
    8. Chung-Souk Han, 2011. "On the demographical changes of U.S. research doctorate awardees and corresponding trends in research fields," Scientometrics, Springer;Akadémiai Kiadó, vol. 89(3), pages 845-865, December.
    9. Giliberto Capano & Benedetto Lepori, 2024. "Designing policies that could work: understanding the interaction between policy design spaces and organizational responses in public sector," Policy Sciences, Springer;Society of Policy Sciences, vol. 57(1), pages 53-82, March.
    10. Perianes-Rodriguez, Antonio & Waltman, Ludo & van Eck, Nees Jan, 2016. "Constructing bibliometric networks: A comparison between full and fractional counting," Journal of Informetrics, Elsevier, vol. 10(4), pages 1178-1195.
    11. Conor O’Kane & Jing A. Zhang & Jarrod Haar & James A. Cunningham, 2023. "How scientists interpret and address funding criteria: value creation and undesirable side effects," Small Business Economics, Springer, vol. 61(2), pages 799-826, August.
    12. Arsev U. Aydinoglu & Suzie Allard & Chad Mitchell, 2016. "Measuring diversity in disciplinary collaboration in research teams: An ecological perspective," Research Evaluation, Oxford University Press, vol. 25(1), pages 18-36.
    13. Fiorenzo Franceschini & Domenico Maisano & Luca Mastrogiacomo, 2014. "The citer-success-index: a citer-based indicator to select a subset of elite papers," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(2), pages 963-983, November.
    14. Guillaume Cabanac, 2012. "Shaping the landscape of research in information systems from the perspective of editorial boards: A scientometric study of 77 leading journals," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(5), pages 977-996, May.
    15. Nils T. Hagen, 2010. "Deconstructing doctoral dissertations: how many papers does it take to make a PhD?," Scientometrics, Springer;Akadémiai Kiadó, vol. 85(2), pages 567-579, November.
    16. Lucy Amez, 2012. "Citation measures at the micro level: Influence of publication age, field, and uncitedness," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(7), pages 1459-1465, July.
    17. Bar-Ilan, Judit, 2008. "Informetrics at the beginning of the 21st century—A review," Journal of Informetrics, Elsevier, vol. 2(1), pages 1-52.
    18. Wallace, Matthew L. & Ràfols, Ismael, 2018. "Institutional shaping of research priorities: A case study on avian influenza," Research Policy, Elsevier, vol. 47(10), pages 1975-1989.
    19. Dorte Henriksen, 2016. "The rise in co-authorship in the social sciences (1980–2013)," Scientometrics, Springer;Akadémiai Kiadó, vol. 107(2), pages 455-476, May.
    20. Yongjun Zhu & Lihong Quan & Pei‐Ying Chen & Meen Chul Kim & Chao Che, 2023. "Predicting coauthorship using bibliographic network embedding," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(4), pages 388-401, April.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jpubli:v:11:y:2023:i:2:p:33-:d:1162628. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.