IDEAS home Printed from https://ideas.repec.org/a/plo/pone00/0283980.html

Cite-seeing and reviewing: A study on citation bias in peer review

Author

Listed:
  • Ivan Stelmakh
  • Charvi Rastogi
  • Ryan Liu
  • Shuchi Chawla
  • Federico Echenique
  • Nihar B Shah

Abstract

Citations play an important role in researchers’ careers as a key factor in the evaluation of scientific impact. Anecdotal advice urges authors to exploit this fact and cite prospective reviewers in an attempt to obtain a more positive evaluation of their submission. In this work, we investigate whether such a citation bias actually exists: does the citation of a reviewer’s own work in a submission cause them to be positively biased towards the submission? In conjunction with the review process of two flagship conferences in machine learning and algorithmic economics, we execute an observational study to test for citation bias in peer review. In our analysis, we carefully account for various confounding factors such as paper quality and reviewer expertise, and apply different modeling techniques to alleviate concerns regarding model mismatch. Overall, our analysis involves 1,314 papers and 1,717 reviewers and detects citation bias in both venues we consider. In terms of the effect size, by citing a reviewer’s work, a submission has a non-trivial chance of getting a higher score from that reviewer: the expected increase in the score is approximately 0.23 on a 5-point Likert item. For reference, a one-point increase of a score by a single reviewer improves the position of a submission by 11% on average.
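To give a feel for the reported effect size, the following toy Monte Carlo simulation (not the authors' model; the number of reviewers per paper, the uniform score distribution, and ranking by mean score are all illustrative assumptions) sketches how a one-point score increase from a single reviewer can shift a paper's relative position in a ranking:

```python
import random
import statistics

# Toy setup: each paper receives scores from 3 reviewers on a 5-point
# scale, and papers are ranked by their mean score. We then measure how
# far a paper moves, on average, when one reviewer raises their score
# by a single point.
random.seed(0)

def fraction_below(pool, score):
    # Fraction of papers in the pool ranked strictly below a given mean score.
    return sum(s < score for s in pool) / len(pool)

n_papers, n_reviewers = 1314, 3  # paper count taken from the abstract
means = [statistics.mean(random.choices([1, 2, 3, 4, 5], k=n_reviewers))
         for _ in range(n_papers)]

shifts = []
for m in means:
    bumped = m + 1 / n_reviewers  # one reviewer's score goes up by 1 point
    shifts.append(fraction_below(means, bumped) - fraction_below(means, m))

avg_shift = statistics.mean(shifts)
print(f"average position improvement: {avg_shift:.1%}")
```

The simulated shift depends entirely on the assumed score distribution and reviewer count, so it will not reproduce the paper's 11% figure; it only illustrates the mechanism by which a single-reviewer score change translates into a change in ranking position.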

Suggested Citation

  • Ivan Stelmakh & Charvi Rastogi & Ryan Liu & Shuchi Chawla & Federico Echenique & Nihar B Shah, 2023. "Cite-seeing and reviewing: A study on citation bias in peer review," PLOS ONE, Public Library of Science, vol. 18(7), pages 1-16, July.
  • Handle: RePEc:plo:pone00:0283980
    DOI: 10.1371/journal.pone.0283980

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0283980
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0283980&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0283980?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    References listed on IDEAS

    1. Danielle Li, 2017. "Expertise versus Bias in Evaluation: Evidence from the NIH," American Economic Journal: Applied Economics, American Economic Association, vol. 9(2), pages 60-92, April.
    2. Cassidy R. Sugimoto & Blaise Cronin, 2013. "Citation gamesmanship: testing for evidence of ego bias in peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 95(3), pages 851-862, June.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.
    Cited by:

    1. Ryan Liu & Steven Jecmen & Vincent Conitzer & Fei Fang & Nihar B Shah, 2024. "Testing for reviewer anchoring in peer review: A randomized controlled trial," PLOS ONE, Public Library of Science, vol. 19(11), pages 1-19, November.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Mitchell Hoffman & Lisa B Kahn & Danielle Li, 2018. "Discretion in Hiring," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 133(2), pages 765-800.
    2. Nicolas Carayol & Emeric Henry & Marianne Lanoë, 2020. "Stimulating Peer Effects? Evidence from a Research Cluster Policy," Working Papers hal-03874261, HAL.
    3. Chen, Yu & Wang, Yuandi & Hu, Die & Zhou, Zhao, 2020. "Government R&D subsidies, information asymmetry, and the role of foreign investors: Evidence from a quasi-natural experiment on the Shanghai-Hong Kong Stock Connect," Technological Forecasting and Social Change, Elsevier, vol. 158(C).
    4. Jonas Radbruch & Amelie Schiprowski, 2025. "Interview Sequences and the Formation of Subjective Assessments," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 92(2), pages 1226-1256.
    5. Lawson, Cornelia & Salter, Ammon, 2023. "Exploring the effect of overlapping institutional applications on panel decision-making," Research Policy, Elsevier, vol. 52(9).
    6. Vollaard, Ben & van Ours, Jan C., 2022. "Bias in expert product reviews," Journal of Economic Behavior & Organization, Elsevier, vol. 202(C), pages 105-118.
    7. Kyle R. Myers, 2022. "Some Tradeoffs of Competition in Grant Contests," Papers 2207.02379, arXiv.org, revised Mar 2024.
    8. Pierre Deschamps, 2024. "Gender Quotas in Hiring Committees: A Boon or a Bane for Women?," Management Science, INFORMS, vol. 70(11), pages 7486-7505, November.
    9. Amitabh Chandra & Courtney Coile & Corina Mommaerts, 2023. "What Can Economics Say about Alzheimer's Disease?," Journal of Economic Literature, American Economic Association, vol. 61(2), pages 428-470, June.
    10. Mitchell Hoffman & Christopher T. Stanton, 2024. "People, Practices, and Productivity: A Review of New Advances in Personnel Economics," NBER Working Papers 32849, National Bureau of Economic Research, Inc.
    11. Charles Ayoubi & Michele Pezzoni & Fabiana Visentin, 2021. "Does It Pay to Do Novel Science? The Selectivity Patterns in Science Funding," Science and Public Policy, Oxford University Press, vol. 48(5), pages 635-648.
    12. Wei Fu & Shin-Yi Chou & Li-San Wang, 2022. "NIH Grant Expansion, Ancestral Diversity and Scientific Discovery in Genomics Research," NBER Working Papers 30155, National Bureau of Economic Research, Inc.
    13. Chiara Franzoni & Paula Stephan & Reinhilde Veugelers, 2022. "Funding Risky Research," Entrepreneurship and Innovation Policy and the Economy, University of Chicago Press, vol. 1(1), pages 103-133.
    14. David Card & Stefano DellaVigna, 2020. "What Do Editors Maximize? Evidence from Four Economics Journals," The Review of Economics and Statistics, MIT Press, vol. 102(1), pages 195-217, March.
    15. David C Chan & Michael J Dickstein, 2019. "Industry Input in Policy Making: Evidence from Medicare," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 134(3), pages 1299-1342.
    16. Christoph Riedl & Tom Grad & Christopher Lettl, 2024. "Competition and Collaboration in Crowdsourcing Communities: What happens when peers evaluate each other?," Papers 2404.14141, arXiv.org.
    17. Jorge Guzman & Fiona Murray & Scott Stern & Heidi Williams, 2024. "Accelerating Innovation Ecosystems: The Promise and Challenges of Regional Innovation Engines," Entrepreneurship and Innovation Policy and the Economy, University of Chicago Press, vol. 3(1), pages 9-75.
    18. Mikko Packalen & Jay Bhattacharya, 2018. "Does the NIH Fund Edge Science?," NBER Working Papers 24860, National Bureau of Economic Research, Inc.
    19. Kok, Holmer & Faems, Dries & de Faria, Pedro, 2022. "Pork Barrel or Barrel of Gold? Examining the performance implications of earmarking in public R&D grants," Research Policy, Elsevier, vol. 51(7).
    20. Pierre Azoulay & Danielle Li, 2020. "Scientific Grant Funding," NBER Chapters, in: Innovation and Public Policy, pages 117-150, National Bureau of Economic Research, Inc.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0283980. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.