
The sociology of scientific validity: How professional networks shape judgement in peer review

Author

Listed:
  • Teplitskiy, Misha
  • Acuna, Daniel
  • Elamrani-Raoult, Aïda
  • Körding, Konrad
  • Evans, James

Abstract

Professional connections between the creators and evaluators of scientific work are ubiquitous, and the possibility of bias ever-present. Although connections have been shown to bias predictions of uncertain future performance, it is unknown whether such biases occur in the more concrete task of assessing scientific validity for completed works, and if so, how. This study presents evidence that connections between authors and reviewers of neuroscience manuscripts are associated with biased judgments and explores the mechanisms driving that effect. Using reviews from 7981 neuroscience manuscripts submitted to the journal PLOS ONE, which instructs reviewers to evaluate manuscripts on scientific validity alone, we find that reviewers favored authors close in the co-authorship network by ∼0.11 points on a 1.0–4.0 scale for each step of proximity. PLOS ONE’s validity-focused review and the substantial favoritism shown by distant vs. very distant reviewers, both of whom should have little to gain from nepotism, point to the central role of substantive disagreements between scientists in different professional networks (“schools of thought”). These results suggest that removing bias from peer review cannot be accomplished simply by recusing closely connected reviewers, and highlight the value of recruiting reviewers embedded in diverse professional networks.
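
The headline estimate (reviewers favor authors by ∼0.11 points per step of co-authorship proximity on a 1.0–4.0 scale) can be made concrete with a short sketch. The Python snippet below is illustrative only, not the authors' analysis code: the toy graph, the node names, the six-step distance cap, and the strictly linear reading of the per-step effect are all assumptions introduced here. It computes a reviewer–author shortest-path distance in a co-authorship network and the review-score shift that distance would imply under that linear reading.

    # Illustrative sketch only; not the authors' analysis code.
    # Assumes a strictly linear reading of the reported effect
    # (~0.11 points per step of proximity on PLOS ONE's 1.0-4.0 scale)
    # and a hypothetical six-step cap for "very distant" pairs.
    import networkx as nx

    # Toy co-authorship network: nodes are researchers, an edge means
    # the two have co-authored at least one paper together.
    G = nx.Graph()
    G.add_edges_from([
        ("reviewer", "colleague_a"),
        ("colleague_a", "author_x"),  # reviewer is 2 steps from author_x
        ("author_x", "author_y"),
    ])

    EFFECT_PER_STEP = 0.11  # reported favoritism per step of proximity
    MAX_DISTANCE = 6        # assumed cap; treats disconnected pairs as maximally distant

    def expected_score_shift(graph, reviewer, author):
        """Expected review-score difference relative to a maximally
        distant reviewer, under the linear per-step reading above."""
        try:
            distance = nx.shortest_path_length(graph, reviewer, author)
        except nx.NetworkXNoPath:
            distance = MAX_DISTANCE
        steps_closer = max(MAX_DISTANCE - distance, 0)
        return steps_closer * EFFECT_PER_STEP

    print(expected_score_shift(G, "reviewer", "author_x"))  # 2 steps away -> 0.44

On this toy graph the reviewer sits two steps from author_x, so the linear reading predicts a score roughly 0.44 points above what a maximally distant reviewer would give; the paper's actual models are multivariate, and the article should be consulted for the real specification.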

Suggested Citation

  • Teplitskiy, Misha & Acuna, Daniel & Elamrani-Raoult, Aïda & Körding, Konrad & Evans, James, 2018. "The sociology of scientific validity: How professional networks shape judgement in peer review," Research Policy, Elsevier, vol. 47(9), pages 1825-1841.
  • Handle: RePEc:eee:respol:v:47:y:2018:i:9:p:1825-1841
    DOI: 10.1016/j.respol.2018.06.014

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0048733318301598
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.respol.2018.06.014?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can access this item with your library subscription

    As access to this document is restricted, you may want to search for a different version of it.


    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Lokman Tutuncu & Recep Yucedogru & Idris Sarisoy, 2022. "Academic favoritism at work: insider bias in Turkish national journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2547-2576, May.
    2. Lokman Tutuncu, 2023. "All-pervading insider bias alters review time in Turkish university journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(6), pages 3743-3791, June.
    3. Federico Bianchi & Flaminio Squazzoni, 2022. "Can transparency undermine peer review? A simulation model of scientist behavior under open peer review [Reviewing Peer Review]," Science and Public Policy, Oxford University Press, vol. 49(5), pages 791-800.
    4. Federica Bologna & Angelo Iorio & Silvio Peroni & Francesco Poggi, 2023. "Do open citations give insights on the qualitative peer-review evaluation in research assessments? An analysis of the Italian National Scientific Qualification," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(1), pages 19-53, January.
    5. Rodríguez Sánchez, Isabel & Makkonen, Teemu & Williams, Allan M., 2019. "Peer review assessment of originality in tourism journals: critical perspective of key gatekeepers," Annals of Tourism Research, Elsevier, vol. 77(C), pages 1-11.
    6. Balietti, Stefano & Riedl, Christoph, 2021. "Incentives, competition, and inequality in markets for creative production," Research Policy, Elsevier, vol. 50(4).
    7. Petersen, Alexander M., 2019. "Megajournal mismanagement: Manuscript decision bias and anomalous editor activity at PLOS ONE," Journal of Informetrics, Elsevier, vol. 13(4).
    8. Zhang, Guangyao & Xu, Shenmeng & Sun, Yao & Jiang, Chunlin & Wang, Xianwen, 2022. "Understanding the peer review endeavor in scientific publishing," Journal of Informetrics, Elsevier, vol. 16(2).
    9. Nida ul Habib Bajwa & Markus Langer & Cornelius J. König & Hannah Honecker, 2019. "What might get published in management and applied psychology? Experimentally manipulating implicit expectations of reviewers regarding hedges," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(3), pages 1351-1371, September.
    10. Carol Nash, 2023. "Roles and Responsibilities for Peer Reviewers of International Journals," Publications, MDPI, vol. 11(2), pages 1-24, June.
    11. Ductor, Lorenzo & Visser, Bauke, 2022. "When a coauthor joins an editorial board," Journal of Economic Behavior & Organization, Elsevier, vol. 200(C), pages 576-595.
    12. Akbaritabar, Aliakbar & Stephen, Dimity & Squazzoni, Flaminio, 2022. "A study of referencing changes in preprint-publication pairs across multiple fields," Journal of Informetrics, Elsevier, vol. 16(2).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Albert Banal-Estañol & Qianshuo Liu & Inés Macho-Stadler & David Pérez-Castrillo, 2021. "Similar-to-me Effects in the Grant Application Process: Applicants, Panelists, and the Likelihood of Obtaining Funds," Working Papers 1289, Barcelona School of Economics.
    2. Banal-Estañol, Albert & Macho-Stadler, Inés & Pérez-Castrillo, David, 2019. "Evaluation in research funding agencies: Are structurally diverse teams biased against?," Research Policy, Elsevier, vol. 48(7), pages 1823-1840.
    3. Stephen A Gallo & Joanne H Sullivan & Scott R Glisson, 2016. "The Influence of Peer Reviewer Expertise on the Evaluation of Research Funding Applications," PLOS ONE, Public Library of Science, vol. 11(10), pages 1-18, October.
    4. Kok, Holmer & Faems, Dries & de Faria, Pedro, 2022. "Pork Barrel or Barrel of Gold? Examining the performance implications of earmarking in public R&D grants," Research Policy, Elsevier, vol. 51(7).
    5. Carole J. Lee & Cassidy R. Sugimoto & Guo Zhang & Blaise Cronin, 2013. "Bias in peer review," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(1), pages 2-17, January.
    6. Lawson, Cornelia & Salter, Ammon, 2023. "Exploring the effect of overlapping institutional applications on panel decision-making," Research Policy, Elsevier, vol. 52(9).
    7. David Card & Stefano DellaVigna, 2017. "What do Editors Maximize? Evidence from Four Leading Economics Journals," NBER Working Papers 23282, National Bureau of Economic Research, Inc.
    8. David Card & Stefano DellaVigna, 2020. "What Do Editors Maximize? Evidence from Four Economics Journals," The Review of Economics and Statistics, MIT Press, vol. 102(1), pages 195-217, March.
    9. Albert Banal-Estañol & Inés Macho-Stadler & David Pérez-Castrillo, 2019. "Funding academic research: grant application, partnership, award, and output," Economics Working Papers 1658, Department of Economics and Business, Universitat Pompeu Fabra.
    10. Vincent Chandler, 2019. "Identifying emerging scholars: seeing through the crystal ball of scholarship selection committees," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(1), pages 39-56, July.
    11. Kevin J. Boudreau & Karim R. Lakhani, 2016. "Innovation Experiments: Researching Technical Advance, Knowledge Production, and the Design of Supporting Institutions," Innovation Policy and the Economy, University of Chicago Press, vol. 16(1), pages 135-167.
    12. Jens Jirschitzka & Aileen Oeberst & Richard Göllner & Ulrike Cress, 2017. "Inter-rater reliability and validity of peer reviews in an interdisciplinary field," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(2), pages 1059-1092, November.
    13. Lokman Tutuncu, 2023. "All-pervading insider bias alters review time in Turkish university journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(6), pages 3743-3791, June.
    14. Laura Hospido & Carlos Sanz, 2021. "Gender Gaps in the Evaluation of Research: Evidence from Submissions to Economics Conferences," Oxford Bulletin of Economics and Statistics, Department of Economics, University of Oxford, vol. 83(3), pages 590-618, June.
    15. Bradford Demarest & Guo Freeman & Cassidy R. Sugimoto, 2014. "The reviewer in the mirror: examining gendered and ethnicized notions of reciprocity in peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(1), pages 717-735, October.
    16. Gaëlle Vallée-Tourangeau & Ana Wheelock & Tushna Vandrevala & Priscilla Harries, 2022. "Peer reviewers’ dilemmas: a qualitative exploration of decisional conflict in the evaluation of grant applications in the medical humanities and social sciences," Palgrave Communications, Palgrave Macmillan, vol. 9(1), pages 1-11, December.
    17. Marco Seeber & Jef Vlegels & Mattia Cattaneo, 2022. "Conditions that do or do not disadvantage interdisciplinary research proposals in project evaluation," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 73(8), pages 1106-1126, August.
    18. Kevin J. Boudreau & Eva C. Guinan & Karim R. Lakhani & Christoph Riedl, 2016. "Looking Across and Looking Beyond the Knowledge Frontier: Intellectual Distance, Novelty, and Resource Allocation in Science," Management Science, INFORMS, vol. 62(10), pages 2765-2783, October.
    19. Lu Liu & Benjamin F. Jones & Brian Uzzi & Dashun Wang, 2023. "Data, measurement and empirical methods in the science of science," Nature Human Behaviour, Nature, vol. 7(7), pages 1046-1058, July.
    20. Bransch, Felix & Kvasnicka, Michael, 2022. "Male Gatekeepers: Gender Bias in the Publishing Process?," Journal of Economic Behavior & Organization, Elsevier, vol. 202(C), pages 714-732.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:respol:v:47:y:2018:i:9:p:1825-1841. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to it that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/respol.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.