
Evaluating solutions to the problem of false positives

Author

Listed:
  • Gall, Thomas
  • Maniadis, Zacharias

Abstract

A current challenge for the scientific community is the choice of appropriate policies to reduce the rate of false positives. Existing proposals differ in whether they prioritize tackling omission through transparency requirements, punishing more severe transgressions, or both. We use a formal model to evaluate these possible solutions. We find that a policy that prohibitively increases the cost of ‘misdemeanor’ types of questionable research practices robustly decreases the overall rate of researcher misconduct, because the rate of ‘felonies’, such as fabrication, also decreases. Therefore, proposals that aim to prevent lying by omission by enforcing reporting guidelines are likely to be effective in reducing researcher misconduct, whereas measures such as government audits (purported to counteract pure fraud) can backfire. Moreover, we find that an increase in the rewards of publication need not increase overall misconduct.
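The policy comparison in the abstract lends itself to a small numerical illustration. The sketch below is not the authors' model; it only assumes, for illustration, that researchers differ in the reward they attach to a publication, that each picks whichever practice maximises expected payoff, and that a fabricated (‘felony’) result must also be dressed up with ‘misdemeanor’ questionable research practices before submission, so fraud bears both costs. All parameter values are invented. Under these assumptions the sketch reproduces two of the comparative statics described above: making QRPs prohibitively costly removes fabrication as well, while raising only the cost of fraud shifts researchers into QRPs (the paper's equilibrium analysis shows the latter can even backfire overall, which a static sketch like this cannot capture).

```python
# Illustrative sketch only -- NOT the model in Gall & Maniadis (2019).
# Assumptions (invented for illustration): researchers differ in the reward r
# they attach to a publication, choose the payoff-maximising practice, and a
# fabricated ("felony") result must also be concealed with "misdemeanor" QRPs,
# so fabrication bears both costs.

P_PUBLISH = {"honest": 0.2, "qrp": 0.5, "fabricate": 0.9}  # chance of publication

def best_action(r, qrp_cost, fraud_cost):
    """Return the payoff-maximising practice for a researcher with reward r."""
    payoffs = {
        "honest": P_PUBLISH["honest"] * r,
        "qrp": P_PUBLISH["qrp"] * r - qrp_cost,
        # assumption: fabrication also requires QRP-style concealment
        "fabricate": P_PUBLISH["fabricate"] * r - qrp_cost - fraud_cost,
    }
    return max(payoffs, key=payoffs.get)

def shares(qrp_cost, fraud_cost, rewards):
    """Share of the researcher population choosing each practice."""
    counts = {"honest": 0, "qrp": 0, "fabricate": 0}
    for r in rewards:
        counts[best_action(r, qrp_cost, fraud_cost)] += 1
    return {a: counts[a] / len(rewards) for a in counts}

rewards = [0.1 * k for k in range(1, 201)]  # rewards spread evenly over (0, 20]

policies = {
    "baseline":         (1.0, 6.0),   # cheap QRPs, moderately costly fraud
    "QRP crackdown":    (20.0, 6.0),  # reporting guidelines make QRPs prohibitive
    "fraud audit only": (1.0, 14.0),  # audits raise only the cost of fabrication
}

for label, (cq, cf) in policies.items():
    s = shares(cq, cf, rewards)
    print(f"{label:18s} honest={s['honest']:.2f} qrp={s['qrp']:.2f} "
          f"fabricate={s['fabricate']:.2f}")
```

Running the sketch prints the population share choosing each practice under the three stylised policies: the QRP crackdown drives both misdemeanors and felonies to zero, while the fraud-only audit merely relabels felonies as misdemeanors.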

Suggested Citation

  • Gall, Thomas & Maniadis, Zacharias, 2019. "Evaluating solutions to the problem of false positives," Research Policy, Elsevier, vol. 48(2), pages 506-515.
  • Handle: RePEc:eee:respol:v:48:y:2019:i:2:p:506-515
    DOI: 10.1016/j.respol.2017.12.005

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0048733317302111
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.respol.2017.12.005?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Daniele Fanelli, 2013. "Redefine misconduct as distorted reporting," Nature, Nature, vol. 494(7436), pages 149-149, February.
    2. Navin Kartik, 2009. "Strategic Communication with Lying Costs," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 76(4), pages 1359-1395.
    3. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    4. Kiri, Bralind & Lacetera, Nicola & Zirulia, Lorenzo, 2018. "Above a swamp: A theory of high-quality scientific production," Research Policy, Elsevier, vol. 47(5), pages 827-839.
    5. David Moher & Alessandro Liberati & Jennifer Tetzlaff & Douglas G Altman & The PRISMA Group, 2009. "Preferred Reporting Items for Systematic Reviews and Meta-Analyses: The PRISMA Statement," PLOS Medicine, Public Library of Science, vol. 6(7), pages 1-6, July.
    6. Glenn Ellison, 2002. "Evolving Standards for Academic Publishing: A q-r Theory," Journal of Political Economy, University of Chicago Press, vol. 110(5), pages 994-1034, October.
    7. Thomas Gall & John P A Ioannidis & Zacharias Maniadis, 2017. "The credibility crisis in research: Can economics tools help?," PLOS Biology, Public Library of Science, vol. 15(4), pages 1-13, April.
    8. Richard A. Bettis, 2012. "The search for asterisks: Compromised statistical tests and flawed theories," Strategic Management Journal, Wiley Blackwell, vol. 33(1), pages 108-113, January.
    9. Zacharias Maniadis & Fabio Tufano & John A. List, 2015. "How to Make Experimental Economics Research More Reproducible: Lessons from Other Disciplines and a New Proposal," Research in Experimental Economics, in: Replication in Experimental Economics, volume 18, pages 215-230, Emerald Group Publishing Limited.
    10. Kenneth F Schulz & Douglas G Altman & David Moher & for the CONSORT Group, 2010. "CONSORT 2010 Statement: Updated Guidelines for Reporting Parallel Group Randomised Trials," PLOS Medicine, Public Library of Science, vol. 7(3), pages 1-7, March.
    11. Drew Fudenberg & David K. Levine, 1998. "The Theory of Learning in Games," MIT Press Books, The MIT Press, edition 1, volume 1, number 0262061945, December.
    12. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    13. Daniele Fanelli, 2009. "How Many Scientists Fabricate and Falsify Research? A Systematic Review and Meta-Analysis of Survey Data," PLOS ONE, Public Library of Science, vol. 4(5), pages 1-11, May.
    14. Sebastian Galiani & Paul Gertler & Mauricio Romero, 2017. "Incentives for Replication in Economics," NBER Working Papers 23576, National Bureau of Economic Research, Inc.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Emilija Stojmenova Duh & Andrej Duh & Uroš Droftina & Tim Kos & Urban Duh & Tanja Simonič Korošak & Dean Korošak, 2019. "Publish-and-Flourish: Using Blockchain Platform to Enable Cooperative Scholarly Communication," Publications, MDPI, vol. 7(2), pages 1-15, May.
    2. Horton, Joanne & Krishna Kumar, Dhanya & Wood, Anthony, 2020. "Detecting academic fraud using Benford law: The case of Professor James Hunton," Research Policy, Elsevier, vol. 49(8).
    3. Mohan, Vijay, 2019. "On the use of blockchain-based mechanisms to tackle academic misconduct," Research Policy, Elsevier, vol. 48(9), pages 1-1.
    4. Strømland, Eirik, 2019. "Preregistration and reproducibility," Journal of Economic Psychology, Elsevier, vol. 75(PA).
    5. Salandra, Rossella & Criscuolo, Paola & Salter, Ammon, 2021. "Directing scientists away from potentially biased publications: the role of systematic reviews in health care," Research Policy, Elsevier, vol. 50(1).
    6. Herresthal, Claudia, 2022. "Hidden testing and selective disclosure of evidence," Journal of Economic Theory, Elsevier, vol. 200(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Mueller-Langer, Frank & Fecher, Benedikt & Harhoff, Dietmar & Wagner, Gert G., 2019. "Replication studies in economics—How many and which papers are chosen for replication, and why?," Research Policy, Elsevier, vol. 48(1), pages 62-83.
    2. Bergemann, Dirk & Ottaviani, Marco, 2021. "Information Markets and Nonmarkets," CEPR Discussion Papers 16459, C.E.P.R. Discussion Papers.
    3. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    4. Zacharias Maniadis & Fabio Tufano & John A. List, 2017. "To Replicate or Not To Replicate? Exploring Reproducibility in Economics through the Lens of a Model and a Pilot Study," Economic Journal, Royal Economic Society, vol. 127(605), pages 209-235, October.
    5. Hensel, Przemysław G., 2019. "Supporting replication research in management journals: Qualitative analysis of editorials published between 1970 and 2015," European Management Journal, Elsevier, vol. 37(1), pages 45-57.
    6. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    7. Thibaut Arpinon & Romain Espinosa, 2023. "A practical guide to Registered Reports for economists," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 9(1), pages 90-122, June.
    8. Kiri, Bralind & Lacetera, Nicola & Zirulia, Lorenzo, 2018. "Above a swamp: A theory of high-quality scientific production," Research Policy, Elsevier, vol. 47(5), pages 827-839.
    9. Thibaut Arpinon & Romain Espinosa, 2023. "A Practical Guide to Registered Reports for Economists," Post-Print halshs-03897719, HAL.
    10. Gary A. Hoover & Christian Hopp, 2017. "What Crisis? Taking Stock of Management Researchers' Experiences with and Views of Scholarly Misconduct," CESifo Working Paper Series 6611, CESifo.
    11. Michał Krawczyk, 2015. "The Search for Significance: A Few Peculiarities in the Distribution of P Values in Experimental Psychology Literature," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-19, June.
    12. Drazen, Allan & Dreber, Anna & Ozbay, Erkut Y. & Snowberg, Erik, 2021. "Journal-based replication of experiments: An application to “Being Chosen to Lead”," Journal of Public Economics, Elsevier, vol. 202(C).
    13. Strømland, Eirik, 2019. "Preregistration and reproducibility," Journal of Economic Psychology, Elsevier, vol. 75(PA).
    14. Lionel Page & Charles N. Noussair & Robert Slonim, 2021. "The replication crisis, the rise of new research practices and what it means for experimental economics," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 7(2), pages 210-225, December.
    15. Bruns, Stephan B. & Asanov, Igor & Bode, Rasmus & Dunger, Melanie & Funk, Christoph & Hassan, Sherif M. & Hauschildt, Julia & Heinisch, Dominik & Kempa, Karol & König, Johannes & Lips, Johannes & Verb, 2019. "Reporting errors and biases in published empirical findings: Evidence from innovation research," Research Policy, Elsevier, vol. 48(9), pages 1-1.
    16. Su Keng Tan & Wai Keung Leung & Alexander Tin Hong Tang & Roger A Zwahlen, 2017. "Effects of mandibular setback with or without maxillary advancement osteotomies on pharyngeal airways: An overview of systematic reviews," PLOS ONE, Public Library of Science, vol. 12(10), pages 1-20, October.
    17. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    18. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    19. Jovana Kuzmanovic Pficer & Slobodan Dodic & Vojkan Lazic & Goran Trajkovic & Natasa Milic & Biljana Milicic, 2017. "Occlusal stabilization splint for patients with temporomandibular disorders: Meta-analysis of short and long term effects," PLOS ONE, Public Library of Science, vol. 12(2), pages 1-21, February.
    20. Stanley, T. D. & Doucouliagos, Chris, 2019. "Practical Significance, Meta-Analysis and the Credibility of Economics," IZA Discussion Papers 12458, Institute of Labor Economics (IZA).

    More about this item

    Keywords

    Researcher misconduct; Reproducibility; False positives; Questionable research practices

    JEL classification:

    • C72 - Mathematical and Quantitative Methods - - Game Theory and Bargaining Theory - - - Noncooperative Games
    • Z1 - Other Special Topics - - Cultural Economics
    • L38 - Industrial Organization - - Nonprofit Organizations and Public Enterprise - - - Public Policy

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:respol:v:48:y:2019:i:2:p:506-515. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/respol.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.