
Redefine Statistical Significance

Author

Listed:
  • Daniel Benjamin
  • James Berger
  • Magnus Johannesson
  • Brian Nosek
  • E. Wagenmakers
  • Richard Berk
  • Kenneth Bollen
  • Björn Brembs
  • Lawrence Brown
  • Colin Camerer
  • David Cesarini
  • Christopher Chambers
  • Merlise Clyde
  • Thomas Cook
  • Paul De Boeck
  • Zoltan Dienes
  • Anna Dreber
  • Kenny Easwaran
  • Charles Efferson
  • Ernst Fehr
  • Fiona Fidler
  • Andy Field
  • Malcolm Forster
  • Edward George
  • Tarun Ramadorai
  • Richard Gonzalez
  • Steven Goodman
  • Edwin Green
  • Donald Green
  • Anthony Greenwald
  • Jarrod Hadfield
  • Larry Hedges
  • Leonhard Held
  • Teck Hua Ho
  • Herbert Hoijtink
  • James Jones
  • Daniel Hruschka
  • Kosuke Imai
  • Guido Imbens
  • John Ioannidis
  • Minjeong Jeon
  • Michael Kirchler
  • David Laibson
  • John List
  • Roderick Little
  • Arthur Lupia
  • Edouard Machery
  • Scott Maxwell
  • Michael McCarthy
  • Don Moore
  • Stephen Morgan
  • Marcus Munafò
  • Shinichi Nakagawa
  • Brendan Nyhan
  • Timothy Parker
  • Luis Pericchi
  • Marco Perugini
  • Jeff Rouder
  • Judith Rousseau
  • Victoria Savalei
  • Felix Schönbrodt
  • Thomas Sellke
  • Betsy Sinclair
  • Dustin Tingley
  • Trisha Van Zandt
  • Simine Vazire
  • Duncan Watts
  • Christopher Winship
  • Robert Wolpert
  • Yu Xie
  • Cristobal Young
  • Jonathan Zinman
  • Valen Johnson

Abstract

We propose to change the default P-value threshold for statistical significance for claims of new discoveries from 0.05 to 0.005.
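
The reasoning behind the 0.005 threshold can be illustrated with the p-value-to-Bayes-factor calibration of Sellke, Bayarri & Berger (2001), listed in the references below: for a precise (point) null hypothesis and a p-value p < 1/e, the Bayes factor against the null can be at most 1/(-e · p · ln p). The sketch below (in Python; the function names are illustrative, not taken from the paper) evaluates this bound and the implied lower bound on the posterior probability of the null under 1:1 prior odds.

    import math

    def bayes_factor_bound(p):
        # Sellke-Bayarri-Berger (2001) calibration: for 0 < p < 1/e, the Bayes
        # factor against a point null can be no larger than 1 / (-e * p * ln p).
        if not 0 < p < 1 / math.e:
            raise ValueError("bound applies for 0 < p < 1/e")
        return 1.0 / (-math.e * p * math.log(p))

    def min_posterior_prob_null(p, prior_odds_null=1.0):
        # Lower bound on P(H0 | data) implied by the Bayes-factor bound,
        # given prior odds of H0 to H1 (1:1 by default).
        posterior_odds_alt = bayes_factor_bound(p) / prior_odds_null
        return 1.0 / (1.0 + posterior_odds_alt)

    for p in (0.05, 0.005):
        print(f"p = {p}: Bayes factor bound ~ {bayes_factor_bound(p):.1f}, "
              f"min P(H0 | data) ~ {min_posterior_prob_null(p):.2f}")
    # p = 0.05:  Bayes factor bound ~ 2.5,  min P(H0 | data) ~ 0.29
    # p = 0.005: Bayes factor bound ~ 13.9, min P(H0 | data) ~ 0.07

Under this bound, p = 0.05 corresponds to at most roughly 2.5:1 evidence against a point null, while p = 0.005 corresponds to roughly 14:1, which is one way to read the gap between the current and proposed thresholds.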

Suggested Citation

  • Daniel Benjamin & James Berger & Magnus Johannesson & Brian Nosek & E. Wagenmakers & Richard Berk & Kenneth Bollen & Björn Brembs & Lawrence Brown & Colin Camerer & David Cesarini & Christopher Chambers & et al., 2017. "Redefine Statistical Significance," Artefactual Field Experiments 00612, The Field Experiments Website.
  • Handle: RePEc:feb:artefa:00612

    Download full text from publisher

    File URL: http://s3.amazonaws.com/fieldexperiments-papers2/papers/00612.pdf
    Download Restriction: no

    Other versions of this item:

    • Daniel J. Benjamin & James O. Berger & Magnus Johannesson & Brian A. Nosek & E.-J. Wagenmakers & Richard Berk & Kenneth A. Bollen & Björn Brembs & Lawrence Brown & Colin Camerer & David Cesarini & Christopher Chambers & et al., 2018. "Redefine statistical significance," Nature Human Behaviour, Nature, vol. 2(1), pages 6-10, January.

    References listed on IDEAS

    1. Sellke T. & Bayarri M. J. & Berger J. O., 2001. "Calibration of p Values for Testing Precise Null Hypotheses," The American Statistician, American Statistical Association, vol. 55, pages 62-71, February.
    2. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeister, Felix & et al., 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    2. Jyotirmoy Sarkar, 2018. "Will P-Value Triumph over Abuses and Attacks?," Biostatistics and Biometrics Open Access Journal, Juniper Publishers Inc., vol. 7(4), pages 66-71, July.
    3. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    4. Tom Coupé & W. Robert Reed, 2021. "Do Negative Replications Affect Citations?," Working Papers in Economics 21/14, University of Canterbury, Department of Economics and Finance.
    5. Mueller-Langer, Frank & Andreoli-Versbach, Patrick, 2018. "Open access to research data: Strategic delay and the ambiguous welfare effects of mandatory data disclosure," Information Economics and Policy, Elsevier, vol. 42(C), pages 20-34.
    6. Jindrich Matousek & Tomas Havranek & Zuzana Irsova, 2022. "Individual discount rates: a meta-analysis of experimental evidence," Experimental Economics, Springer;Economic Science Association, vol. 25(1), pages 318-358, February.
    7. Gary Koop & Roberto Leon-Gonzalez & Rodney Strachan, 2008. "Bayesian inference in a cointegrating panel data model," Advances in Econometrics, in: Bayesian Econometrics, pages 433-469, Emerald Group Publishing Limited.
    8. Nick Huntington‐Klein & Andreu Arenas & Emily Beam & Marco Bertoni & Jeffrey R. Bloem & Pralhad Burli & Naibin Chen & Paul Grieco & Godwin Ekpe & Todd Pugatch & Martin Saavedra & Yaniv Stopnitzky, 2021. "The influence of hidden researcher decisions in applied microeconomics," Economic Inquiry, Western Economic Association International, vol. 59(3), pages 944-960, July.
    9. Christopher Snyder & Ran Zhuo, 2018. "Sniff Tests as a Screen in the Publication Process: Throwing out the Wheat with the Chaff," NBER Working Papers 25058, National Bureau of Economic Research, Inc.
    10. Galanis, S. & Ioannou, C. & Kotronis, S., 2019. "Information Aggregation Under Ambiguity: Theory and Experimental Evidence," Working Papers 20/05, Department of Economics, City University London.
    11. Cloos, Janis & Greiff, Matthias & Rusch, Hannes, 2020. "Geographical Concentration and Editorial Favoritism within the Field of Laboratory Experimental Economics (RM/19/029-revised-)," Research Memorandum 014, Maastricht University, Graduate School of Business and Economics (GSBE).
    12. Doucouliagos, Hristos & Paldam, Martin & Stanley, T.D., 2018. "Skating on thin evidence: Implications for public policy," European Journal of Political Economy, Elsevier, vol. 54(C), pages 16-25.
    13. Gechert, Sebastian & Mey, Bianka & Opatrny, Matej & Havranek, Tomas & Stanley, T. D. & Bom, Pedro R. D. & Doucouliagos, Hristos & Heimberger, Philipp & Irsova, Zuzana & Rachinger, Heiko J., 2023. "Conventional Wisdom, Meta-Analysis, and Research Revision in Economics," EconStor Preprints 280745, ZBW - Leibniz Information Centre for Economics.
    14. Kai Ruggeri & Amma Panin & Milica Vdovic & Bojana Većkalov & Nazeer Abdul-Salaam & Jascha Achterberg & Carla Akil & Jolly Amatya & Kanchan Amatya & Thomas Lind Andersen & Sibele D. Aquino & et al., 2022. "The globalizability of temporal discounting," Nature Human Behaviour, Nature, vol. 6(10), pages 1386-1397, October.
    15. Lucas C. Coffman & Muriel Niederle & Alistair J. Wilson, 2017. "A Proposal to Organize and Promote Replications," American Economic Review, American Economic Association, vol. 107(5), pages 41-45, May.
    16. Bigoni, Maria & Camera, Gabriele & Casari, Marco, 2020. "Money is more than memory," Journal of Monetary Economics, Elsevier, vol. 110(C), pages 99-115.
    17. Roy Chen & Yan Chen & Yohanes E. Riyanto, 2021. "Best practices in replication: a case study of common information in coordination games," Experimental Economics, Springer;Economic Science Association, vol. 24(1), pages 2-30, March.
    18. Okunade, Albert & Osmani, Ahmad, 2020. "Effects of life expectancy on economic growth: New results using the flexible Box-Cox power transformation model," EconStor Open Access Articles and Book Chapters, ZBW - Leibniz Information Centre for Economics, issue Latest Ar.
    19. Brice Corgnet & Cary Deck & Mark DeSantis & Kyle Hampton & Erik O. Kimbrough, 2023. "When Do Security Markets Aggregate Dispersed Information?," Management Science, INFORMS, vol. 69(6), pages 3697-3729, June.
    20. Lorko, Matej & Servátka, Maroš & Zhang, Le, 2023. "Hidden inefficiency: Strategic inflation of project schedules," Journal of Economic Behavior & Organization, Elsevier, vol. 206(C), pages 313-326.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:feb:artefa:00612. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: David Franks (email available below). General contact details of provider: http://www.fieldexperiments.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.