Printed from https://ideas.repec.org/p/osf/osfxxx/ps38b.html

A Poor Prognosis for the Diagnostic Screening Critique of Statistical Tests

Author

Listed:
  • Mayo, Deborah
  • Morey, Richard Donald

Abstract

Recently, a number of statistical reformers have argued for conceptualizing significance testing as analogous to diagnostic testing, with a "base rate" of true nulls and a test's error probabilities used to compute a "positive predictive value" or "false discovery rate". These quantities are then used to critique statistical and scientific practice. We argue against this: these quantities are not relevant for evaluating statistical tests, they add to the confusion over significance testing, and they take the focus away from what matters: evidence.
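The screening quantities the abstract refers to can be made concrete with a small sketch. This is a hypothetical illustration of the standard diagnostic-screening arithmetic the critique targets, not code from the paper; the function name and the example numbers (a 10% base rate of true effects, the conventional alpha = 0.05, power = 0.8) are assumptions chosen for illustration:

```python
def positive_predictive_value(prior, alpha, power):
    """PPV under the diagnostic-screening analogy: P(effect is real | test rejects).

    prior -- assumed "base rate" of true effects among hypotheses tested
    alpha -- the test's significance level (false positive rate)
    power -- the test's power (true positive rate)
    """
    true_positives = power * prior          # real effects correctly detected
    false_positives = alpha * (1 - prior)   # true nulls wrongly rejected
    return true_positives / (true_positives + false_positives)

# Illustrative numbers: 10% of tested hypotheses are true effects
ppv = positive_predictive_value(prior=0.1, alpha=0.05, power=0.8)
print(round(ppv, 2))       # PPV: 0.64
print(round(1 - ppv, 2))   # "false discovery rate": 0.36
```

On this analogy, even a well-powered test at alpha = 0.05 yields a "false discovery rate" above a third when true effects are rare; it is the relevance of exactly this calculation to evaluating individual statistical tests that the paper disputes.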

Suggested Citation

  • Mayo, Deborah & Morey, Richard Donald, 2017. "A Poor Prognosis for the Diagnostic Screening Critique of Statistical Tests," OSF Preprints ps38b, Center for Open Science.
  • Handle: RePEc:osf:osfxxx:ps38b
    DOI: 10.31219/osf.io/ps38b

    Download full text from publisher

    File URL: https://osf.io/download/597839476c613b022938a11c/
    Download Restriction: no


    References listed on IDEAS

    1. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    2. Sellke T. & Bayarri M. J. & Berger J. O., 2001. "Calibration of p Values for Testing Precise Null Hypotheses," The American Statistician, American Statistical Association, vol. 55, pages 62-71, February.
    3. Ronald L. Wasserstein & Nicole A. Lazar, 2016. "The ASA's Statement on p-Values: Context, Process, and Purpose," The American Statistician, Taylor & Francis Journals, vol. 70(2), pages 129-133, May.
    4. Steven Goodman & Sander Greenland, 2007. "Why Most Published Research Findings Are False: Problems in the Analysis," PLOS Medicine, Public Library of Science, vol. 4(4), pages 1-1, April.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Sam Sims & Jake Anders & Matthew Inglis & Hugues Lortie-Forgues, 2020. "Quantifying 'promising trials bias' in randomized controlled trials in education," CEPEO Working Paper Series 20-16, UCL Centre for Education Policy and Equalising Opportunities, revised Nov 2020.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Jyotirmoy Sarkar, 2018. "Will P-Value Triumph over Abuses and Attacks?," Biostatistics and Biometrics Open Access Journal, Juniper Publishers Inc., vol. 7(4), pages 66-71, July.
    2. Jesper W. Schneider, 2015. "Null hypothesis significance tests. A mix-up of two different theories: the basis for widespread confusion and numerous misinterpretations," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(1), pages 411-432, January.
    3. Nosek, Brian A. & Ebersole, Charles R. & DeHaven, Alexander Carl & Mellor, David Thomas, 2018. "The Preregistration Revolution," OSF Preprints 2dxu5, Center for Open Science.
    4. Uwe Hassler & Marc‐Oliver Pohle, 2022. "Unlucky Number 13? Manipulating Evidence Subject to Snooping," International Statistical Review, International Statistical Institute, vol. 90(2), pages 397-410, August.
    5. Michaelides, Michael, 2021. "Large sample size bias in empirical finance," Finance Research Letters, Elsevier, vol. 41(C).
    6. Nicolas Vallois & Dorian Jullien, 2017. "Estimating Rationality in Economics: A History of Statistical Methods in Experimental Economics," Working Papers halshs-01651070, HAL.
    7. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    8. David Spiegelhalter, 2017. "Trust in numbers," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 180(4), pages 948-965, October.
    9. Julia Roloff & Michael J. Zyphur, 2019. "Null Findings, Replications and Preregistered Studies in Business Ethics Research," Journal of Business Ethics, Springer, vol. 160(3), pages 609-619, December.
    10. Pathairat Pastpipatkul & Petchaluck Boonyakunakorn & Kanyaphon Phetsakda, 2020. "The Impact of Thailand’s Openness on Bilateral Trade between Thailand and Japan: Copula-Based Markov Switching Seemingly Unrelated Regression Model," Economies, MDPI, vol. 8(1), pages 1-13, January.
    11. Robert Rieg, 2018. "Tasks, interaction and role perception of management accountants: evidence from Germany," Journal of Management Control: Zeitschrift für Planung und Unternehmenssteuerung, Springer, vol. 29(2), pages 183-220, August.
    12. Nicolas Vallois & Dorian Jullien, 2018. "A history of statistical methods in experimental economics," The European Journal of the History of Economic Thought, Taylor & Francis Journals, vol. 25(6), pages 1455-1492, November.
    13. Andrew Y. Chen & Tom Zimmermann, 2022. "Publication Bias in Asset Pricing Research," Papers 2209.13623, arXiv.org, revised Sep 2023.
    14. Herman Carstens & Xiaohua Xia & Sarma Yadavalli, 2018. "Bayesian Energy Measurement and Verification Analysis," Energies, MDPI, vol. 11(2), pages 1-20, February.
    15. Stephan B. Bruns & David I. Stern, 2019. "Lag length selection and p-hacking in Granger causality testing: prevalence and performance of meta-regression models," Empirical Economics, Springer, vol. 56(3), pages 797-830, March.
    16. Rigdon, Edward E., 2016. "Choosing PLS path modeling as analytical method in European management research: A realist perspective," European Management Journal, Elsevier, vol. 34(6), pages 598-605.
    17. Kim, Jae H. & Ji, Philip Inyeob, 2015. "Significance testing in empirical finance: A critical review and assessment," Journal of Empirical Finance, Elsevier, vol. 34(C), pages 1-14.
    18. Lars Ole Schwen & Sabrina Rueschenbaum, 2018. "Ten quick tips for getting the most scientific value out of numerical data," PLOS Computational Biology, Public Library of Science, vol. 14(10), pages 1-21, October.
    19. Alberto Abadie, 2020. "Statistical Nonsignificance in Empirical Economics," American Economic Review: Insights, American Economic Association, vol. 2(2), pages 193-208, June.
    20. Hirschauer Norbert & Grüner Sven & Mußhoff Oliver & Frey Ulrich & Theesfeld Insa & Wagner Peter, 2016. "Die Interpretation des p-Wertes – Grundsätzliche Missverständnisse," Journal of Economics and Statistics (Jahrbuecher fuer Nationaloekonomie und Statistik), De Gruyter, vol. 236(5), pages 557-575, October.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:osfxxx:ps38b. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF (email available below). General contact details of provider: https://osf.io/preprints/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.