Printed from https://ideas.repec.org/p/osf/osfxxx/32p8t.html

The Ellipse of Insignificance: a refined fragility index for ascertaining robustness of results in dichotomous outcome trials

Author

  • Grimes, David Robert

Abstract

There is increasing awareness throughout biomedical science that many results do not withstand the trials of repeat investigation. The growing abundance of medical literature has only increased the urgent need for tools to gauge the robustness and trustworthiness of published science. Dichotomous outcome designs are vital in randomized clinical trials, cohort studies, and observational data for ascertaining differences between experimental and control arms. It has, however, been shown with tools like the fragility index (FI) that many ostensibly impactful results fail to materialise when even small numbers of patients in either the control or experimental arm are recoded from event to non-event. Critics of this metric counter that there is no objective means to determine a meaningful FI. As currently used, FI is not multi-dimensional and is computationally expensive. In this work a conceptually similar geometrical approach is introduced, the ellipse of insignificance (EOI). This method yields precise deterministic values for the degree of manipulation or miscoding that can be tolerated simultaneously in both control and experimental arms, allowing the derivation of objective measures of experimental robustness. Moreover, the tool is intimately connected with the sensitivity and specificity of the event/non-event tests, and is readily combined with knowledge of test parameters to reject unsound results. The method is outlined here, with illustrative clinical examples.
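To make the baseline concrete, below is a minimal, stdlib-only sketch of the classical fragility index the abstract contrasts with the EOI: recode patients in one arm, one at a time, until a Fisher exact test loses significance. The `fisher_exact_p` helper and the recode-toward-the-null rule are illustrative conventions for this sketch, not the paper's exact procedure.

```python
from math import comb

def fisher_exact_p(a: int, b: int, c: int, d: int) -> float:
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of every table with the same
    margins that is no more likely than the observed one.
    """
    n = a + b + c + d
    row1, col1 = a + b, a + c

    def prob(x: int) -> float:
        # P(first cell = x) under the hypergeometric distribution
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = prob(a)
    lo, hi = max(0, row1 - (n - col1)), min(row1, col1)
    # small tolerance so tables tied with the observed one are counted
    return sum(p for x in range(lo, hi + 1)
               if (p := prob(x)) <= p_obs * (1 + 1e-9))

def fragility_index(ev_t, n_t, ev_c, n_c, alpha=0.05):
    """Fewest event/non-event recodings in the treatment arm that lift the
    p-value to alpha or above; None if already non-significant."""
    if fisher_exact_p(ev_t, n_t - ev_t, ev_c, n_c - ev_c) >= alpha:
        return None
    step = 1 if ev_t / n_t < ev_c / n_c else -1  # recode toward the null
    for k in range(1, n_t + 1):
        e = ev_t + step * k
        if not 0 <= e <= n_t:
            break
        if fisher_exact_p(e, n_t - e, ev_c, n_c - ev_c) >= alpha:
            return k
    return None
```

Note the one-dimensional, iterative character of this search, which is the computational cost the abstract refers to; the EOI instead treats recodings in both arms as a pair of coordinates and characterises the tolerable region geometrically in closed form.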

Suggested Citation

  • Grimes, David Robert, 2022. "The Ellipse of Insignificance: a refined fragility index for ascertaining robustness of results in dichotomous outcome trials," OSF Preprints 32p8t, Center for Open Science.
  • Handle: RePEc:osf:osfxxx:32p8t
    DOI: 10.31219/osf.io/32p8t

    Download full text from publisher

    File URL: https://osf.io/download/624232a11b0a6e00bbf798ba/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/32p8t?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    2. Lonni Besançon & Elisabeth Bik & James Heathers & Gideon Meyerowitz-Katz, 2022. "Correction of scientific literature: Too little, too late!," PLOS Biology, Public Library of Science, vol. 20(3), pages 1-4, March.
    3. Michał Krawczyk, 2015. "The Search for Significance: A Few Peculiarities in the Distribution of P Values in Experimental Psychology Literature," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-19, June.
    4. Daniele Fanelli, 2009. "How Many Scientists Fabricate and Falsify Research? A Systematic Review and Meta-Analysis of Survey Data," PLOS ONE, Public Library of Science, vol. 4(5), pages 1-11, May.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Frederique Bordignon, 2020. "Self-correction of science: a comparative study of negative citations and post-publication peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(2), pages 1225-1239, August.
    2. Brian Fabo & Martina Jancokova & Elisabeth Kempf & Lubos Pastor, 2020. "Fifty Shades of QE: Conflicts of Interest in Economic Research," Working and Discussion Papers WP 5/2020, Research Department, National Bank of Slovakia.
    3. Stephan B Bruns & John P A Ioannidis, 2016. "p-Curve and p-Hacking in Observational Research," PLOS ONE, Public Library of Science, vol. 11(2), pages 1-13, February.
    4. David Spiegelhalter, 2017. "Trust in numbers," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 180(4), pages 948-965, October.
    5. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    6. Fabo, Brian & Jančoková, Martina & Kempf, Elisabeth & Pástor, Ľuboš, 2021. "Fifty shades of QE: Comparing findings of central bankers and academics," Journal of Monetary Economics, Elsevier, vol. 120(C), pages 1-20.
    7. Koessler, Ann-Kathrin & Page, Lionel & Dulleck, Uwe, 2015. "Promoting pro-social behavior with public statements of good intent," MPRA Paper 80072, University Library of Munich, Germany, revised 24 May 2017.
    8. David Pontille & Didier Torny, 2013. "Behind the scenes of scientific articles: defining categories of fraud and regulating cases," CSI Working Papers Series 031, Centre de Sociologie de l'Innovation (CSI), Mines ParisTech.
    9. Ádám Kun, 2018. "Publish and Who Should Perish: You or Science?," Publications, MDPI, vol. 6(2), pages 1-16, April.
    10. Herman Aguinis & Wayne F. Cascio & Ravi S. Ramani, 2017. "Science’s reproducibility and replicability crisis: International business is not immune," Journal of International Business Studies, Palgrave Macmillan;Academy of International Business, vol. 48(6), pages 653-663, August.
    11. Dag W. Aksnes & Liv Langfeldt & Paul Wouters, 2019. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories," SAGE Open, , vol. 9(1), pages 21582440198, February.
    12. Klaas Sijtsma, 2016. "Playing with Data—Or How to Discourage Questionable Research Practices and Stimulate Researchers to Do Things Right," Psychometrika, Springer;The Psychometric Society, vol. 81(1), pages 1-15, March.
    13. S. P. J. M. Horbach & W. Halffman, 2019. "The ability of different peer review procedures to flag problematic publications," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(1), pages 339-373, January.
    14. Salandra, Rossella, 2018. "Knowledge dissemination in clinical trials: Exploring influences of institutional support and type of innovation on selective reporting," Research Policy, Elsevier, vol. 47(7), pages 1215-1228.
    15. Koessler, Ann-Kathrin & Page, Lionel & Dulleck, Uwe, 2018. "Public Statements of Good Conduct Promote Pro-Social Behavior," EconStor Preprints 180669, ZBW - Leibniz Information Centre for Economics.
    16. Mads P. Sørensen & Tine Ravn & Ana Marušić & Andrea Reyes Elizondo & Panagiotis Kavouras & Joeri K. Tijdink & Anna-Kathrine Bendtsen, 2021. "Strengthening research integrity: which topic areas should organisations focus on?," Palgrave Communications, Palgrave Macmillan, vol. 8(1), pages 1-15, December.
    17. Aguinis, Herman & Banks, George C. & Rogelberg, Steven G. & Cascio, Wayne F., 2020. "Actionable recommendations for narrowing the science-practice gap in open science," Organizational Behavior and Human Decision Processes, Elsevier, vol. 158(C), pages 27-35.
    18. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
19. Jyotirmoy Sarkar, 2018. "Will P-Value Triumph over Abuses and Attacks?," Biostatistics and Biometrics Open Access Journal, Juniper Publishers Inc., vol. 7(4), pages 66-71, July.
    20. Moustafa, Khaled, 2018. "Don't fall in common science pitfall!," FrenXiv ycjha, Center for Open Science.



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.