IDEAS — Printed from https://ideas.repec.org/a/plo/pone00/0299617.html

Common misconceptions held by health researchers when interpreting linear regression assumptions, a cross-sectional study

Author

Listed:
  • Lee Jones
  • Adrian Barnett
  • Dimitrios Vagenas

Abstract

Background: Statistical models are valuable tools for interpreting complex relationships within health systems. These models rely on a framework of statistical assumptions that, when correctly addressed, enable valid inferences and conclusions. However, failure to appropriately address these assumptions can lead to flawed analyses, misleading conclusions, and the adoption of ineffective or harmful treatments, ultimately worsening health outcomes. This study examines researchers’ understanding of the widely used linear regression model, focusing on its assumptions, common misconceptions, and recommendations for improving research practice. Methods: One hundred papers from the health and biomedical field that used linear regression in their materials and methods sections were randomly sampled from the journal PLOS ONE for 2019. Two independent volunteer statisticians rated each paper on its reporting of linear regression assumptions. The prevalence of assumptions reported by authors was described using frequencies, percentages, and 95% confidence intervals, and agreement between the statistical raters was assessed using Gwet’s statistic. Results: Of the 95 papers that met the inclusion and exclusion criteria, only 37% reported checking any linear regression assumption, 22% reported checking one assumption, and no authors checked all assumptions. The biggest misconception was that the Y variable should be checked for normality: only 5 of the 28 papers correctly checked the residuals for normality. Conclusion: The reporting of linear regression assumptions is alarmingly low, and when assumptions are checked, the checks are often inadequate or performed incorrectly. Addressing these issues requires a cultural shift in research practices, including improved statistical training, more rigorous journal review processes, and a broader understanding of regression as a unifying framework.
Greater emphasis must be placed on evaluating model assumptions and their implications rather than the rote application of statistical methods. Careful consideration of assumptions helps improve the reliability of statistical conclusions, reducing the risk of misleading findings influencing clinical practice and potentially affecting patient outcomes.
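The misconception highlighted in the abstract — testing the raw Y variable for normality instead of the model residuals — can be illustrated with a short sketch. This is not code from the paper; it is a minimal example on simulated data, using `numpy.polyfit` for the least-squares fit and SciPy's Shapiro–Wilk test, showing that the normality assumption applies to the residuals, which are what the linear model assumes to be normally distributed.

```python
# Illustrative sketch (not from the paper): the normality assumption in
# linear regression concerns the residuals, not the raw outcome variable.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 200)
y = 2.0 + 0.5 * x + rng.normal(0.0, 1.0, 200)  # simulated outcome

# Least-squares fit of y on x (slope and intercept).
slope, intercept = np.polyfit(x, y, 1)
residuals = y - (intercept + slope * x)

# Common misconception: testing the raw y, which varies with x and
# need not be normal even when the model is correctly specified.
stat_y, p_y = stats.shapiro(y)

# Correct check: test the residuals left after removing the fitted trend.
stat_r, p_r = stats.shapiro(residuals)

print(f"Shapiro-Wilk p-value on raw y:      {p_y:.3f}")
print(f"Shapiro-Wilk p-value on residuals:  {p_r:.3f}")
```

In practice, a residuals-versus-fitted plot and a Q–Q plot of the residuals are usually more informative than a formal test, since with large samples even trivial departures from normality yield small p-values.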

Suggested Citation

  • Lee Jones & Adrian Barnett & Dimitrios Vagenas, 2025. "Common misconceptions held by health researchers when interpreting linear regression assumptions, a cross-sectional study," PLOS ONE, Public Library of Science, vol. 20(6), pages 1-28, June.
  • Handle: RePEc:plo:pone00:0299617
    DOI: 10.1371/journal.pone.0299617

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0299617
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0299617&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0299617?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Bruns, Stephan B. & Asanov, Igor & Bode, Rasmus & Dunger, Melanie & Funk, Christoph & Hassan, Sherif M. & Hauschildt, Julia & Heinisch, Dominik & Kempa, Karol & König, Johannes & Lips, Johannes & Verb, 2019. "Reporting errors and biases in published empirical findings: Evidence from innovation research," Research Policy, Elsevier, vol. 48(9), pages 1-1.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Julia Jerke & Antonia Velicu & Fabian Winter & Heiko Rauhut, 2025. "Publication bias in the social sciences since 1959: Application of a regression discontinuity framework," PLOS ONE, Public Library of Science, vol. 20(2), pages 1-27, February.
    2. Graham Elliott & Nikolay Kudrin & Kaspar Wüthrich, 2022. "Detecting p‐Hacking," Econometrica, Econometric Society, vol. 90(2), pages 887-906, March.
    3. Abel Brodeur & Scott Carrell & David Figlio & Lester Lusher, 2023. "Unpacking P-hacking and Publication Bias," American Economic Review, American Economic Association, vol. 113(11), pages 2974-3002, November.
    4. Dominika Ehrenbergerova & Josef Bajzik & Tomas Havranek, 2023. "When Does Monetary Policy Sway House Prices? A Meta-Analysis," IMF Economic Review, Palgrave Macmillan;International Monetary Fund, vol. 71(2), pages 538-573, June.
    5. Graham Elliott & Nikolay Kudrin & Kaspar Wuthrich, 2022. "The Power of Tests for Detecting $p$-Hacking," Papers 2205.07950, arXiv.org, revised Apr 2024.
    6. Dominika Ehrenbergerova & Josef Bajzik, 2020. "The Effect of Monetary Policy on House Prices - How Strong is the Transmission?," Working Papers 2020/14, Czech National Bank, Research and Statistics Department.
    7. Bajzík, Josef & Havranek, Tomas & Irsova, Zuzana & Novak, Jiri, 2023. "Does Shareholder Activism Create Value? A Meta-Analysis," CEPR Discussion Papers 18233, C.E.P.R. Discussion Papers.
    8. Cefis, Elena & Coad, Alex & Lucini-Paioni, Alessandro, 2023. "Landmarks as lighthouses: firms' innovation and modes of exit during the business cycle," Research Policy, Elsevier, vol. 52(8).
    9. Abel Brodeur & Nikolai Cook & Carina Neisser, 2024. "p-Hacking, Data type and Data-Sharing Policy," The Economic Journal, Royal Economic Society, vol. 134(659), pages 985-1018.
    10. Buehling, Kilian, 2021. "Changing research topic trends as an effect of publication rankings – The case of German economists and the Handelsblatt Ranking," Journal of Informetrics, Elsevier, vol. 15(3).
    11. Salandra, Rossella & Criscuolo, Paola & Salter, Ammon, 2021. "Directing scientists away from potentially biased publications: the role of systematic reviews in health care," Research Policy, Elsevier, vol. 50(1).
    12. Igor Asanov & Christoph Buehren & Panagiota Zacharodimou, 2020. "The power of experiments: How big is your n?," MAGKS Papers on Economics 202032, Philipps-Universität Marburg, Faculty of Business Administration and Economics, Department of Economics (Volkswirtschaftliche Abteilung).
    13. repec:osf:metaar:mbx62_v1 is not listed on IDEAS
    14. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    15. Bruns, Stephan & Herwartz, Helmut & Ioannidis, John P.A. & Islam, Chris-Gabriel & Raters, Fabian H. C., 2023. "Statistical reporting errors in economics," MetaArXiv mbx62, Center for Open Science.
    16. Bruns, Stephan B. & Ioannidis, John P.A., 2020. "Determinants of economic growth: Different time different answer?," Journal of Macroeconomics, Elsevier, vol. 63(C).
    17. Ebersberger, Bernd & Galia, Fabrice & Laursen, Keld & Salter, Ammon, 2021. "Inbound Open Innovation and Innovation Performance: A Robustness Study," Research Policy, Elsevier, vol. 50(7).
    18. Simona Malovaná & Martin Hodula & Zuzana Gric & Josef Bajzík, 2025. "Borrower‐based macroprudential measures and credit growth: How biased is the existing literature?," Journal of Economic Surveys, Wiley Blackwell, vol. 39(1), pages 66-102, February.
    19. Bruns, Stephan B. & Kalthaus, Martin, 2020. "Flexibility in the selection of patent counts: Implications for p-hacking and evidence-based policymaking," Research Policy, Elsevier, vol. 49(1).
    20. Doucouliagos, Hristos & Hinz, Thomas & Zigova, Katarina, 2022. "Bias and careers: Evidence from the aid effectiveness literature," European Journal of Political Economy, Elsevier, vol. 71(C).
    21. Bajzik, Josef, 2021. "Trading volume and stock returns: A meta-analysis," International Review of Financial Analysis, Elsevier, vol. 78(C).


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.