Printed from https://ideas.repec.org/a/plo/pone00/0144151.html

The Use and Abuse of Transcranial Magnetic Stimulation to Modulate Corticospinal Excitability in Humans

Authors
  • Martin E Héroux
  • Janet L Taylor
  • Simon C Gandevia

Abstract

The magnitude and direction of reported physiological effects induced using transcranial magnetic stimulation (TMS) to modulate human motor cortical excitability have proven difficult to replicate routinely. We conducted an online survey on the prevalence and possible causes of these reproducibility issues. A total of 153 researchers were identified via their publications and invited to complete an anonymous internet-based survey about their experience trying to reproduce published findings for various TMS protocols. The prevalence of questionable research practices known to contribute to low reproducibility was also determined. We received 47 completed surveys from researchers with an average of 16.4 published papers (95% CI 10.8–22.0) that used TMS to modulate motor cortical excitability. Respondents also had a mean of 4.0 (2.5–5.7) relevant completed studies that would never be published. Across a range of TMS protocols, 45–60% of respondents found results similar to those in the original publications; the remaining respondents were able to reproduce the original effects only sometimes or not at all. Only 20% of respondents used formal power calculations to determine study sample sizes; the others relied on previously published studies (25%), personal experience (24%) or flexible post-hoc criteria (41%). Approximately 44% of respondents knew researchers who engaged in questionable research practices (range 32–70%), yet only 18% admitted to engaging in them themselves (range 6–38%). These practices included screening subjects to find those who respond in a desired way to a TMS protocol, selectively reporting results, and rejecting data based on a gut feeling. In a sample of 56 published papers that were inspected, not a single questionable research practice was reported. Our survey revealed that approximately 50% of researchers are unable to reproduce published TMS effects. Researchers need to increase study sample sizes and eliminate, or at least report, questionable research practices in order to make the outcomes of TMS research reproducible.
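The abstract notes that only 20% of respondents used formal power calculations to determine sample size. As an illustration only (not from the paper), a minimal a-priori sample-size calculation for a one-sample or paired design can be sketched using the standard normal approximation, n = ((z_{1-α/2} + z_{1-β}) / d)², where d is the standardized effect size. The function name is hypothetical, and the normal approximation slightly underestimates n relative to an exact t-test-based calculation:

```python
from math import ceil
from statistics import NormalDist

def sample_size_one_sample(effect_size, alpha=0.05, power=0.80):
    """A-priori sample size for a one-sample (or paired) test,
    normal approximation: n = ((z_{1-alpha/2} + z_{1-power}) / d) ** 2.

    Slightly underestimates n compared with an exact t-test calculation.
    """
    z = NormalDist()
    n = ((z.inv_cdf(1 - alpha / 2) + z.inv_cdf(power)) / effect_size) ** 2
    return ceil(n)

# A medium effect (Cohen's d = 0.5), two-sided alpha of 0.05, 80% power:
print(sample_size_one_sample(0.5))  # -> 32
```

Dedicated tools (e.g. G*Power, or `statsmodels.stats.power` in Python) apply the exact noncentral-t calculation and give slightly larger n; the point of the sketch is that the required sample size follows mechanically from the smallest effect one cares to detect, rather than from post-hoc criteria.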

Suggested Citation

  • Martin E Héroux & Janet L Taylor & Simon C Gandevia, 2015. "The Use and Abuse of Transcranial Magnetic Stimulation to Modulate Corticospinal Excitability in Humans," PLOS ONE, Public Library of Science, vol. 10(12), pages 1-10, December.
  • Handle: RePEc:plo:pone00:0144151
    DOI: 10.1371/journal.pone.0144151

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0144151
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0144151&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0144151?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
    ---><---


    Citations


    Cited by:

    1. Wanja Wolff & Lorena Baumann & Chris Englert, 2018. "Self-reports from behind the scenes: Questionable research practices and rates of replication in ego depletion research," PLOS ONE, Public Library of Science, vol. 13(6), pages 1-11, June.
    2. Martin E Héroux & Colleen K Loo & Janet L Taylor & Simon C Gandevia, 2017. "Questionable science and reproducibility in electrical brain stimulation research," PLOS ONE, Public Library of Science, vol. 12(4), pages 1-11, April.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Jyotirmoy Sarkar, 2018. "Will P-Value Triumph over Abuses and Attacks?," Biostatistics and Biometrics Open Access Journal, Juniper Publishers Inc., vol. 7(4), pages 66-71, July.
    2. Stephan B Bruns & John P A Ioannidis, 2016. "p-Curve and p-Hacking in Observational Research," PLOS ONE, Public Library of Science, vol. 11(2), pages 1-13, February.
    3. Pierre J C Chuard & Milan Vrtílek & Megan L Head & Michael D Jennions, 2019. "Evidence that nonsignificant results are sometimes preferred: Reverse P-hacking or selective reporting?," PLOS Biology, Public Library of Science, vol. 17(1), pages 1-7, January.
    4. Megan L Head & Luke Holman & Rob Lanfear & Andrew T Kahn & Michael D Jennions, 2015. "The Extent and Consequences of P-Hacking in Science," PLOS Biology, Public Library of Science, vol. 13(3), pages 1-15, March.
    5. Louis Anthony (Tony) Cox, 2015. "Overcoming Learning Aversion in Evaluating and Managing Uncertain Risks," Risk Analysis, John Wiley & Sons, vol. 35(10), pages 1892-1910, October.
    6. Lynch, John G. & Bradlow, Eric T. & Huber, Joel C. & Lehmann, Donald R., 2015. "Reflections on the replication corner: In praise of conceptual replications," International Journal of Research in Marketing, Elsevier, vol. 32(4), pages 333-342.
    7. Lars Ole Schwen & Sabrina Rueschenbaum, 2018. "Ten quick tips for getting the most scientific value out of numerical data," PLOS Computational Biology, Public Library of Science, vol. 14(10), pages 1-21, October.
    8. Mangirdas Morkunas & Elzė Rudienė & Lukas Giriūnas & Laura Daučiūnienė, 2020. "Assessment of Factors Causing Bias in Marketing-Related Publications," Publications, MDPI, vol. 8(4), pages 1-16, October.
    9. Isabelle Bartram & Jonathan M Jeschke, 2019. "Do cancer stem cells exist? A pilot study combining a systematic review with the hierarchy-of-hypotheses approach," PLOS ONE, Public Library of Science, vol. 14(12), pages 1-12, December.
    10. Adriano Koshiyama & Nick Firoozye, 2019. "Avoiding Backtesting Overfitting by Covariance-Penalties: an empirical investigation of the ordinary and total least squares cases," Papers 1905.05023, arXiv.org.
    11. Maren Duvendack & Richard Palmer-Jones, 2013. "Replication of quantitative work in development studies: Experiences and suggestions," Progress in Development Studies, , vol. 13(4), pages 307-322, October.
    12. Bialek, Michal & Misiak, Michał & Dziekan, Martyna, 2021. "Why is the statistical revolution not progressing? Vicious cycle of the scientific reform," OSF Preprints gmfs9, Center for Open Science.
    13. Salandra, Rossella & Criscuolo, Paola & Salter, Ammon, 2021. "Directing scientists away from potentially biased publications: the role of systematic reviews in health care," Research Policy, Elsevier, vol. 50(1).
    14. Eleni Verykouki & Christos T. Nakas, 2023. "Adaptations on the Use of p-Values for Statistical Inference: An Interpretation of Messages from Recent Public Discussions," Stats, MDPI, vol. 6(2), pages 1-13, April.
    15. Jeremy Arkes, 2020. "Teaching Graduate (and Undergraduate) Econometrics: Some Sensible Shifts to Improve Efficiency, Effectiveness, and Usefulness," Econometrics, MDPI, vol. 8(3), pages 1-23, September.
    16. Piotr Bialowolski & Dorota Weziak-Bialowolska & Eileen McNeely, 2021. "The Role of Financial Fragility and Financial Control for Well-Being," Social Indicators Research: An International and Interdisciplinary Journal for Quality-of-Life Measurement, Springer, vol. 155(3), pages 1137-1157, June.
    17. Tracey L Weissgerber & Vesna D Garovic & Jelena S Milin-Lazovic & Stacey J Winham & Zoran Obradovic & Jerome P Trzeciakowski & Natasa M Milic, 2016. "Reinventing Biostatistics Education for Basic Scientists," PLOS Biology, Public Library of Science, vol. 14(4), pages 1-12, April.
    18. Klaus E Meyer & Arjen Witteloostuijn & Sjoerd Beugelsdijk, 2017. "What’s in a p? Reassessing best practices for conducting and reporting hypothesis-testing research," Journal of International Business Studies, Palgrave Macmillan;Academy of International Business, vol. 48(5), pages 535-551, July.
    19. David G Jenkins & Pedro F Quintana-Ascencio, 2020. "A solution to minimum sample size for regressions," PLOS ONE, Public Library of Science, vol. 15(2), pages 1-15, February.
    20. Stephen A. Gorman & Frank J. Fabozzi, 2021. "The ABC’s of the alternative risk premium: academic roots," Journal of Asset Management, Palgrave Macmillan, vol. 22(6), pages 405-436, October.



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.