
Questionable research practices in ecology and evolution

Authors

Listed:
  • Hannah Fraser
  • Tim Parker
  • Shinichi Nakagawa
  • Ashley Barnett
  • Fiona Fidler

Abstract

We surveyed 807 researchers (494 ecologists and 313 evolutionary biologists) about their use of Questionable Research Practices (QRPs), including cherry picking statistically significant results, p hacking, and hypothesising after the results are known (HARKing). We also asked them to estimate the proportion of their colleagues that use each of these QRPs. Several of the QRPs were prevalent within the ecology and evolution research community. Across the two groups, we found 64% of surveyed researchers reported they had at least once failed to report results because they were not statistically significant (cherry picking); 42% had collected more data after inspecting whether results were statistically significant (a form of p hacking) and 51% had reported an unexpected finding as though it had been hypothesised from the start (HARKing). Such practices have been directly implicated in the low rates of reproducible results uncovered by recent large scale replication studies in psychology and other disciplines. The rates of QRPs found in this study are comparable with the rates seen in psychology, indicating that the reproducibility problems discovered in psychology are also likely to be present in ecology and evolution.
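
The form of p hacking reported by 42% of respondents, collecting more data after inspecting whether results are statistically significant, is often called optional stopping. The following Python sketch (not from the paper; the sample sizes, significance threshold, and two-look design are illustrative assumptions) simulates that practice under a true null effect, where every "significant" result is a false positive, and shows the false-positive rate rising above the nominal 5%:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)

    def optional_stopping_trial(n_initial=20, n_extra=20, alpha=0.05):
        # Two groups drawn from the same distribution (true null effect),
        # so any "significant" result is a false positive.
        a = rng.normal(size=n_initial + n_extra)
        b = rng.normal(size=n_initial + n_extra)
        # First look: test after the initial sample.
        _, p_first = stats.ttest_ind(a[:n_initial], b[:n_initial])
        if p_first < alpha:
            return True
        # Second look: collect more data only because the first test was
        # not significant, then test again -- the practice in question.
        _, p_second = stats.ttest_ind(a, b)
        return p_second < alpha

    n_trials = 10_000
    fp_rate = sum(optional_stopping_trial() for _ in range(n_trials)) / n_trials
    # A single fixed-sample test would come out close to alpha (0.05); the
    # data-dependent second look typically pushes this to roughly 0.07-0.08.
    print(f"False-positive rate with optional stopping: {fp_rate:.3f}")

Even a single data-dependent second look inflates the error rate above the nominal level; repeated peeking inflates it further, which is why such practices are linked to the replication failures described in the abstract.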

Suggested Citation

  • Hannah Fraser & Tim Parker & Shinichi Nakagawa & Ashley Barnett & Fiona Fidler, 2018. "Questionable research practices in ecology and evolution," PLOS ONE, Public Library of Science, vol. 13(7), pages 1-16, July.
  • Handle: RePEc:plo:pone00:0200303
    DOI: 10.1371/journal.pone.0200303

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0200303
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0200303&type=printable
    Download Restriction: no


    References listed on IDEAS

    1. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    2. Dominique G Roche & Loeske E. B. Kruuk, 2015. "Public Data Archiving in Ecology and Evolution: How Well Are We Doing?," Working Papers id:7811, eSocialSciences.
    3. Mellor, David Thomas, 2017. "Preregistration and increased transparency will benefit science," OSF Preprints xsfam, Center for Open Science.
    4. Dominique G Roche & Loeske E B Kruuk & Robert Lanfear & Sandra A Binning, 2015. "Public Data Archiving in Ecology and Evolution: How Well Are We Doing?," PLOS Biology, Public Library of Science, vol. 13(11), pages 1-12, November.
    5. Malika Ihle & Isabel S. Winney & Anna Krystalli & Michael Croucher, 2017. "Striving for transparent and credible research: practical guidelines for behavioral ecologists," Behavioral Ecology, International Society for Behavioral Ecology, vol. 28(2), pages 348-354.
    6. Shinichi Nakagawa, 2004. "A farewell to Bonferroni: the problems of low statistical power and publication bias," Behavioral Ecology, International Society for Behavioral Ecology, vol. 15(6), pages 1044-1045, November.
    7. Andreas Schwab & Eric Abrahamson & William H. Starbuck & Fiona Fidler, 2011. "PERSPECTIVE---Researchers Should Make Thoughtful Assessments Instead of Null-Hypothesis Significance Tests," Organization Science, INFORMS, vol. 22(4), pages 1105-1120, August.
    8. Leonard P Freedman & Iain M Cockburn & Timothy S Simcoe, 2015. "The Economics of Reproducibility in Preclinical Research," PLOS Biology, Public Library of Science, vol. 13(6), pages 1-9, June.
    9. Daniele Fanelli, 2012. "Negative results are disappearing from most disciplines and countries," Scientometrics, Springer;Akadémiai Kiadó, vol. 90(3), pages 891-904, March.
    10. Nosek, Brian A. & Ebersole, Charles R. & DeHaven, Alexander Carl & Mellor, David Thomas, 2018. "The Preregistration Revolution," OSF Preprints 2dxu5, Center for Open Science.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Shinichi Nakagawa & Edward R. Ivimey-Cook & Matthew J. Grainger & Rose E. O’Dea & Samantha Burke & Szymon M. Drobniak & Elliot Gould & Erin L. Macartney & April Robin Martinig & Kyle Morrison & Matthi, 2023. "Method Reporting with Initials for Transparency (MeRIT) promotes more granularity and accountability for author contributions," Nature Communications, Nature, vol. 14(1), pages 1-5, December.
    2. Nathalie Percie du Sert & Viki Hurst & Amrita Ahluwalia & Sabina Alam & Marc T Avey & Monya Baker & William J Browne & Alejandra Clark & Innes C Cuthill & Ulrich Dirnagl & Michael Emerson & Paul Garne, 2020. "The ARRIVE guidelines 2.0: Updated guidelines for reporting animal research," PLOS Biology, Public Library of Science, vol. 18(7), pages 1-12, July.
    3. Tierney, Warren & Hardy, Jay H. & Ebersole, Charles R. & Leavitt, Keith & Viganola, Domenico & Clemente, Elena Giulia & Gordon, Michael & Dreber, Anna & Johannesson, Magnus & Pfeiffer, Thomas & Uhlman, 2020. "Creative destruction in science," Organizational Behavior and Human Decision Processes, Elsevier, vol. 161(C), pages 291-309.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick , 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    2. Malika Ihle & Isabel S. Winney & Anna Krystalli & Michael Croucher, 2017. "Striving for transparent and credible research: practical guidelines for behavioral ecologists," Behavioral Ecology, International Society for Behavioral Ecology, vol. 28(2), pages 348-354.
    3. Karin Langenkamp & Bodo Rödel & Kerstin Taufenbach & Meike Weiland, 2018. "Open Access in Vocational Education and Training Research," Publications, MDPI, vol. 6(3), pages 1-12, July.
    4. Josip Strcic & Antonia Civljak & Terezija Glozinic & Rafael Leite Pacheco & Tonci Brkovic & Livia Puljak, 2022. "Open data and data sharing in articles about COVID-19 published in preprint servers medRxiv and bioRxiv," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2791-2802, May.
    5. Brian Fabo & Martina Jancokova & Elisabeth Kempf & Lubos Pastor, 2020. "Fifty Shades of QE: Conflicts of Interest in Economic Research," Working Papers 2020-128, Becker Friedman Institute for Research In Economics.
    6. Bettina Bert & Céline Heinl & Justyna Chmielewska & Franziska Schwarz & Barbara Grune & Andreas Hensel & Matthias Greiner & Gilbert Schönfelder, 2019. "Refining animal research: The Animal Study Registry," PLOS Biology, Public Library of Science, vol. 17(10), pages 1-12, October.
    7. Christian Heise & Joshua M. Pearce, 2020. "From Open Access to Open Science: The Path From Scientific Reality to Open Scientific Communication," SAGE Open, , vol. 10(2), pages 21582440209, May.
    8. Vivian Leung & Frédérik Rousseau-Blass & Guy Beauchamp & Daniel S J Pang, 2018. "ARRIVE has not ARRIVEd: Support for the ARRIVE (Animal Research: Reporting of in vivo Experiments) guidelines does not improve the reporting quality of papers in animal welfare, analgesia or anesthesi," PLOS ONE, Public Library of Science, vol. 13(5), pages 1-13, May.
    9. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    10. Oliver Braganza, 2020. "A simple model suggesting economically rational sample-size choice drives irreproducibility," PLOS ONE, Public Library of Science, vol. 15(3), pages 1-19, March.
    11. Mueller-Langer, Frank & Fecher, Benedikt & Harhoff, Dietmar & Wagner, Gert G., 2019. "Replication studies in economics—How many and which papers are chosen for replication, and why?," Research Policy, Elsevier, vol. 48(1), pages 62-83.
    12. Pierre J C Chuard & Milan Vrtílek & Megan L Head & Michael D Jennions, 2019. "Evidence that nonsignificant results are sometimes preferred: Reverse P-hacking or selective reporting?," PLOS Biology, Public Library of Science, vol. 17(1), pages 1-7, January.
    13. Bernhard Voelkl & Lucile Vogt & Emily S Sena & Hanno Würbel, 2018. "Reproducibility of preclinical animal research improves with heterogeneity of study samples," PLOS Biology, Public Library of Science, vol. 16(2), pages 1-13, February.
    14. Muradchanian, Jasmine & Hoekstra, Rink & Kiers, Henk & van Ravenzwaaij, Don, 2020. "How Best to Quantify Replication Success? A Simulation Study on the Comparison of Replication Success Metrics," MetaArXiv wvdjf, Center for Open Science.
    15. Joshua D. Carrell & Edward Hammill & Thomas C. Edwards, 2022. "Balancing Rare Species Conservation with Extractive Industries," Land, MDPI, vol. 11(11), pages 1-16, November.
    16. Megan L Head & Luke Holman & Rob Lanfear & Andrew T Kahn & Michael D Jennions, 2015. "The Extent and Consequences of P-Hacking in Science," PLOS Biology, Public Library of Science, vol. 13(3), pages 1-15, March.
    17. Christopher Allen & David M A Mehler, 2019. "Open science challenges, benefits and tips in early career and beyond," PLOS Biology, Public Library of Science, vol. 17(5), pages 1-14, May.
    18. Jeff Miller & Rolf Ulrich, 2019. "The quest for an optimal alpha," PLOS ONE, Public Library of Science, vol. 14(1), pages 1-13, January.
    19. Mark D Lindner & Richard K Nakamura, 2015. "Examining the Predictive Validity of NIH Peer Review Scores," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-12, June.
    20. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
