IDEAS home Printed from https://ideas.repec.org/p/osf/metaar/s4b65.html

Replication success under questionable research practices – a simulation study

Authors

Listed:
  • Freuli, Francesca
  • Held, Leonhard
  • Heyard, Rachel

Abstract

Increasing evidence suggests that the reproducibility and replicability of scientific findings are threatened by researchers employing questionable research practices (QRPs) in order to achieve publishable, positive, and significant results. Numerous metrics have been developed to determine replication success, but it has not yet been established how well those metrics perform in the presence of QRPs. This paper compares the performance of different metrics quantifying replication success in the presence of four types of QRPs: cherry picking, questionable interim analyses, questionable inclusion of covariates, and questionable subgroup analyses. Our results show that the metric based on the golden sceptical p-value performs best at maintaining a low overall type-I error rate, but often requires larger replication sample sizes, especially when severe QRPs are employed.
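The cherry-picking mechanism studied in the abstract can be illustrated with a minimal null simulation. This sketch is not the paper's actual simulation design: it simply runs several independent null studies, reports only the smallest p-value (cherry picking across outcomes), and compares the resulting false-positive rate with that of an honest single study. All function names and parameter values here are illustrative assumptions.

```python
import math
import random

def z_test_p_value(sample, mu0=0.0, sigma=1.0):
    """Two-sided z-test p-value for H0: mean == mu0, with known sigma."""
    n = len(sample)
    z = (sum(sample) / n - mu0) / (sigma / math.sqrt(n))
    # Two-sided p-value from the standard normal CDF, via the error function.
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def cherry_picked_p(n_outcomes=5, n=30, rng=random):
    """Run n_outcomes independent null studies; report only the smallest p."""
    return min(
        z_test_p_value([rng.gauss(0, 1) for _ in range(n)])
        for _ in range(n_outcomes)
    )

random.seed(1)
n_sim = 2000
# Honest practice: one study per simulation, all data generated under the null.
honest = sum(z_test_p_value([random.gauss(0, 1) for _ in range(30)]) < 0.05
             for _ in range(n_sim)) / n_sim
# Cherry picking: best of five null studies per simulation.
picked = sum(cherry_picked_p() < 0.05 for _ in range(n_sim)) / n_sim
print(f"honest type-I error: {honest:.3f}")   # near the nominal 0.05
print(f"cherry-picked:       {picked:.3f}")   # inflated, near 1 - 0.95**5
```

Under the null, p-values are uniform, so the minimum of five independent p-values falls below 0.05 with probability 1 − 0.95⁵ ≈ 0.23, which is the inflation the simulation reproduces.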

Suggested Citation

  • Freuli, Francesca & Held, Leonhard & Heyard, Rachel, 2022. "Replication success under questionable research practices – a simulation study," MetaArXiv s4b65, Center for Open Science.
  • Handle: RePEc:osf:metaar:s4b65
    DOI: 10.31219/osf.io/s4b65

    Download full text from publisher

    File URL: https://osf.io/download/62ea3510136b5703f24545c3/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/s4b65?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
    ---><---

    References listed on IDEAS

    1. Leonhard Held, 2020. "A new standard for the analysis and design of replication studies," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 183(2), pages 431-448, February.
    2. Wicherts, Jelte M. & Veldkamp, Coosje Lisabet Sterre & Augusteijn, Hilde & Bakker, Marjan & van Aert, Robbie Cornelis Maria & van Assen, Marcel A. L. M., 2016. "Degrees of freedom in planning, running, analyzing, and reporting psychological studies: A checklist to avoid p-hacking," OSF Preprints umq8d, Center for Open Science.
    3. Megan L Head & Luke Holman & Rob Lanfear & Andrew T Kahn & Michael D Jennions, 2015. "The Extent and Consequences of P-Hacking in Science," PLOS Biology, Public Library of Science, vol. 13(3), pages 1-15, March.
    4. Erik W. van Zwet & Eric A. Cator, 2021. "The significance filter, the winner's curse and the need to shrink," Statistica Neerlandica, Netherlands Society for Statistics and Operations Research, vol. 75(4), pages 437-452, November.
    5. Dorothy Bishop, 2019. "Rein in the four horsemen of irreproducibility," Nature, Nature, vol. 568(7753), pages 435-435, April.
    6. Gowri Gopalakrishna & Gerben ter Riet & Gerko Vink & Ineke Stoop & Jelte M Wicherts & Lex M Bouter, 2022. "Prevalence of questionable research practices, research misconduct and their potential explanatory factors: A survey among academic researchers in The Netherlands," PLOS ONE, Public Library of Science, vol. 17(2), pages 1-16, February.
    7. Anne-Laure Boulesteix & Sabine Lauer & Manuel J A Eugster, 2013. "A Plea for Neutral Comparison Studies in Computational Sciences," PLOS ONE, Public Library of Science, vol. 8(4), pages 1-11, April.
    8. Wanja Wolff & Lorena Baumann & Chris Englert, 2018. "Self-reports from behind the scenes: Questionable research practices and rates of replication in ego depletion research," PLOS ONE, Public Library of Science, vol. 13(6), pages 1-11, June.
    9. Larry V. Hedges & Jacob M. Schauer, 2019. "More Than One Replication Study Is Needed for Unambiguous Tests of Replication," Journal of Educational and Behavioral Statistics, , vol. 44(5), pages 543-570, October.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Freuli, Francesca & Held, Leonhard & Heyard, Rachel, 2022. "Replication Success under Questionable Research Practices - A Simulation Study," I4R Discussion Paper Series 2, The Institute for Replication (I4R).
    2. Jasper Brinkerink, 2023. "When Shooting for the Stars Becomes Aiming for Asterisks: P-Hacking in Family Business Research," Entrepreneurship Theory and Practice, , vol. 47(2), pages 304-343, March.
    3. Felix Holzmeister & Magnus Johannesson & Robert Böhm & Anna Dreber & Jürgen Huber & Michael Kirchler, 2023. "Heterogeneity in effect size estimates: Empirical evidence and practical implications," Working Papers 2023-17, Faculty of Economics and Statistics, Universität Innsbruck.
    4. Cantone, Giulio Giacomo, 2023. "The multiversal methodology as a remedy of the replication crisis," MetaArXiv kuhmz, Center for Open Science.
    5. Craig, Russell & Cox, Adam & Tourish, Dennis & Thorpe, Alistair, 2020. "Using retracted journal articles in psychology to understand research misconduct in the social sciences: What is to be done?," Research Policy, Elsevier, vol. 49(4).
    6. Samuel Pawel & Leonhard Held, 2022. "The sceptical Bayes factor for the assessment of replication success," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 84(3), pages 879-911, July.
    7. Arnaud Vaganay, 2016. "Cluster Sampling Bias in Government-Sponsored Evaluations: A Correlational Study of Employment and Welfare Pilots in England," PLOS ONE, Public Library of Science, vol. 11(8), pages 1-21, August.
    8. David Winkelmann & Marius Ötting & Christian Deutscher & Tomasz Makarewicz, 2024. "Are Betting Markets Inefficient? Evidence From Simulations and Real Data," Journal of Sports Economics, , vol. 25(1), pages 54-97, January.
    9. Graham Elliott & Nikolay Kudrin & Kaspar Wüthrich, 2022. "Detecting p‐Hacking," Econometrica, Econometric Society, vol. 90(2), pages 887-906, March.
    10. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell us about p-Hacking and Publication Bias in Online Experiments," GLO Discussion Paper Series 1157, Global Labor Organization (GLO).
    11. Graham Elliott & Nikolay Kudrin & Kaspar Wuthrich, 2022. "The Power of Tests for Detecting p-Hacking," Papers 2205.07950, arXiv.org, revised Apr 2024.
    12. Martin E Héroux & Janet L Taylor & Simon C Gandevia, 2015. "The Use and Abuse of Transcranial Magnetic Stimulation to Modulate Corticospinal Excitability in Humans," PLOS ONE, Public Library of Science, vol. 10(12), pages 1-10, December.
    13. Pierre J C Chuard & Milan Vrtílek & Megan L Head & Michael D Jennions, 2019. "Evidence that nonsignificant results are sometimes preferred: Reverse P-hacking or selective reporting?," PLOS Biology, Public Library of Science, vol. 17(1), pages 1-7, January.
    14. Dario Krpan & Jonathan E. Booth & Andreea Damien, 2023. "The positive–negative–competence (PNC) model of psychological responses to representations of robots," Nature Human Behaviour, Nature, vol. 7(11), pages 1933-1954, November.
    15. Tracey L Weissgerber, 2021. "Learning from the past to develop data analysis curricula for the future," PLOS Biology, Public Library of Science, vol. 19(7), pages 1-3, July.
    16. Muradchanian, Jasmine & Hoekstra, Rink & Kiers, Henk & van Ravenzwaaij, Don, 2020. "How Best to Quantify Replication Success? A Simulation Study on the Comparison of Replication Success Metrics," MetaArXiv wvdjf, Center for Open Science.
    17. van Aert, Robbie Cornelis Maria & van Assen, Marcel A. L. M., 2018. "P-uniform," MetaArXiv zqjr9, Center for Open Science.
    18. Nicky Agate & Rebecca Kennison & Stacy Konkiel & Christopher P. Long & Jason Rhody & Simone Sacchi & Penelope Weber, 2020. "The transformative power of values-enacted scholarship," Palgrave Communications, Palgrave Macmillan, vol. 7(1), pages 1-12, December.
    19. Feuz, Ryan, 2023. "Hedonic Price Analysis of Used Tractors," Applied Economics Teaching Resources (AETR), Agricultural and Applied Economics Association, vol. 5(1), January.
    20. Sarstedt, Marko & Adler, Susanne J., 2023. "An advanced method to streamline p-hacking," Journal of Business Research, Elsevier, vol. 163(C).

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:metaar:s4b65. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF (email available below). General contact details of provider: https://osf.io/preprints/metaarxiv .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.