Printed from https://ideas.repec.org/a/spr/psycho/v81y2016i1p16-26.html

Thinking About Data, Research Methods, and Statistical Analyses: Commentary on Sijtsma’s (2014) “Playing with Data”

Author

Listed:
  • Irwin Waldman
  • Scott Lilienfeld

Abstract

We comment on Sijtsma’s (2014) thought-provoking essay on how to minimize questionable research practices (QRPs) in psychology. We agree with Sijtsma that proactive measures to decrease the risk of QRPs will ultimately be more productive than efforts to target individual researchers and their work. In particular, we concur that encouraging researchers to make their data and research materials public is the best institutional antidote against QRPs, although we are concerned that Sijtsma’s proposal to delegate more responsibility to statistical and methodological consultants could inadvertently reinforce the dichotomy between the substantive and statistical aspects of research. We also discuss sources of false-positive findings and replication failures in psychological research, and outline potential remedies for these problems. We conclude that replicability is the best metric of the minimization of QRPs and their adverse effects on psychological research. Copyright The Psychometric Society 2016

Suggested Citation

  • Irwin Waldman & Scott Lilienfeld, 2016. "Thinking About Data, Research Methods, and Statistical Analyses: Commentary on Sijtsma’s (2014) “Playing with Data”," Psychometrika, Springer;The Psychometric Society, vol. 81(1), pages 16-26, March.
  • Handle: RePEc:spr:psycho:v:81:y:2016:i:1:p:16-26
    DOI: 10.1007/s11336-015-9447-z

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1007/s11336-015-9447-z
    Download Restriction: Access to full text is restricted to subscribers.

    File URL: https://libkey.io/10.1007/s11336-015-9447-z?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    2. Jelte M Wicherts & Marjan Bakker & Dylan Molenaar, 2011. "Willingness to Share Research Data Is Related to the Strength of the Evidence and the Quality of Reporting of Statistical Results," PLOS ONE, Public Library of Science, vol. 6(11), pages 1-7, November.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project.

    Cited by:

    1. Herman Aguinis & Wayne F. Cascio & Ravi S. Ramani, 2017. "Science’s reproducibility and replicability crisis: International business is not immune," Journal of International Business Studies, Palgrave Macmillan;Academy of International Business, vol. 48(6), pages 653-663, August.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Matteo Colombo & Georgi Duev & Michèle B Nuijten & Jan Sprenger, 2018. "Statistical reporting inconsistencies in experimental philosophy," PLOS ONE, Public Library of Science, vol. 13(4), pages 1-12, April.
    2. Klaas Sijtsma, 2016. "Playing with Data—Or How to Discourage Questionable Research Practices and Stimulate Researchers to Do Things Right," Psychometrika, Springer;The Psychometric Society, vol. 81(1), pages 1-15, March.
    3. Aguinis, Herman & Banks, George C. & Rogelberg, Steven G. & Cascio, Wayne F., 2020. "Actionable recommendations for narrowing the science-practice gap in open science," Organizational Behavior and Human Decision Processes, Elsevier, vol. 158(C), pages 27-35.
    4. Kraft-Todd, Gordon T. & Rand, David G., 2021. "Practice what you preach: Credibility-enhancing displays and the growth of open science," Organizational Behavior and Human Decision Processes, Elsevier, vol. 164(C), pages 1-10.
    5. Wicherts, Jelte M. & Veldkamp, Coosje Lisabet Sterre & Augusteijn, Hilde & Bakker, Marjan & van Aert, Robbie Cornelis Maria & van Assen, Marcel A. L. M., 2016. "Degrees of freedom in planning, running, analyzing, and reporting psychological studies: A checklist to avoid p-hacking," OSF Preprints umq8d, Center for Open Science.
    6. Colombo, Matteo & Duev, Georgi & Nuijten, M.B. & Sprenger, Jan, 2018. "Statistical reporting inconsistencies in experimental philosophy," Other publications TiSEM 075f5696-ae1a-4aae-9e17-c, Tilburg University, School of Economics and Management.
    7. Jyotirmoy Sarkar, 2018. "Will P-Value Triumph over Abuses and Attacks?," Biostatistics and Biometrics Open Access Journal, Juniper Publishers Inc., vol. 7(4), pages 66-71, July.
    8. Kevin J. Boyle & Mark Morrison & Darla Hatton MacDonald & Roderick Duncan & John Rose, 2016. "Investigating Internet and Mail Implementation of Stated-Preference Surveys While Controlling for Differences in Sample Frames," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 64(3), pages 401-419, July.
    9. Jelte M Wicherts & Marjan Bakker & Dylan Molenaar, 2011. "Willingness to Share Research Data Is Related to the Strength of the Evidence and the Quality of Reporting of Statistical Results," PLOS ONE, Public Library of Science, vol. 6(11), pages 1-7, November.
    10. Frederique Bordignon, 2020. "Self-correction of science: a comparative study of negative citations and post-publication peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(2), pages 1225-1239, August.
    11. Omar Al-Ubaydli & John A. List, 2015. "Do Natural Field Experiments Afford Researchers More or Less Control than Laboratory Experiments? A Simple Model," NBER Working Papers 20877, National Bureau of Economic Research, Inc.
    12. Aurelie Seguin & Wolfgang Forstmeier, 2012. "No Band Color Effects on Male Courtship Rate or Body Mass in the Zebra Finch: Four Experiments and a Meta-Analysis," PLOS ONE, Public Library of Science, vol. 7(6), pages 1-11, June.
    13. Dragana Radicic & Geoffrey Pugh & Hugo Hollanders & René Wintjes & Jon Fairburn, 2016. "The impact of innovation support programs on small and medium enterprises innovation in traditional manufacturing industries: An evaluation for seven European Union regions," Environment and Planning C, , vol. 34(8), pages 1425-1452, December.
    14. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick , 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    15. Li, Lunzheng & Maniadis, Zacharias & Sedikides, Constantine, 2021. "Anchoring in Economics: A Meta-Analysis of Studies on Willingness-To-Pay and Willingness-To-Accept," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 90(C).
    16. Diekmann Andreas, 2011. "Are Most Published Research Findings False?," Journal of Economics and Statistics (Jahrbuecher fuer Nationaloekonomie und Statistik), De Gruyter, vol. 231(5-6), pages 628-635, October.
    17. Daniele Fanelli, 2012. "Negative results are disappearing from most disciplines and countries," Scientometrics, Springer;Akadémiai Kiadó, vol. 90(3), pages 891-904, March.
    18. Kirthi Kalyanam & John McAteer & Jonathan Marek & James Hodges & Lifeng Lin, 2018. "Cross channel effects of search engine advertising on brick & mortar retail sales: Meta analysis of large scale field experiments on Google.com," Quantitative Marketing and Economics (QME), Springer, vol. 16(1), pages 1-42, March.
    19. Nazila Alinaghi & W. Robert Reed, 2021. "Taxes and Economic Growth in OECD Countries: A Meta-analysis," Public Finance Review, , vol. 49(1), pages 3-40, January.
    20. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.