
Survey of open science practices and attitudes in the social sciences

Authors

Listed:
  • Joel Ferguson

    (University of California, Berkeley, Department of Agricultural and Resource Economics)

  • Rebecca Littman

    (University of Illinois Chicago, Department of Psychology)

  • Garret Christensen

    (Federal Deposit Insurance Corporation)

  • Elizabeth Levy Paluck

    (Princeton University, Department of Psychology)

  • Nicholas Swanson

    (University of California, Berkeley, Department of Economics)

  • Zenan Wang

    (University of California, Berkeley, Department of Economics)

  • Edward Miguel

    (University of California, Berkeley, Department of Economics)

  • David Birke

    (University of California, Berkeley, Department of Economics)

  • John-Henry Pezzuto

    (University of California, San Diego, Rady School of Management)

Abstract

Open science practices such as posting data or code and pre-registering analyses are increasingly prescribed and debated in the applied sciences, but the actual popularity and lifetime usage of these practices remain unknown. This study assesses attitudes toward, use of, and perceived norms regarding open science practices in a sample of authors published in top-10 (most-cited) journals and PhD students in top-20 ranked North American departments across four major social science disciplines: economics, political science, psychology, and sociology. We observe largely favorable private attitudes toward widespread lifetime usage (meaning that a researcher has used a particular practice at least once) of open science practices. As of 2020, nearly 90% of scholars had used at least one such practice. Support for posting data or code online is higher (88% overall, and nearly at the ceiling in some fields) than support for pre-registration (58% overall). With respect to norms, scholars in our sample appear to underestimate the use of open science practices in their field. We also document that the reported lifetime prevalence of open science practices increased from 49% in 2010 to 87% a decade later.

Suggested Citation

  • Joel Ferguson & Rebecca Littman & Garret Christensen & Elizabeth Levy Paluck & Nicholas Swanson & Zenan Wang & Edward Miguel & David Birke & John-Henry Pezzuto, 2023. "Survey of open science practices and attitudes in the social sciences," Nature Communications, Nature, vol. 14(1), pages 1-13, December.
  • Handle: RePEc:nat:natcom:v:14:y:2023:i:1:d:10.1038_s41467-023-41111-1
    DOI: 10.1038/s41467-023-41111-1

    Download full text from publisher

    File URL: https://www.nature.com/articles/s41467-023-41111-1
    File Function: Abstract
    Download Restriction: no


    References listed on IDEAS

    1. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    2. Burlig, Fiona, 2018. "Improving transparency in observational social science research: A pre-analysis plan approach," Economics Letters, Elsevier, vol. 168(C), pages 56-60.
    3. Lucas C. Coffman & Muriel Niederle, 2015. "Pre-analysis Plans Have Limited Upside, Especially Where Replications Are Feasible," Journal of Economic Perspectives, American Economic Association, vol. 29(3), pages 81-98, Summer.
    4. Garret Christensen & Allan Dafoe & Edward Miguel & Don A Moore & Andrew K Rose, 2019. "A study of the impact of data sharing on article citations using journal policies as a natural experiment," PLOS ONE, Public Library of Science, vol. 14(12), pages 1-13, December.
    5. Courtney K. Soderberg & Timothy M. Errington & Sarah R. Schiavone & Julia Bottesini & Felix Singleton Thorn & Simine Vazire & Kevin M. Esterling & Brian A. Nosek, 2021. "Initial evidence of research quality of registered reports compared with the standard publishing model," Nature Human Behaviour, Nature, vol. 5(8), pages 990-997, August.
    6. Monya Baker, 2016. "1,500 scientists lift the lid on reproducibility," Nature, Nature, vol. 533(7604), pages 452-454, May.

    Citations

    Citations are extracted by the CitEc Project.

    Cited by:

    1. Felix Holzmeister & Magnus Johannesson & Robert Böhm & Anna Dreber & Jürgen Huber & Michael Kirchler, 2023. "Heterogeneity in effect size estimates: Empirical evidence and practical implications," Working Papers 2023-17, Faculty of Economics and Statistics, Universität Innsbruck.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Brodeur, Abel & Cook, Nikolai & Hartley, Jonathan & Heyes, Anthony, 2022. "Do Pre-Registration and Pre-analysis Plans Reduce p-Hacking and Publication Bias?," MetaArXiv uxf39, Center for Open Science.
    2. Brodeur, Abel & Cook, Nikolai M. & Hartley, Jonathan S. & Heyes, Anthony, 2023. "Do Pre-Registration and Pre-Analysis Plans Reduce p-Hacking and Publication Bias?: Evidence from 15,992 Test Statistics and Suggestions for Improvement," GLO Discussion Paper Series 1147 [pre.], Global Labor Organization (GLO).
    3. Sarah A. Janzen & Jeffrey D. Michler, 2021. "Ulysses' pact or Ulysses' raft: Using pre‐analysis plans in experimental and nonexperimental research," Applied Economic Perspectives and Policy, John Wiley & Sons, vol. 43(4), pages 1286-1304, December.
    4. Bruno Ferman & Cristine Pinto & Vitor Possebom, 2020. "Cherry Picking with Synthetic Controls," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 39(2), pages 510-532, March.
    5. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    6. Brown, Martin & Hentschel, Nicole & Mettler, Hannes & Stix, Helmut, 2022. "The convenience of electronic payments and consumer cash demand," Journal of Monetary Economics, Elsevier, vol. 130(C), pages 86-102.
    7. Carina Neisser, 2021. "The Elasticity of Taxable Income: A Meta-Regression Analysis," The Economic Journal, Royal Economic Society, vol. 131(640), pages 3365-3391.
    8. Josephson, Anna & Michler, Jeffrey D., 2018. "Viewpoint: Beasts of the field? Ethics in agricultural and applied economics," Food Policy, Elsevier, vol. 79(C), pages 1-11.
    9. Brüderle, Mirjam Anna & Peters, Jörg & Roberts, Gareth, 2022. "Weather and crime: Cautious evidence from South Africa," Ruhr Economic Papers 940, RWI - Leibniz-Institut für Wirtschaftsforschung, Ruhr-University Bochum, TU Dortmund University, University of Duisburg-Essen.
    10. Edward Miguel, 2021. "Evidence on Research Transparency in Economics," Journal of Economic Perspectives, American Economic Association, vol. 35(3), pages 193-214, Summer.
    11. Emilio Depetris-Chauvin & Felipe González, 2023. "The Political Consequences of Vaccines: Quasi-experimental Evidence from Eligibility Rules," Documentos de Trabajo 572, Instituto de Economia, Pontificia Universidad Católica de Chile.
    12. Ferman, Bruno & Ponczek, Vladimir, 2017. "Should we drop covariate cells with attrition problems?," MPRA Paper 80686, University Library of Munich, Germany.
    13. Emilio Depetris-Chauvin & Felipe González, 2023. "The Political Consequences of Vaccines: Quasi-experimental Evidence from Eligibility Rules," Working Papers 953, Queen Mary University of London, School of Economics and Finance.
    14. Bensch, Gunther & Ankel-Peters, Jörg & Vance, Colin, 2023. "Spotlight on Researcher Decisions – Infrastructure Evaluation, Instrumental Variables, and Specification Screening," VfS Annual Conference 2023 (Regensburg): Growth and the "sociale Frage" 277703, Verein für Socialpolitik / German Economic Association.
    15. Drazen, Allan & Dreber, Anna & Ozbay, Erkut Y. & Snowberg, Erik, 2021. "Journal-based replication of experiments: An application to “Being Chosen to Lead”," Journal of Public Economics, Elsevier, vol. 202(C).
    16. Ankel-Peters, Jörg & Vance, Colin & Bensch, Gunther, 2022. "Spotlight on researcher decisions – Infrastructure evaluation, instrumental variables, and first-stage specification screening," OSF Preprints sw6kd, Center for Open Science.
    17. Lionel Page & Charles N. Noussair & Robert Slonim, 2021. "The replication crisis, the rise of new research practices and what it means for experimental economics," Journal of the Economic Science Association, Springer; Economic Science Association, vol. 7(2), pages 210-225, December.
    18. Clemens, Jeffrey & Strain, Michael R., 2021. "The Heterogeneous Effects of Large and Small Minimum Wage Changes: Evidence over the Short and Medium Run Using a Pre-analysis Plan," IZA Discussion Papers 14747, Institute of Labor Economics (IZA).
    19. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    20. Pedro Carneiro & Sokbae Lee & Daniel Wilhelm, 2020. "Optimal data collection for randomized control trials," The Econometrics Journal, Royal Economic Society, vol. 23(1), pages 1-31.
