Printed from https://ideas.repec.org/p/osf/metaar/mbvz3.html

Which findings should be published?

Author

Listed:
  • Kasy, Maximilian
  • Frankel, Alexander

Abstract

Given a scarcity of journal space, what is the socially optimal rule for whether an empirical finding should be published? Suppose that the goal of publication is to inform the public about a policy-relevant state. Then journals should publish extreme results, meaning ones that move beliefs sufficiently. For specific objectives, the optimal rule can take the form of a one- or a two-sided test comparing a point estimate to the prior mean, with critical values determined by a cost-benefit analysis. An explicit consideration of future studies may additionally justify the publication of precise null results. If one insists that standard inference remain valid, however, publication must not select on the study’s findings (but may select on the study’s design).
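The publication rules described in the abstract can be illustrated with a minimal sketch. This is not the paper's actual derivation: the function names are hypothetical, and the critical value is treated as a given parameter, whereas in the paper it is determined by a cost-benefit analysis.

```python
# Illustrative sketch of the abstract's publication rules (hypothetical
# helper names; the paper derives critical values from a cost-benefit
# analysis rather than taking them as given).

def publish_two_sided(estimate: float, prior_mean: float, critical_value: float) -> bool:
    """Two-sided rule: publish if the point estimate is sufficiently
    far from the prior mean in either direction, i.e. if the result
    would move beliefs enough."""
    return abs(estimate - prior_mean) > critical_value

def publish_one_sided(estimate: float, prior_mean: float, critical_value: float) -> bool:
    """One-sided rule: publish only if the estimate exceeds the prior
    mean by more than the critical value."""
    return estimate - prior_mean > critical_value
```

Under these stylized rules, a precise estimate close to the prior mean would not be published, which is the selection-on-findings that the abstract notes invalidates standard inference.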

Suggested Citation

  • Kasy, Maximilian & Frankel, Alexander, 2018. "Which findings should be published?," MetaArXiv mbvz3, Center for Open Science.
  • Handle: RePEc:osf:metaar:mbvz3
    DOI: 10.31219/osf.io/mbvz3

    Download full text from publisher

    File URL: https://osf.io/download/5bfec2f748140b0018df3e52/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/mbvz3?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
    ---><---


    References listed on IDEAS

    1. Alberto Abadie, 2020. "Statistical Nonsignificance in Empirical Economics," American Economic Review: Insights, American Economic Association, vol. 2(2), pages 193-208, June.
    2. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    3. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    4. Emeric Henry & Marco Ottaviani, 2019. "Research and the Approval Process: The Organization of Persuasion," American Economic Review, American Economic Association, vol. 109(3), pages 911-955, March.
    5. Michaillat, Pascal & Akerlof, George, 2017. "Beetles: Biased Promotions and Persistence of False Belief," CEPR Discussion Papers 12514, C.E.P.R. Discussion Papers.
    6. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    7. Nicola Persico, 2000. "Information Acquisition in Auctions," Econometrica, Econometric Society, vol. 68(1), pages 135-148, January.
    8. David Card & Stefano DellaVigna, 2013. "Nine Facts about Top Journals in Economics," Journal of Economic Literature, American Economic Association, vol. 51(1), pages 144-161, March.
    9. Isaiah Andrews & Jesse M. Shapiro, 2021. "A Model of Scientific Communication," Econometrica, Econometric Society, vol. 89(5), pages 2117-2142, September.
    10. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
    11. Aleksey Tetenov, 2016. "An economic theory of statistical testing," CeMMAP working papers CWP50/16, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    12. Verrecchia, Robert E., 1983. "Discretionary disclosure," Journal of Accounting and Economics, Elsevier, vol. 5(1), pages 179-194, April.
    13. Vaart,A. W. van der, 2000. "Asymptotic Statistics," Cambridge Books, Cambridge University Press, number 9780521784504.
    14. Boyan Jovanovic, 1982. "Truthful Disclosure of Information," Bell Journal of Economics, The RAND Corporation, vol. 13(1), pages 36-44, Spring.
    15. Richard McElreath & Paul E Smaldino, 2015. "Replication, Communication, and the Population Dynamics of Scientific Discovery," PLOS ONE, Public Library of Science, vol. 10(8), pages 1-16, August.
    16. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    17. Edward L. Glaeser, 2006. "Researcher Incentives and Empirical Methods," NBER Technical Working Papers 0329, National Bureau of Economic Research, Inc.

    Citations

Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Felix Chopra & Ingar Haaland & Christopher Roth & Andreas Stegmann, 2023. "The Null Result Penalty," The Economic Journal, Royal Economic Society, vol. 134(657), pages 193-219.
2. Abhirup Bhunia, 2021. "Evidence-based Policy in India: Crossing the Long, Uphill Bridge," Journal of Development Policy and Practice, vol. 6(2), pages 137-143, July.
    3. Guillaume Coqueret, 2023. "Forking paths in financial economics," Papers 2401.08606, arXiv.org.
    4. David M. Kaplan, 2019. "Unbiased Estimation as a Public Good," Working Papers 1911, Department of Economics, University of Missouri.
    5. Davide Viviano & Kaspar Wuthrich & Paul Niehaus, 2021. "When should you adjust inferences for multiple hypothesis testing?," Papers 2104.13367, arXiv.org, revised Dec 2023.
    6. Maximilian Kasy & Jann Spiess, 2022. "Optimal Pre-Analysis Plans: Statistical Decisions Subject to Implementability," Papers 2208.09638, arXiv.org, revised Oct 2023.
7. Maximilian Kasy & Jann Spiess, 2022. "Rationalizing Pre-Analysis Plans: Statistical Decisions Subject to Implementability," Economics Series Working Papers 975, University of Oxford, Department of Economics.
    8. Graham Elliott & Nikolay Kudrin & Kaspar Wuthrich, 2022. "The Power of Tests for Detecting $p$-Hacking," Papers 2205.07950, arXiv.org, revised Jun 2023.
    9. Isaiah Andrews & Jesse M. Shapiro, 2021. "A Model of Scientific Communication," Econometrica, Econometric Society, vol. 89(5), pages 2117-2142, September.
    10. Drazen, Allan & Dreber, Anna & Ozbay, Erkut Y. & Snowberg, Erik, 2021. "Journal-based replication of experiments: An application to “Being Chosen to Lead”," Journal of Public Economics, Elsevier, vol. 202(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Furukawa, Chishio, 2019. "Publication Bias under Aggregation Frictions: Theory, Evidence, and a New Correction Method," EconStor Preprints 194798, ZBW - Leibniz Information Centre for Economics.
    2. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    3. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    4. Felix Chopra & Ingar Haaland & Christopher Roth & Andreas Stegmann, 2023. "The Null Result Penalty," The Economic Journal, Royal Economic Society, vol. 134(657), pages 193-219.
    5. Brodeur, Abel & Cook, Nikolai & Neisser, Carina, 2022. "P-Hacking, Data Type and Data-Sharing Policy," IZA Discussion Papers 15586, Institute of Labor Economics (IZA).
    6. Guillaume Coqueret, 2023. "Forking paths in financial economics," Papers 2401.08606, arXiv.org.
    7. Doucouliagos, Hristos & Hinz, Thomas & Zigova, Katarina, 2022. "Bias and careers: Evidence from the aid effectiveness literature," European Journal of Political Economy, Elsevier, vol. 71(C).
    8. Stanley, T. D. & Doucouliagos, Chris, 2019. "Practical Significance, Meta-Analysis and the Credibility of Economics," IZA Discussion Papers 12458, Institute of Labor Economics (IZA).
    9. Patrick Vu, 2022. "Can the Replication Rate Tell Us About Publication Bias?," Papers 2206.15023, arXiv.org, revised Jul 2022.
    10. Bergemann, Dirk & Ottaviani, Marco, 2021. "Information Markets and Nonmarkets," CEPR Discussion Papers 16459, C.E.P.R. Discussion Papers.
    11. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
    12. Graham Elliott & Nikolay Kudrin & Kaspar Wüthrich, 2022. "Detecting p‐Hacking," Econometrica, Econometric Society, vol. 90(2), pages 887-906, March.
13. Maximilian Kasy & Jann Spiess, 2022. "Rationalizing Pre-Analysis Plans: Statistical Decisions Subject to Implementability," Economics Series Working Papers 975, University of Oxford, Department of Economics.
    14. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2018. "Methods Matter: P-Hacking and Causal Inference in Economics," IZA Discussion Papers 11796, Institute of Labor Economics (IZA).
    15. Mueller-Langer, Frank & Fecher, Benedikt & Harhoff, Dietmar & Wagner, Gert G., 2019. "Replication studies in economics—How many and which papers are chosen for replication, and why?," Research Policy, Elsevier, vol. 48(1), pages 62-83.
    16. Cristina Blanco-Perez & Abel Brodeur, 2020. "Publication Bias and Editorial Statement on Negative Findings," The Economic Journal, Royal Economic Society, vol. 130(629), pages 1226-1247.
    17. Igor Asanov & Christoph Buehren & Panagiota Zacharodimou, 2020. "The power of experiments: How big is your n?," MAGKS Papers on Economics 202032, Philipps-Universität Marburg, Faculty of Business Administration and Economics, Department of Economics (Volkswirtschaftliche Abteilung).
    18. Lionel Page & Charles N. Noussair & Robert Slonim, 2021. "The replication crisis, the rise of new research practices and what it means for experimental economics," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 7(2), pages 210-225, December.
    19. Maximilian Kasy & Jann Spiess, 2022. "Optimal Pre-Analysis Plans: Statistical Decisions Subject to Implementability," Papers 2208.09638, arXiv.org, revised Oct 2023.
    20. Stefano DellaVigna & Elizabeth Linos, 2022. "RCTs to Scale: Comprehensive Evidence From Two Nudge Units," Econometrica, Econometric Society, vol. 90(1), pages 81-116, January.

    More about this item

    JEL classification:

    • D61 - Microeconomics - - Welfare Economics - - - Allocative Efficiency; Cost-Benefit Analysis
    • D82 - Microeconomics - - Information, Knowledge, and Uncertainty - - - Asymmetric and Private Information; Mechanism Design
    • D83 - Microeconomics - - Information, Knowledge, and Uncertainty - - - Search; Learning; Information and Knowledge; Communication; Belief; Unawareness
    • L82 - Industrial Organization - - Industry Studies: Services - - - Entertainment; Media


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:metaar:mbvz3. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF (email available below). General contact details of provider: https://osf.io/preprints/metaarxiv .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.