IDEAS home Printed from https://ideas.repec.org/p/osf/metaar/49yst.html

Identification of and correction for publication bias

Author

Listed:
  • Kasy, Maximilian
  • Andrews, Isaiah

Abstract

Some empirical results are more likely to be published than others. Selective publication leads to biased estimates and distorted inference. We propose two approaches for identifying the conditional probability of publication as a function of a study’s results, the first based on systematic replication studies and the second on meta-studies. For known conditional publication probabilities, we propose bias-corrected estimators and confidence sets. We apply our methods to recent replication studies in experimental economics and psychology, and to a meta-study on the effect of the minimum wage. When replication and meta-study data are available, we find similar results from both.
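The correction idea in the abstract can be illustrated with a small sketch. Under a simple selection model — a latent z-statistic Z ~ N(θ, 1) and a known step publication rule in which insignificant results (|z| below a cutoff) are published with some probability β_p while significant results are always published — a median-unbiased estimate of θ solves F(z_obs | θ) = 1/2, where F is the CDF of Z conditional on publication. The code below is a minimal illustration under these assumed parameter values (β_p, cutoff, the step rule itself are all stand-ins), not the authors' actual estimator or implementation:

```python
from scipy.stats import norm
from scipy.optimize import brentq

def conditional_cdf(z, theta, beta_p=0.1, cutoff=1.96):
    """CDF of a z-statistic conditional on publication, when the latent
    Z ~ N(theta, 1) and results with |z| < cutoff are published with
    probability beta_p (significant results are always published)."""
    lo = norm.cdf(-cutoff - theta)   # P(Z < -cutoff)
    hi = norm.cdf(cutoff - theta)    # P(Z < cutoff)
    # Total probability mass that survives selection:
    denom = (1.0 - (hi - lo)) + beta_p * (hi - lo)
    if z < -cutoff:                  # entirely in the significant region
        num = norm.cdf(z - theta)
    elif z < cutoff:                 # partly in the down-weighted region
        num = lo + beta_p * (norm.cdf(z - theta) - lo)
    else:                            # past the upper cutoff
        num = lo + beta_p * (hi - lo) + (norm.cdf(z - theta) - hi)
    return num / denom

def median_unbiased(z_obs, beta_p=0.1, cutoff=1.96):
    """Bias-corrected, median-unbiased estimate of theta: the value for
    which the observed z sits at the median of the post-selection
    distribution."""
    f = lambda t: conditional_cdf(z_obs, t, beta_p, cutoff) - 0.5
    return brentq(f, z_obs - 10.0, z_obs + 10.0)

# Example: a just-significant published z of 2.0 under strong selection
# (only 10% of insignificant results published) is pulled toward zero;
# with beta_p = 1 there is no selection and hence no correction.
theta_hat = median_unbiased(2.0, beta_p=0.1)
```

With β_p = 0.1 the corrected estimate for an observed z of 2.0 lands well below 2 (roughly 0.7), reflecting that a just-significant result is much more likely to have been selected than to reflect a large true effect; with β_p = 1 the estimator returns z unchanged.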

Suggested Citation

  • Kasy, Maximilian & Andrews, Isaiah, 2018. "Identification of and correction for publication bias," MetaArXiv 49yst, Center for Open Science.
  • Handle: RePEc:osf:metaar:49yst
    DOI: 10.31219/osf.io/49yst

    Download full text from publisher

    File URL: https://osf.io/download/5bfec3c135444800196313d7/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/49yst?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item


    References listed on IDEAS

    1. John P. A. Ioannidis & T. D. Stanley & Hristos Doucouliagos, 2017. "The Power of Bias in Economics Research," Economic Journal, Royal Economic Society, vol. 127(605), pages 236-265, October.
    2. Stefano DellaVigna & Devin Pope, 2018. "Predicting Experimental Results: Who Knows What?," Journal of Political Economy, University of Chicago Press, vol. 126(6), pages 2410-2456.
    3. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    4. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    5. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2018. "Methods Matter: P-Hacking and Causal Inference in Economics," IZA Discussion Papers 11796, Institute of Labor Economics (IZA).
    6. Daniel Yekutieli, 2012. "Adjusted Bayesian inference for selected parameters," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 74(3), pages 515-541, June.
    7. Card, David & Krueger, Alan B., 1995. "Time-Series Minimum-Wage Studies: A Meta-analysis," American Economic Review, American Economic Association, vol. 85(2), pages 238-243, May.
    8. Croke, Kevin & Hicks, Joan Hamory & Hsu, Eric & Kremer, Michael Robert & Miguel, Edward A., 2016. "Does Mass Deworming Affect Child Nutrition? Meta-analysis, Cost-effectiveness, and Statistical Power," Policy Research Working Paper Series 7921, The World Bank.
    9. De Long, J Bradford & Lang, Kevin, 1992. "Are All Economic Hypotheses False?," Journal of Political Economy, University of Chicago Press, vol. 100(6), pages 1257-1272, December.
    10. Michael A. Clemens, 2017. "The Meaning Of Failed Replications: A Review And Proposal," Journal of Economic Surveys, Wiley Blackwell, vol. 31(1), pages 326-342, February.
    11. Tomáš Havránek, 2015. "Measuring Intertemporal Substitution: The Importance Of Method Choices And Selective Reporting," Journal of the European Economic Association, European Economic Association, vol. 13(6), pages 1180-1204, December.
    12. Kevin Croke & Joan Hamory Hicks & Eric Hsu & Michael Kremer & Ricardo Maertens & Edward Miguel & Witold Więcek, 2016. "Meta-Analysis and Public Policy: Reconciling the Evidence on Deworming," NBER Working Papers 22382, National Bureau of Economic Research, Inc.
    13. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    14. T. D. Stanley, 2008. "Meta‐Regression Methods for Detecting and Estimating Empirical Effects in the Presence of Publication Selection," Oxford Bulletin of Economics and Statistics, Department of Economics, University of Oxford, vol. 70(1), pages 103-127, February.
    15. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick, et al., 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    16. Hristos Doucouliagos & T. D. Stanley, 2009. "Publication Selection Bias in Minimum‐Wage Research? A Meta‐Regression Analysis," British Journal of Industrial Relations, London School of Economics, vol. 47(2), pages 406-428, June.
    17. Stephan B. Bruns, 2017. "Meta-Regression Models and Observational Research," Oxford Bulletin of Economics and Statistics, Department of Economics, University of Oxford, vol. 79(5), pages 637-653, October.
    18. Hou, Kewei & Xue, Chen & Zhang, Lu, 2017. "Replicating Anomalies," Working Paper Series 2017-10, Ohio State University, Charles A. Dice Center for Research in Financial Economics.
    19. Müller, Ulrich K. & Wang, Yulong, 2019. "Nearly weighted risk minimal unbiased estimation," Journal of Econometrics, Elsevier, vol. 209(1), pages 18-34.
    20. Michael Clemens, 2015. "The Meaning of Failed Replications: A Review and Proposal - Working Paper 399," Working Papers 399, Center for Global Development.
    21. James H. Stock & Jonathan Wright, 2000. "GMM with Weak Identification," Econometrica, Econometric Society, vol. 68(5), pages 1055-1096, September.
    22. Justin McCrary & Garret Christensen & Daniele Fanelli, 2016. "Conservative Tests under Satisficing Models of Publication Bias," PLOS ONE, Public Library of Science, vol. 11(2), pages 1-10, February.
    23. Honore, Bo E. & Powell, James L., 1994. "Pairwise difference estimators of censored and truncated regression models," Journal of Econometrics, Elsevier, vol. 64(1-2), pages 241-278.
    24. Andrews, Donald W K, 1993. "Exactly Median-Unbiased Estimation of First Order Autoregressive/Unit Root Models," Econometrica, Econometric Society, vol. 61(1), pages 139-165, January.
    25. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jürgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeister, Felix, et al., 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Stanley, T. D. & Doucouliagos, Chris, 2019. "Practical Significance, Meta-Analysis and the Credibility of Economics," IZA Discussion Papers 12458, Institute of Labor Economics (IZA).
    2. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    3. Christopher Snyder & Ran Zhuo, 2018. "Sniff Tests as a Screen in the Publication Process: Throwing out the Wheat with the Chaff," NBER Working Papers 25058, National Bureau of Economic Research, Inc.
    4. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    5. Tomas Havranek & Zuzana Irsova & Lubica Laslopova & Olesia Zeynalova, 2020. "Skilled and Unskilled Labor Are Less Substitutable than Commonly Thought," Working Papers IES 2020/29, Charles University Prague, Faculty of Social Sciences, Institute of Economic Studies, revised Sep 2020.
    6. Cazachevici, Alina & Havranek, Tomas & Horvath, Roman, 2020. "Remittances and economic growth: A meta-analysis," World Development, Elsevier, vol. 134(C).
    7. Furukawa, Chishio, 2019. "Publication Bias under Aggregation Frictions: Theory, Evidence, and a New Correction Method," EconStor Preprints 194798, ZBW - Leibniz Information Centre for Economics.
    8. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    9. Roman Horvath & Ali Elminejad & Tomas Havranek, 2020. "Publication and Identification Biases in Measuring the Intertemporal Substitution of Labor Supply," Working Papers IES 2020/32, Charles University Prague, Faculty of Social Sciences, Institute of Economic Studies, revised Sep 2020.
    10. Zigraiova, Diana & Havranek, Tomas & Irsova, Zuzana & Novak, Jiri, 2021. "How puzzling is the forward premium puzzle? A meta-analysis," European Economic Review, Elsevier, vol. 134(C).
    11. Snyder, Christopher & Zhuo, Ran, 2018. "Sniff Tests in Economics: Aggregate Distribution of Their Probability Values and Implications for Publication Bias," MetaArXiv 8vdrh, Center for Open Science.
    12. Bruns, Stephan B. & Asanov, Igor & Bode, Rasmus & Dunger, Melanie & Funk, Christoph & Hassan, Sherif M. & Hauschildt, Julia & Heinisch, Dominik & Kempa, Karol & König, Johannes & Lips, Johannes, et al., 2019. "Reporting errors and biases in published empirical findings: Evidence from innovation research," Research Policy, Elsevier, vol. 48(9), pages 1-1.
    13. Tomas Havranek & Anna Sokolova, 2020. "Do Consumers Really Follow a Rule of Thumb? Three Thousand Estimates from 144 Studies Say 'Probably Not'," Review of Economic Dynamics, Elsevier for the Society for Economic Dynamics, vol. 35, pages 97-122, January.
    14. Jindrich Matousek & Tomas Havranek & Zuzana Irsova, 2022. "Individual discount rates: a meta-analysis of experimental evidence," Experimental Economics, Springer;Economic Science Association, vol. 25(1), pages 318-358, February.
    15. Sebastian Gechert & Tomas Havranek & Zuzana Irsova & Dominika Kolcunova, 2022. "Measuring Capital-Labor Substitution: The Importance of Method Choices and Publication Bias," Review of Economic Dynamics, Elsevier for the Society for Economic Dynamics, vol. 45, pages 55-82, July.
    16. Brian Fabo & Martina Jancokova & Elisabeth Kempf & Lubos Pastor, 2020. "Fifty Shades of QE: Conflicts of Interest in Economic Research," Working Papers 2020-128, Becker Friedman Institute for Research In Economics.
    17. Nazila Alinaghi & W. Robert Reed, 2021. "Taxes and Economic Growth in OECD Countries: A Meta-analysis," Public Finance Review, , vol. 49(1), pages 3-40, January.
    18. Lan Nguyen, Thi Mai & Papyrakis, Elissaios & van Bergeijk, Peter A.G., 2021. "Publication bias in the price effects of monetary policy: A meta-regression analysis for emerging and developing economies," International Review of Economics & Finance, Elsevier, vol. 71(C), pages 567-583.
    19. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    20. Stephan B. Bruns & David I. Stern, 2019. "Lag length selection and p-hacking in Granger causality testing: prevalence and performance of meta-regression models," Empirical Economics, Springer, vol. 56(3), pages 797-830, March.

    More about this item

    JEL classification:

    • C13 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Estimation: General
    • C90 - Mathematical and Quantitative Methods - - Design of Experiments - - - General
    • I23 - Health, Education, and Welfare - - Education - - - Higher Education; Research Institutions
    • J23 - Labor and Demographic Economics - - Demand and Supply of Labor - - - Labor Demand
    • J38 - Labor and Demographic Economics - - Wages, Compensation, and Labor Costs - - - Public Policy
    • L82 - Industrial Organization - - Industry Studies: Services - - - Entertainment; Media


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:metaar:49yst. See general information about how to correct material in RePEc.


    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF (email available below). General contact details of provider: https://osf.io/preprints/metaarxiv.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis. RePEc uses bibliographic data supplied by the respective publishers.