
Reporting errors and biases in published empirical findings: Evidence from innovation research

Author

Listed:
  • Bruns, Stephan B.
  • Asanov, Igor
  • Bode, Rasmus
  • Dunger, Melanie
  • Funk, Christoph
  • Hassan, Sherif M.
  • Hauschildt, Julia
  • Heinisch, Dominik
  • Kempa, Karol
  • König, Johannes
  • Lips, Johannes
  • Verbeck, Matthias
  • Wolfschütz, Eva
  • Buenstorf, Guido

Abstract

Errors and biases in published results compromise the reliability of empirical research, posing threats to the cumulative research process and to evidence-based decision making. We provide evidence on reporting errors and biases in innovation research. We find that 45% of the articles in our sample contain at least one result for which the reported statistical information is inconsistent with the reported significance level. In 25% of the articles, at least one strong reporting error is diagnosed, where a statistically non-significant finding becomes significant, or vice versa, at the common significance threshold of 0.1. At the level of individual tests, error rates are much smaller: 4.0% of tests exhibit some error and 1.4% exhibit strong errors. We also find systematically more marginally significant than marginally non-significant findings at the 0.05 and 0.1 thresholds of statistical significance. These discontinuities indicate the presence of reporting biases. Exploratory analysis suggests that the discontinuities are related to authors’ affiliations and, to a lesser extent, to the article’s rank in the issue and the style of reporting.
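To make the two diagnostics concrete, the following is a minimal sketch (in Python, assuming SciPy ≥ 1.7 for `binomtest`; it is not the authors' code, and the function names, the normal approximation, and the caliper width are illustrative assumptions). It recomputes a test's p-value from a reported coefficient and standard error to flag strong reporting errors, and it runs a simple caliper test comparing counts of p-values just below and just above a significance threshold.

    from scipy import stats

    def recomputed_p(coef: float, se: float) -> float:
        """Two-sided p-value implied by a reported coefficient and standard
        error, using a large-sample normal approximation (an assumption;
        small samples would call for the t distribution)."""
        z = abs(coef / se)
        return 2 * stats.norm.sf(z)  # sf(z) = 1 - cdf(z)

    def is_strong_error(coef: float, se: float,
                        claimed_significant: bool, alpha: float = 0.1) -> bool:
        """Strong reporting error in the article's sense: the recomputed
        significance status at `alpha` contradicts the reported one."""
        return (recomputed_p(coef, se) < alpha) != claimed_significant

    def caliper_test(p_values, threshold: float = 0.05, width: float = 0.005):
        """Count p-values in narrow windows just below and just above
        `threshold`. Absent reporting bias, the two counts should be roughly
        equal; an excess just below the threshold indicates bias. Returns
        both counts and the p-value of an exact binomial test of a 50/50
        split."""
        below = sum(threshold - width <= p < threshold for p in p_values)
        above = sum(threshold <= p < threshold + width for p in p_values)
        if below + above == 0:
            return below, above, float("nan")
        return below, above, stats.binomtest(below, below + above, 0.5).pvalue

    # Example: a coefficient of 0.30 with standard error 0.20 gives z = 1.5
    # and p of about 0.13, so stars claiming significance at the 0.1 level
    # would be flagged as a strong reporting error.
    print(is_strong_error(0.30, 0.20, claimed_significant=True))  # True

The binomial test mirrors the logic of the caliper test of Gerber and Malhotra (2008), listed in the references below: under the null of no reporting bias, a marginal result is equally likely to fall just below or just above the threshold.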

Suggested Citation

  • Bruns, Stephan B. & Asanov, Igor & Bode, Rasmus & Dunger, Melanie & Funk, Christoph & Hassan, Sherif M. & Hauschildt, Julia & Heinisch, Dominik & Kempa, Karol & König, Johannes & Lips, Johannes & Verbeck, Matthias & Wolfschütz, Eva & Buenstorf, Guido, 2019. "Reporting errors and biases in published empirical findings: Evidence from innovation research," Research Policy, Elsevier, vol. 48(9), pages 1-1.
  • Handle: RePEc:eee:respol:v:48:y:2019:i:9:25
    DOI: 10.1016/j.respol.2019.05.005

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0048733319301076
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.respol.2019.05.005?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Hall, Jeremy & Martin, Ben R., 2019. "Towards a taxonomy of research misconduct: The case of business school research," Research Policy, Elsevier, vol. 48(2), pages 414-427.
    2. Fagerberg, Jan & Verspagen, Bart, 2009. "Innovation studies—The emerging structure of a new scientific field," Research Policy, Elsevier, vol. 38(2), pages 218-233, March.
    3. John P. A. Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    4. Carl Berning & Bernd Weiß, 2016. "Erratum to: Publication bias in the German social sciences: an application of the caliper test to three top-tier German social science journals," Quality & Quantity: International Journal of Methodology, Springer, vol. 50(2), pages 919-920, March.
    5. Seeber, Marco & Cattaneo, Mattia & Meoli, Michele & Malighetti, Paolo, 2019. "Self-citations as strategic response to the use of metrics for career decisions," Research Policy, Elsevier, vol. 48(2), pages 478-491.
    6. Michael A. Clemens, 2017. "The Meaning Of Failed Replications: A Review And Proposal," Journal of Economic Surveys, Wiley Blackwell, vol. 31(1), pages 326-342, February.
    7. Lucas C. Coffman & Muriel Niederle & Alistair J. Wilson, 2017. "A Proposal to Organize and Promote Replications," American Economic Review, American Economic Association, vol. 107(5), pages 41-45, May.
    8. Benjamin A. Olken, 2015. "Promises and Perils of Pre-analysis Plans," Journal of Economic Perspectives, American Economic Association, vol. 29(3), pages 61-80, Summer.
    9. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    10. Gerber, Alan & Malhotra, Neil, 2008. "Do Statistical Reporting Standards Affect What Is Published? Publication Bias in Two Leading Political Science Journals," Quarterly Journal of Political Science, now publishers, vol. 3(3), pages 313-326, October.
    11. Necker, Sarah, 2014. "Scientific misbehavior in economics," Research Policy, Elsevier, vol. 43(10), pages 1747-1759.
    12. John P. A. Ioannidis & T. D. Stanley & Hristos Doucouliagos, 2017. "The Power of Bias in Economics Research," Economic Journal, Royal Economic Society, vol. 127(605), pages 236-265, October.
    13. Eric Luis Uhlmann & Anthony Bastardi & Lee Ross, 2011. "Wishful Thinking: Belief, Desire, and the Motivated Evaluation of Scientific Evidence," Post-Print hal-00609541, HAL.
    14. Eva Vivalt, 2019. "Specification Searching and Significance Inflation Across Time, Methods and Disciplines," Oxford Bulletin of Economics and Statistics, Department of Economics, University of Oxford, vol. 81(4), pages 797-816, August.
    15. Bruns, Stephan B. & König, Johannes & Stern, David I., 2019. "Replication and robustness analysis of ‘energy and economic growth in the USA: A multivariate approach’," Energy Economics, Elsevier, vol. 82(C), pages 100-113.
    16. Ronald L. Wasserstein & Nicole A. Lazar, 2016. "The ASA's Statement on p-Values: Context, Process, and Purpose," The American Statistician, Taylor & Francis Journals, vol. 70(2), pages 129-133, May.
    17. Ashish Arora & Michelle Gittelman & Sarah Kaplan & John Lynch & Will Mitchell & Nicolaj Siggelkow & Brent Goldfarb & Andrew A. King, 2016. "Scientific apophenia in strategic management research: Significance tests & mistaken inference," Strategic Management Journal, Wiley Blackwell, vol. 37(1), pages 167-176, January.
    18. Richard A. Bettis, 2012. "The search for asterisks: Compromised statistical tests and flawed theories," Strategic Management Journal, Wiley Blackwell, vol. 33(1), pages 108-113, January.
    19. Deirdre N. McCloskey & Stephen T. Ziliak, 1996. "The Standard Error of Regressions," Journal of Economic Literature, American Economic Association, vol. 34(1), pages 97-114, March.
    20. Andrew C. Chang & Phillip Li, 2017. "A Preanalysis Plan to Replicate Sixty Economics Research Papers That Worked Half of the Time," American Economic Review, American Economic Association, vol. 107(5), pages 60-64, May.
    21. Daniele Fanelli, 2010. "Do Pressures to Publish Increase Scientists' Bias? An Empirical Support from US States Data," PLOS ONE, Public Library of Science, vol. 5(4), pages 1-7, April.
    22. Anton Kühberger & Astrid Fritz & Thomas Scherndl, 2014. "Publication Bias in Psychology: A Diagnosis Based on the Correlation between Effect Size and Sample Size," PLOS ONE, Public Library of Science, vol. 9(9), pages 1-8, September.
    23. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    24. Marcus R. Munafò & Brian A. Nosek & Dorothy V. M. Bishop & Katherine S. Button & Christopher D. Chambers & Nathalie Percie du Sert & Uri Simonsohn & Eric-Jan Wagenmakers & Jennifer J. Ware & John P. A. Ioannidis, 2017. "A manifesto for reproducible science," Nature Human Behaviour, Nature, vol. 1(1), pages 1-9, January.
    25. Carl Berning & Bernd Weiß, 2016. "Publication bias in the German social sciences: an application of the caliper test to three top-tier German social science journals," Quality & Quantity: International Journal of Methodology, Springer, vol. 50(2), pages 901-917, March.
    26. Stephan B. Bruns, 2017. "Meta-Regression Models and Observational Research," Oxford Bulletin of Economics and Statistics, Department of Economics, University of Oxford, vol. 79(5), pages 637-653, October.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Graham Elliott & Nikolay Kudrin & Kaspar Wüthrich, 2022. "Detecting p‐Hacking," Econometrica, Econometric Society, vol. 90(2), pages 887-906, March.
    2. Abel Brodeur & Scott Carrell & David Figlio & Lester Lusher, 2023. "Unpacking P-hacking and Publication Bias," American Economic Review, American Economic Association, vol. 113(11), pages 2974-3002, November.
    3. Bruns, Stephan & Herwartz, Helmut & Ioannidis, John P.A. & Islam, Chris-Gabriel & Raters, Fabian H. C., 2023. "Statistical reporting errors in economics," MetaArXiv mbx62, Center for Open Science.
    4. Bruns, Stephan B. & Kalthaus, Martin, 2020. "Flexibility in the selection of patent counts: Implications for p-hacking and evidence-based policymaking," Research Policy, Elsevier, vol. 49(1).
    5. Bajzik, Josef, 2021. "Trading volume and stock returns: A meta-analysis," International Review of Financial Analysis, Elsevier, vol. 78(C).
    6. Havranek, Tomas & Bajzík, Josef & Irsova, Zuzana & Novak, Jiri, 2023. "Does Shareholder Activism Create Value? A Meta-Analysis," CEPR Discussion Papers 18233, C.E.P.R. Discussion Papers.
    7. Dominika Ehrenbergerova & Josef Bajzik & Tomas Havranek, 2023. "When Does Monetary Policy Sway House Prices? A Meta-Analysis," IMF Economic Review, Palgrave Macmillan;International Monetary Fund, vol. 71(2), pages 538-573, June.
    8. Buehling, Kilian, 2021. "Changing research topic trends as an effect of publication rankings – The case of German economists and the Handelsblatt Ranking," Journal of Informetrics, Elsevier, vol. 15(3).
    9. Brodeur, Abel & Cook, Nikolai & Neisser, Carina, 2022. "P-Hacking, Data Type and Data-Sharing Policy," IZA Discussion Papers 15586, Institute of Labor Economics (IZA).
    10. Igor Asanov & Christoph Buehren & Panagiota Zacharodimou, 2020. "The power of experiments: How big is your n?," MAGKS Papers on Economics 202032, Philipps-Universität Marburg, Faculty of Business Administration and Economics, Department of Economics (Volkswirtschaftliche Abteilung).
    11. Bruns, Stephan B. & Ioannidis, John P.A., 2020. "Determinants of economic growth: Different time different answer?," Journal of Macroeconomics, Elsevier, vol. 63(C).
    12. Simona Malovana & Martin Hodula & Zuzana Gric & Josef Bajzik, 2022. "Borrower-Based Macroprudential Measures and Credit Growth: How Biased is the Existing Literature?," Working Papers 2022/8, Czech National Bank.
    13. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    14. Graham Elliott & Nikolay Kudrin & Kaspar Wuthrich, 2022. "The Power of Tests for Detecting p-Hacking," Papers 2205.07950, arXiv.org, revised Jun 2023.
    15. Salandra, Rossella & Criscuolo, Paola & Salter, Ammon, 2021. "Directing scientists away from potentially biased publications: the role of systematic reviews in health care," Research Policy, Elsevier, vol. 50(1).
    16. Ebersberger, Bernd & Galia, Fabrice & Laursen, Keld & Salter, Ammon, 2021. "Inbound Open Innovation and Innovation Performance: A Robustness Study," Research Policy, Elsevier, vol. 50(7).
    17. Doucouliagos, Hristos & Hinz, Thomas & Zigova, Katarina, 2022. "Bias and careers: Evidence from the aid effectiveness literature," European Journal of Political Economy, Elsevier, vol. 71(C).
    18. Dominika Ehrenbergerova & Josef Bajzik, 2020. "The Effect of Monetary Policy on House Prices - How Strong is the Transmission?," Working Papers 2020/14, Czech National Bank.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    2. Peter Pütz & Stephan B. Bruns, 2021. "The (Non‐)Significance Of Reporting Errors In Economics: Evidence From Three Top Journals," Journal of Economic Surveys, Wiley Blackwell, vol. 35(1), pages 348-373, February.
    3. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
    4. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    5. Mueller-Langer, Frank & Fecher, Benedikt & Harhoff, Dietmar & Wagner, Gert G., 2019. "Replication studies in economics—How many and which papers are chosen for replication, and why?," Research Policy, Elsevier, vol. 48(1), pages 62-83.
    6. Stephan B. Bruns & David I. Stern, 2019. "Lag length selection and p-hacking in Granger causality testing: prevalence and performance of meta-regression models," Empirical Economics, Springer, vol. 56(3), pages 797-830, March.
    7. Bruns, Stephan B. & Kalthaus, Martin, 2020. "Flexibility in the selection of patent counts: Implications for p-hacking and evidence-based policymaking," Research Policy, Elsevier, vol. 49(1).
    8. Thibaut Arpinon & Romain Espinosa, 2023. "A practical guide to Registered Reports for economists," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 9(1), pages 90-122, June.
    9. Anderson, Brian S. & Wennberg, Karl & McMullen, Jeffery S., 2019. "Editorial: Enhancing quantitative theory-testing entrepreneurship research," Journal of Business Venturing, Elsevier, vol. 34(5), pages 1-1.
    10. Graham Elliott & Nikolay Kudrin & Kaspar Wuthrich, 2022. "The Power of Tests for Detecting p-Hacking," Papers 2205.07950, arXiv.org, revised Jun 2023.
    11. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    12. Brodeur, Abel & Cook, Nikolai & Neisser, Carina, 2022. "P-Hacking, Data Type and Data-Sharing Policy," IZA Discussion Papers 15586, Institute of Labor Economics (IZA).
    13. Thibaut Arpinon & Romain Espinosa, 2023. "A Practical Guide to Registered Reports for Economists," Post-Print halshs-03897719, HAL.
    14. Ankel-Peters, Jörg & Fiala, Nathan & Neubauer, Florian, 2023. "Do economists replicate?," Journal of Economic Behavior & Organization, Elsevier, vol. 212(C), pages 219-232.
    15. Doucouliagos, Hristos & Hinz, Thomas & Zigova, Katarina, 2022. "Bias and careers: Evidence from the aid effectiveness literature," European Journal of Political Economy, Elsevier, vol. 71(C).
    16. Igor Asanov & Christoph Buehren & Panagiota Zacharodimou, 2020. "The power of experiments: How big is your n?," MAGKS Papers on Economics 202032, Philipps-Universität Marburg, Faculty of Business Administration and Economics, Department of Economics (Volkswirtschaftliche Abteilung).
    17. Zacharias Maniadis & Fabio Tufano & John A. List, 2017. "To Replicate or Not To Replicate? Exploring Reproducibility in Economics through the Lens of a Model and a Pilot Study," Economic Journal, Royal Economic Society, vol. 127(605), pages 209-235, October.
    18. Klaus E Meyer & Arjen Witteloostuijn & Sjoerd Beugelsdijk, 2017. "What’s in a p? Reassessing best practices for conducting and reporting hypothesis-testing research," Journal of International Business Studies, Palgrave Macmillan;Academy of International Business, vol. 48(5), pages 535-551, July.
    19. Furukawa, Chishio, 2019. "Publication Bias under Aggregation Frictions: Theory, Evidence, and a New Correction Method," EconStor Preprints 194798, ZBW - Leibniz Information Centre for Economics.
    20. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.

    More about this item

    Keywords

    Reporting bias; Reporting error; Innovation; p-hacking; Publication bias; Caliper test;

    JEL classification:

    • C10 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - General
    • C12 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Hypothesis Testing: General
    • C18 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Methodological Issues: General


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:respol:v:48:y:2019:i:9:25. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu. General contact details of provider: http://www.elsevier.com/locate/respol.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.