
Star Wars at Central Banks

Authors

  • Joel Bank
  • Hamish Fitchett
  • Adam Gorajek
  • Benjamin A. Malin
  • Andrew Staib

Abstract

We investigate the credibility of central bank research by searching for traces of researcher bias, which is a tendency to use undisclosed analytical procedures that raise measured levels of statistical significance (stars) in artificial ways. To conduct our search, we compile a new dataset and borrow two bias-detection methods from the literature: the p-curve and the z-curve. The results are mixed. The p-curve shows no traces of researcher bias but has a high propensity to produce false negatives. The z-curve shows some traces of researcher bias but requires strong assumptions. We examine those assumptions and challenge their merit. At this point, all that is clear is that central banks produce results with patterns different from those in top economic journals, with less bunching around the 5 per cent threshold of statistical significance.
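
The bunching comparison mentioned at the end of the abstract can be illustrated with a simple caliper-style test: count reported p-values falling just below versus just above the 5 per cent threshold and ask whether the split departs from what chance would allow. Below is a minimal sketch in Python; the window width, the 50/50 null, and the sample of p-values are illustrative assumptions, not the paper's actual p-curve or z-curve procedure.

```python
from scipy.stats import binomtest

def caliper_test(p_values, threshold=0.05, window=0.005):
    """Compare counts of p-values just below vs. just above a significance
    threshold. Absent bunching, a result is roughly equally likely to land
    on either side of the threshold within a narrow window."""
    just_below = sum(threshold - window <= p < threshold for p in p_values)
    just_above = sum(threshold <= p < threshold + window for p in p_values)
    # Two-sided binomial test of the 50/50 null within the window.
    result = binomtest(just_below, just_below + just_above, p=0.5)
    return just_below, just_above, result.pvalue

# Hypothetical example: p-values reported across a set of studies.
reported = [0.049, 0.048, 0.051, 0.046, 0.047, 0.049, 0.052, 0.048]
below, above, pval = caliper_test(reported)
print(f"just below: {below}, just above: {above}, binomial p = {pval:.3f}")
```

An excess of results just below the threshold relative to just above it is the kind of pattern the bias-detection literature treats as a warning sign; the paper's p-curve and z-curve methods formalise this idea over the full distribution of test statistics.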

Suggested Citation

  • Joel Bank & Hamish Fitchett & Adam Gorajek & Benjamin A. Malin & Andrew Staib, 2021. "Star Wars at Central Banks," Staff Report 620, Federal Reserve Bank of Minneapolis.
  • Handle: RePEc:fip:fedmsr:89864
    DOI: 10.21034/sr.620

    Download full text from publisher

    File URL: https://www.minneapolisfed.org/research/sr/sr620.pdf
    Download Restriction: no



    References listed on IDEAS

    1. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    2. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    3. Burlig, Fiona, 2018. "Improving transparency in observational social science research: A pre-analysis plan approach," Economics Letters, Elsevier, vol. 168(C), pages 56-60.
    4. Andrew Gelman & Guido Imbens, 2013. "Why ask Why? Forward Causal Inference and Reverse Causal Questions," NBER Working Papers 19614, National Bureau of Economic Research, Inc.
    5. Leeb, Hannes & Pötscher, Benedikt M., 2005. "Model Selection And Inference: Facts And Fiction," Econometric Theory, Cambridge University Press, vol. 21(1), pages 21-59, February.
    6. Brian Fabo & Martina Jancokova & Elisabeth Kempf & Lubos Pastor, 2020. "Fifty Shades of QE: Conflicts of Interest in Economic Research," Working Papers 2020-128, Becker Friedman Institute for Research in Economics.

    Citations

    Citations are extracted by the CitEc Project.

    Cited by:

    1. Jakub Rybacki & Dobromił Serwa, 2021. "What Makes a Successful Scientist in a Central Bank? Evidence From the RePEc Database," Central European Journal of Economic Modelling and Econometrics, Central European Journal of Economic Modelling and Econometrics, vol. 13(3), pages 331-357, September.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Adam Gorajek & Benjamin A. Malin, 2021. "Comment on "Star Wars: The Empirics Strike Back"," Staff Report 629, Federal Reserve Bank of Minneapolis.
    2. Brodeur, Abel & Cook, Nikolai & Hartley, Jonathan & Heyes, Anthony, 2022. "Do Pre-Registration and Pre-analysis Plans Reduce p-Hacking and Publication Bias?," MetaArXiv uxf39, Center for Open Science.
    3. Sarah A. Janzen & Jeffrey D. Michler, 2021. "Ulysses' pact or Ulysses' raft: Using pre‐analysis plans in experimental and nonexperimental research," Applied Economic Perspectives and Policy, John Wiley & Sons, vol. 43(4), pages 1286-1304, December.
    4. Uwe Hassler & Marc‐Oliver Pohle, 2022. "Unlucky Number 13? Manipulating Evidence Subject to Snooping," International Statistical Review, International Statistical Institute, vol. 90(2), pages 397-410, August.
    5. Edward Miguel, 2021. "Evidence on Research Transparency in Economics," Journal of Economic Perspectives, American Economic Association, vol. 35(3), pages 193-214, Summer.
    6. Stefano DellaVigna & Elizabeth Linos, 2022. "RCTs to Scale: Comprehensive Evidence From Two Nudge Units," Econometrica, Econometric Society, vol. 90(1), pages 81-116, January.
    7. Jasper Brinkerink, 2023. "When Shooting for the Stars Becomes Aiming for Asterisks: P-Hacking in Family Business Research," Entrepreneurship Theory and Practice, vol. 47(2), pages 304-343, March.
    8. Gechert, Sebastian & Mey, Bianka & Opatrny, Matej & Havranek, Tomas & Stanley, T. D. & Bom, Pedro R. D. & Doucouliagos, Hristos & Heimberger, Philipp & Irsova, Zuzana & Rachinger, Heiko J., 2023. "Conventional Wisdom, Meta-Analysis, and Research Revision in Economics," EconStor Preprints 280745, ZBW - Leibniz Information Centre for Economics.
    9. Graham Elliott & Nikolay Kudrin & Kaspar Wüthrich, 2022. "Detecting p‐Hacking," Econometrica, Econometric Society, vol. 90(2), pages 887-906, March.
    10. Jakub Rybacki & Dobromił Serwa, 2021. "What Makes a Successful Scientist in a Central Bank? Evidence From the RePEc Database," Central European Journal of Economic Modelling and Econometrics, Central European Journal of Economic Modelling and Econometrics, vol. 13(3), pages 331-357, September.
    11. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell us about p-Hacking and Publication Bias in Online Experiments," GLO Discussion Paper Series 1157, Global Labor Organization (GLO).
    12. Abel Brodeur & Scott Carrell & David Figlio & Lester Lusher, 2023. "Unpacking P-hacking and Publication Bias," American Economic Review, American Economic Association, vol. 113(11), pages 2974-3002, November.
    13. Brodeur, Abel & Esterling, Kevin & Ankel-Peters, Jörg & Bueno, Natália S & Desposato, Scott & Dreber, Anna & Genovese, Federica & Green, Donald P & Hepplewhite, Matthew & de la Guardia, Fernando Hoces, 2024. "Promoting Reproducibility and Replicability in Political Science," Department of Economics, Working Paper Series qt23n3n3dg, Department of Economics, Institute for Business and Economic Research, UC Berkeley.
    14. Roggenkamp, Hauke C., 2024. "Revisiting ‘Growth and Inequality in Public Good Provision’—Reproducing and Generalizing Through Inconvenient Online Experimentation," OSF Preprints 6rn97, Center for Open Science.
    15. Alexander L. Brown & Taisuke Imai & Ferdinand M. Vieider & Colin F. Camerer, 2024. "Meta-analysis of Empirical Estimates of Loss Aversion," Journal of Economic Literature, American Economic Association, vol. 62(2), pages 485-516, June.
    16. Carina Neisser, 2021. "The Elasticity of Taxable Income: A Meta-Regression Analysis," The Economic Journal, Royal Economic Society, vol. 131(640), pages 3365-3391.
    17. Graham Elliott & Nikolay Kudrin & Kaspar Wuthrich, 2022. "The Power of Tests for Detecting p-Hacking," Papers 2205.07950, arXiv.org, revised Apr 2024.
    18. Matteo Picchio & Michele Ubaldi, 2024. "Unemployment and health: A meta‐analysis," Journal of Economic Surveys, Wiley Blackwell, vol. 38(4), pages 1437-1472, September.
    19. Josephson, Anna & Michler, Jeffrey D., 2018. "Viewpoint: Beasts of the field? Ethics in agricultural and applied economics," Food Policy, Elsevier, vol. 79(C), pages 1-11.
    20. Felix Chopra & Ingar Haaland & Christopher Roth & Andreas Stegmann, 2024. "The Null Result Penalty," The Economic Journal, Royal Economic Society, vol. 134(657), pages 193-219.

    More about this item

    Keywords

    Researcher bias; Central banks

    JEL classification:

    • A11 - General Economics and Teaching - - General Economics - - - Role of Economics; Role of Economists
    • C13 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Estimation: General
    • E58 - Macroeconomics and Monetary Economics - - Monetary Policy, Central Banking, and the Supply of Money and Credit - - - Central Banks and Their Policies

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:fip:fedmsr:89864. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Kate Hansel (email available below). General contact details of provider: https://edirc.repec.org/data/cfrbmus.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.