Printed from https://ideas.repec.org/a/aea/jecper/v29y2015i3p81-98.html

Pre-analysis Plans Have Limited Upside, Especially Where Replications Are Feasible

Author

Listed:
  • Lucas C. Coffman
  • Muriel Niederle

Abstract

The social sciences—including economics—have long called for transparency in research to counter threats to producing robust and replicable results. In this paper, we discuss the pros and cons of three of the more prominent proposed approaches: pre-analysis plans, hypothesis registries, and replications. Because they have been discussed primarily for experimental research, both in the field (including randomized control trials) and in the laboratory, we focus on these areas. A pre-analysis plan is a credibly fixed plan of how a researcher will collect and analyze data, which is submitted before a project begins. Though pre-analysis plans have been lauded in the popular press and across the social sciences, we will argue that enthusiasm for pre-analysis plans should be tempered for several reasons. Hypothesis registries are databases of all projects attempted; the goal of this promising mechanism is to alleviate the "file drawer problem," which is that statistically significant results are more likely to be published, while other results are consigned to the researcher's "file drawer." Finally, we evaluate the efficacy of replications. We argue that even with modest amounts of researcher bias—either replication attempts bent on proving or disproving the published work, or poor replication attempts—replications correct even the most inaccurate beliefs within three to five replications. We offer practical proposals for how to increase the incentives for researchers to carry out replications.

Suggested Citation

  • Lucas C. Coffman & Muriel Niederle, 2015. "Pre-analysis Plans Have Limited Upside, Especially Where Replications Are Feasible," Journal of Economic Perspectives, American Economic Association, vol. 29(3), pages 81-98, Summer.
  • Handle: RePEc:aea:jecper:v:29:y:2015:i:3:p:81-98
    Note: DOI: 10.1257/jep.29.3.81

    Download full text from publisher

    File URL: http://www.aeaweb.org/articles.php?doi=10.1257/jep.29.3.81
    Download Restriction: Access to full text is restricted to AEA members and institutional subscribers.

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    2. Lawrence F. Katz & Jeffrey R. Kling & Jeffrey B. Liebman, 2001. "Moving to Opportunity in Boston: Early Results of a Randomized Mobility Experiment," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 116(2), pages 607-654.
    3. Roth, Alvin E, 1994. "Let's Keep the Con out of Experimental Econ.: A Methodological Note," Empirical Economics, Springer, vol. 19(2), pages 279-289.
    4. Humphreys, Macartan & Sanchez de la Sierra, Raul & van der Windt, Peter, 2013. "Fishing, Commitment, and Communication: A Proposal for Comprehensive Nonbinding Research Registration," Political Analysis, Cambridge University Press, vol. 21(1), pages 1-20, January.
    5. Katherine Casey & Rachel Glennerster & Edward Miguel, 2012. "Reshaping Institutions: Evidence on Aid Impacts Using a Preanalysis Plan," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 127(4), pages 1755-1812.
    6. Monogan, James E., 2013. "A Case for Registering Studies of Political Outcomes: An Application in the 2010 House Elections," Political Analysis, Cambridge University Press, vol. 21(1), pages 21-37, January.
    7. Güth, Werner & Schmittberger, Rolf & Schwarze, Bernd, 1982. "An experimental analysis of ultimatum bargaining," Journal of Economic Behavior & Organization, Elsevier, vol. 3(4), pages 367-388, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Sarah A. Janzen & Jeffrey D. Michler, 2021. "Ulysses' pact or Ulysses' raft: Using pre‐analysis plans in experimental and nonexperimental research," Applied Economic Perspectives and Policy, John Wiley & Sons, vol. 43(4), pages 1286-1304, December.
    2. Burlig, Fiona, 2018. "Improving transparency in observational social science research: A pre-analysis plan approach," Economics Letters, Elsevier, vol. 168(C), pages 56-60.
    3. Michał Krawczyk, 2015. "The Search for Significance: A Few Peculiarities in the Distribution of P Values in Experimental Psychology Literature," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-19, June.
    4. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    5. Bogdanoski, Aleksandar & Ofosu, George & Posner, Daniel N, 2019. "Pre-analysis Plans: A Stocktaking," MetaArXiv e4pum, Center for Open Science.
    6. Marcel Fafchamps & Julien Labonne, 2017. "Do Politicians’ Relatives Get Better Jobs? Evidence from Municipal Elections," The Journal of Law, Economics, and Organization, Oxford University Press, vol. 33(2), pages 268-300.
    7. King, Elisabeth & Samii, Cyrus, 2014. "Fast-Track Institution Building in Conflict-Affected Countries? Insights from Recent Field Experiments," World Development, Elsevier, vol. 64(C), pages 740-754.
    8. Marcel Fafchamps & Julien Labonne, 2016. "Using Split Samples to Improve Inference about Causal Effects," NBER Working Papers 21842, National Bureau of Economic Research, Inc.
    9. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2018. "Methods Matter: P-Hacking and Causal Inference in Economics," IZA Discussion Papers 11796, Institute of Labor Economics (IZA).
    10. Brodeur, Abel & Cook, Nikolai M. & Hartley, Jonathan S. & Heyes, Anthony, 2023. "Do Pre-Registration and Pre-Analysis Plans Reduce p-Hacking and Publication Bias?: Evidence from 15,992 Test Statistics and Suggestions for Improvement," GLO Discussion Paper Series 1147 [pre.], Global Labor Organization (GLO).
    11. Benjamin A. Olken, 2015. "Promises and Perils of Pre-analysis Plans," Journal of Economic Perspectives, American Economic Association, vol. 29(3), pages 61-80, Summer.
    12. Josephson, Anna & Michler, Jeffrey D., 2018. "Viewpoint: Beasts of the field? Ethics in agricultural and applied economics," Food Policy, Elsevier, vol. 79(C), pages 1-11.
    13. Andrew C. Chang & Phillip Li, 2018. "Measurement Error In Macroeconomic Data And Economics Research: Data Revisions, Gross Domestic Product, And Gross Domestic Income," Economic Inquiry, Western Economic Association International, vol. 56(3), pages 1846-1869, July.
    14. Brodeur, Abel & Cook, Nikolai & Hartley, Jonathan & Heyes, Anthony, 2022. "Do Pre-Registration and Pre-analysis Plans Reduce p-Hacking and Publication Bias?," MetaArXiv uxf39, Center for Open Science.
    15. Cristina Blanco-Perez & Abel Brodeur, 2020. "Publication Bias and Editorial Statement on Negative Findings," The Economic Journal, Royal Economic Society, vol. 130(629), pages 1226-1247.
    16. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    17. Edward Miguel, 2021. "Evidence on Research Transparency in Economics," Journal of Economic Perspectives, American Economic Association, vol. 35(3), pages 193-214, Summer.
    18. Christian Leuz, 2018. "Evidence-based policymaking: promise, challenges and opportunities for accounting and financial markets research," Accounting and Business Research, Taylor & Francis Journals, vol. 48(5), pages 582-608, July.
    19. Lenz, Gabriel & Sahn, Alexander, 2017. "Achieving Statistical Significance with Covariates and without Transparency," MetaArXiv s42ba, Center for Open Science.
    20. Benjamin A. T. Graham & Jacob R. Tucker, 2019. "The international political economy data resource," The Review of International Organizations, Springer, vol. 14(1), pages 149-161, March.

    More about this item

    JEL classification:

    • C38 - Mathematical and Quantitative Methods - - Multiple or Simultaneous Equation Models; Multiple Variables - - - Classification Methods; Cluster Analysis; Principal Components; Factor Analysis

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:aea:jecper:v:29:y:2015:i:3:p:81-98. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Michael P. Albert (email available below). General contact details of provider: https://edirc.repec.org/data/aeaaaea.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.