Printed from https://ideas.repec.org/p/osf/metaar/wkmht.html

Ulysses' Pact or Ulysses' Raft: Using Pre-Analysis Plans in Experimental and Non-Experimental Research

Author

Listed:
  • Janzen, Sarah
  • Michler, Jeffrey D.

    (University of Arizona)

Abstract

In recent years, pre-analysis plans have been adopted by economists in response to concerns about robustness and transparency in social science research. By pre-specifying an analysis plan, researchers bind themselves to it and thus avoid the temptation to data mine or p-hack. Pre-analysis plans have been used most widely in randomized evaluations, particularly in the field of development economics. Their increased use has raised the competing concern that detailed plans are overly restrictive and limit the kind of insight that only comes from exploring the data. This paper considers these competing views of pre-analysis plans, examines the extent to which pre-analysis plans have been used in research conducted by agricultural economists, and discusses the usefulness of pre-analysis plans for non-experimental economic research.

Suggested Citation

  • Janzen, Sarah & Michler, Jeffrey D, 2020. "Ulysses' Pact or Ulysses' Raft: Using Pre-Analysis Plans in Experimental and Non-Experimental Research," MetaArXiv wkmht, Center for Open Science.
  • Handle: RePEc:osf:metaar:wkmht
    DOI: 10.31219/osf.io/wkmht

    Download full text from publisher

    File URL: https://osf.io/download/5f21c9d83208050248152861/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/wkmht?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item


    References listed on IDEAS

    1. Jason P Brown & Dayton M Lambert & Timothy R Wojan, 2019. "The Effect of the Conservation Reserve Program on Rural Economies: Deriving a Statistical Verdict from a Null Finding," American Journal of Agricultural Economics, Agricultural and Applied Economics Association, vol. 101(2), pages 528-540.
    2. Andrew C. Chang & Phillip Li, 2017. "A Preanalysis Plan to Replicate Sixty Economics Research Papers That Worked Half of the Time," American Economic Review, American Economic Association, vol. 107(5), pages 60-64, May.
    3. Jens Rommel & Meike Weltin, 2021. "Is There a Cult of Statistical Significance in Agricultural Economics?," Applied Economic Perspectives and Policy, John Wiley & Sons, vol. 43(3), pages 1176-1191, September.
    4. Humphreys, Macartan & Sanchez de la Sierra, Raul & van der Windt, Peter, 2013. "Fishing, Commitment, and Communication: A Proposal for Comprehensive Nonbinding Research Registration," Political Analysis, Cambridge University Press, vol. 21(1), pages 1-20, January.
    5. Andrew C. Chang & Phillip Li, 2018. "Measurement Error In Macroeconomic Data And Economics Research: Data Revisions, Gross Domestic Product, And Gross Domestic Income," Economic Inquiry, Western Economic Association International, vol. 56(3), pages 1846-1869, July.
    6. Katherine Casey & Rachel Glennerster & Edward Miguel, 2012. "Reshaping Institutions: Evidence on Aid Impacts Using a Preanalysis Plan," The Quarterly Journal of Economics, Oxford University Press, vol. 127(4), pages 1755-1812.
    7. Monogan, James E., 2013. "A Case for Registering Studies of Political Outcomes: An Application in the 2010 House Elections," Political Analysis, Cambridge University Press, vol. 21(1), pages 21-37, January.
    8. Jeffrey D. Michler & Anna Josephson & Talip Kilic & Siobhan Murray, 2020. "Estimating the Impact of Weather on Agriculture," Papers 2012.11768, arXiv.org, revised Oct 2021.
    9. Lucas C. Coffman & Muriel Niederle & Alistair J. Wilson, 2017. "A Proposal to Organize and Promote Replications," American Economic Review, American Economic Association, vol. 107(5), pages 41-45, May.
    10. Amy Finkelstein & Sarah Taubman & Bill Wright & Mira Bernstein & Jonathan Gruber & Joseph P. Newhouse & Heidi Allen & Katherine Baicker, 2012. "The Oregon Health Insurance Experiment: Evidence from the First Year," The Quarterly Journal of Economics, Oxford University Press, vol. 127(3), pages 1057-1106.
    11. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    12. Steffen Andersen & Glenn Harrison & Morten Lau & Elisabet Rutstrom, 2007. "Valuation using multiple price list formats," Applied Economics, Taylor & Francis Journals, vol. 39(6), pages 675-682.
    13. Benjamin A. Olken, 2015. "Promises and Perils of Pre-analysis Plans," Journal of Economic Perspectives, American Economic Association, vol. 29(3), pages 61-80, Summer.
    14. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    15. Abhijit Banerjee & Esther Duflo & Amy Finkelstein & Lawrence F. Katz & Benjamin A. Olken & Anja Sautmann, 2020. "In Praise of Moderation: Suggestions for the Scope and Use of Pre-Analysis Plans for RCTs in Economics," NBER Working Papers 26993, National Bureau of Economic Research, Inc.
    16. Josephson, Anna & Smale, Melinda, 2020. "What do you mean by ‘informed consent’? Ethics in economic development research," MetaArXiv py654, Center for Open Science.
    17. Gelman, Andrew, 2013. "Preregistration of Studies and Mock Reports," Political Analysis, Cambridge University Press, vol. 21(1), pages 40-41, January.
    18. Jeffrey D. Michler & William A. Masters & Anna Josephson, 2021. "Research ethics beyond the IRB: Selection bias and the direction of innovation in applied economics," Applied Economic Perspectives and Policy, John Wiley & Sons, vol. 43(4), pages 1352-1365, December.
    19. Burlig, Fiona, 2018. "Improving transparency in observational social science research: A pre-analysis plan approach," Economics Letters, Elsevier, vol. 168(C), pages 56-60.
    20. Travis J. Lybbert & Steven T. Buccola, 2021. "The evolving ethics of analysis, publication, and transparency in applied economics," Applied Economic Perspectives and Policy, John Wiley & Sons, vol. 43(4), pages 1330-1351, December.
    21. Laitin, David D., 2013. "Fisheries Management," Political Analysis, Cambridge University Press, vol. 21(1), pages 42-47, January.
    22. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    23. Lucas C. Coffman & Muriel Niederle, 2015. "Pre-analysis Plans Have Limited Upside, Especially Where Replications Are Feasible," Journal of Economic Perspectives, American Economic Association, vol. 29(3), pages 81-98, Summer.
    24. Fafchamps, Marcel & Labonne, Julien, 2017. "Using Split Samples to Improve Inference on Causal Effects," Political Analysis, Cambridge University Press, vol. 25(4), pages 465-482, October.
    25. Chang, Andrew C., 2018. "A replication recipe: List your ingredients before you start cooking," Economics - The Open-Access, Open-Assessment E-Journal (2007-2020), Kiel Institute for the World Economy (IfW Kiel), vol. 12, pages 1-8.
    26. Andrew C. Chang & Phillip Li, 2015. "Is Economics Research Replicable? Sixty Published Papers from Thirteen Journals Say "Usually Not"," Finance and Economics Discussion Series 2015-83, Board of Governors of the Federal Reserve System (U.S.).
    27. Sarah Janzen & Nicholas Magnan & Conner Mullally & Soye Shin & I. Bailey Palmer & Judith Oduol & Karl Hughes, 2021. "Can Experiential Games and Improved Risk Coverage Raise Demand for Index Insurance? Evidence from Kenya," American Journal of Agricultural Economics, John Wiley & Sons, vol. 103(1), pages 338-361, January.
    28. Michael L. Anderson & Jeremy Magruder, 2017. "Split-Sample Strategies for Avoiding False Discoveries," NBER Working Papers 23544, National Bureau of Economic Research, Inc.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Michler, Jeffrey D. & Josephson, Anna & Kilic, Talip & Murray, Siobhan, 2022. "Privacy protection, measurement error, and the integration of remote sensing and socioeconomic survey data," Journal of Development Economics, Elsevier, vol. 158(C).
    2. Jeffrey D. Michler & Anna Josephson, 2022. "Recent developments in inference: practicalities for applied economics," Chapters, in: A Modern Guide to Food Economics, chapter 11, pages 235-268, Edward Elgar Publishing.
    3. Brüderle, Mirjam Anna & Peters, Jörg & Roberts, Gareth, 2022. "Weather and crime: Cautious evidence from South Africa," Ruhr Economic Papers 940, RWI - Leibniz-Institut für Wirtschaftsforschung, Ruhr-University Bochum, TU Dortmund University, University of Duisburg-Essen.
    4. Clemens, Jeffrey & Strain, Michael R., 2021. "The Heterogeneous Effects of Large and Small Minimum Wage Changes: Evidence over the Short and Medium Run Using a Pre-analysis Plan," IZA Discussion Papers 14747, Institute of Labor Economics (IZA).
    5. Andrew C. Chang & Linda R. Cohen & Amihai Glazer & Urbashee Paul, 2021. "Politicians Avoid Tax Increases Around Elections," Finance and Economics Discussion Series 2021-004, Board of Governors of the Federal Reserve System (U.S.).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Josephson, Anna & Michler, Jeffrey D., 2018. "Viewpoint: Beasts of the field? Ethics in agricultural and applied economics," Food Policy, Elsevier, vol. 79(C), pages 1-11.
    2. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    3. Abel Brodeur & Nikolai M. Cook & Jonathan S. Hartley & Anthony Heyes, 2022. "Do Pre-Registration and Pre-analysis Plans Reduce p-Hacking and Publication Bias?," LCERPA Working Papers am0132, Laurier Centre for Economic Research and Policy Analysis.
    4. Burlig, Fiona, 2018. "Improving transparency in observational social science research: A pre-analysis plan approach," Economics Letters, Elsevier, vol. 168(C), pages 56-60.
    5. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    6. Christensen, Garret & Miguel, Edward & Sturdy, Jennifer, 2017. "Transparency, Reproducibility, and the Credibility of Economics Research," MetaArXiv 9a3rw, Center for Open Science.
    7. Bruno Ferman & Cristine Pinto & Vitor Possebom, 2020. "Cherry Picking with Synthetic Controls," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 39(2), pages 510-532, March.
    8. Bogdanoski, Aleksandar & Ofosu, George & Posner, Daniel N, 2019. "Pre-analysis Plans: A Stocktaking," MetaArXiv e4pum, Center for Open Science.
    9. Furukawa, Chishio, 2019. "Publication Bias under Aggregation Frictions: Theory, Evidence, and a New Correction Method," EconStor Preprints 194798, ZBW - Leibniz Information Centre for Economics.
    10. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    11. Mueller-Langer, Frank & Fecher, Benedikt & Harhoff, Dietmar & Wagner, Gert G., 2019. "Replication studies in economics—How many and which papers are chosen for replication, and why?," Research Policy, Elsevier, vol. 48(1), pages 62-83.
    12. Heckelei, Thomas & Huettel, Silke & Odening, Martin & Rommel, Jens, 2021. "The replicability crisis and the p-value debate – what are the consequences for the agricultural and food economics community?," Discussion Papers 316369, University of Bonn, Institute for Food and Resource Economics.
    13. Dominika Ehrenbergerova & Josef Bajzik & Tomas Havranek, 2023. "When Does Monetary Policy Sway House Prices? A Meta-Analysis," IMF Economic Review, Palgrave Macmillan;International Monetary Fund, vol. 71(2), pages 538-573, June.
    14. Lucas C. Coffman & Muriel Niederle, 2015. "Pre-analysis Plans Have Limited Upside, Especially Where Replications Are Feasible," Journal of Economic Perspectives, American Economic Association, vol. 29(3), pages 81-98, Summer.
    15. Muhammad Haseeb & Kate Vyborny, 2016. "Imposing institutions: Evidence from cash transfer reform in Pakistan," CSAE Working Paper Series 2016-36, Centre for the Study of African Economies, University of Oxford.
    16. Lionel Page & Charles N. Noussair & Robert Slonim, 2021. "The replication crisis, the rise of new research practices and what it means for experimental economics," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 7(2), pages 210-225, December.
    17. Bruns, Stephan B. & Asanov, Igor & Bode, Rasmus & Dunger, Melanie & Funk, Christoph & Hassan, Sherif M. & Hauschildt, Julia & Heinisch, Dominik & Kempa, Karol & König, Johannes & Lips, Johannes & Verb, 2019. "Reporting errors and biases in published empirical findings: Evidence from innovation research," Research Policy, Elsevier, vol. 48(9), pages 1-1.
    18. Opoku-Agyemang, Kweku A., 2017. "A Human-Computer Interaction Approach for Integrity in Economics," SocArXiv ra3cs, Center for Open Science.
    19. Stefano DellaVigna & Elizabeth Linos, 2022. "RCTs to Scale: Comprehensive Evidence From Two Nudge Units," Econometrica, Econometric Society, vol. 90(1), pages 81-116, January.
    20. Jasper Brinkerink, 2023. "When Shooting for the Stars Becomes Aiming for Asterisks: P-Hacking in Family Business Research," Entrepreneurship Theory and Practice, , vol. 47(2), pages 304-343, March.

    More about this item

    JEL classification:

    • A14 - General Economics and Teaching - - General Economics - - - Sociology of Economics
    • B41 - Schools of Economic Thought and Methodology - - Economic Methodology - - - Economic Methodology
    • C12 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Hypothesis Testing: General
    • C18 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Methodological Issues: General
    • C90 - Mathematical and Quantitative Methods - - Design of Experiments - - - General
    • O10 - Economic Development, Innovation, Technological Change, and Growth - - Economic Development - - - General
    • Q00 - Agricultural and Natural Resource Economics; Environmental and Ecological Economics - - General - - - General


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:metaar:wkmht. See general information about how to correct material in RePEc.


    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF (email available below). General contact details of provider: https://osf.io/preprints/metaarxiv.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis . RePEc uses bibliographic data supplied by the respective publishers.