Printed from https://ideas.repec.org/p/nbr/nberwo/21842.html

Using Split Samples to Improve Inference about Causal Effects

Author

Listed:
  • Marcel Fafchamps
  • Julien Labonne

Abstract

We discuss a method aimed at reducing the risk that spurious results are published. Researchers send their datasets to an independent third party, who randomly generates training and testing samples. Researchers perform their analysis on the former; once the paper is accepted for publication, the same analysis is applied to the latter, and it is those results that are published. Simulations indicate that, under empirically relevant settings, the proposed method significantly reduces type I error and delivers adequate power. The method, which can be combined with pre-analysis plans, reduces the risk that relevant hypotheses are left untested.
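The core mechanism the abstract describes — an independent third party randomly partitioning the data into a training sample (for exploratory analysis) and a testing sample (held back until the paper is accepted) — can be sketched in a few lines. This is an illustrative reconstruction, not code from the paper; the function name, the 50/50 split share, and the seed handling are all assumptions.

```python
# Hypothetical sketch of the third-party split described in the abstract.
# The data custodian keeps the seed; researchers receive only the training
# indices and see the testing indices after acceptance.
import random

def split_sample(n_obs, train_share=0.5, seed=42):
    """Randomly assign observation indices to training and testing samples."""
    rng = random.Random(seed)          # seed held by the independent third party
    indices = list(range(n_obs))
    rng.shuffle(indices)               # random, reproducible permutation
    cut = int(n_obs * train_share)
    train = sorted(indices[:cut])      # released to researchers for exploration
    test = sorted(indices[cut:])       # held back until the paper is accepted
    return train, test

train_idx, test_idx = split_sample(1000)
```

Because the seed stays with the third party, researchers cannot re-run the split until it favors their hypothesis; the testing sample delivers one clean, pre-committed set of estimates.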

Suggested Citation

  • Marcel Fafchamps & Julien Labonne, 2016. "Using Split Samples to Improve Inference about Causal Effects," NBER Working Papers 21842, National Bureau of Economic Research, Inc.
  • Handle: RePEc:nbr:nberwo:21842
    Note: DEV

    Download full text from publisher

    File URL: http://www.nber.org/papers/w21842.pdf
    Download Restriction: no

    References listed on IDEAS

    1. Yoav Benjamini & Abba M. Krieger & Daniel Yekutieli, 2006. "Adaptive linear step-up procedures that control the false discovery rate," Biometrika, Biometrika Trust, vol. 93(3), pages 491-507, September.
    2. Alexandre Belloni & Victor Chernozhukov & Christian Hansen, 2014. "High-Dimensional Methods and Inference on Structural and Treatment Effects," Journal of Economic Perspectives, American Economic Association, vol. 28(2), pages 29-50, Spring.
    3. Katherine Casey & Rachel Glennerster & Edward Miguel, 2012. "Reshaping Institutions: Evidence on Aid Impacts Using a Preanalysis Plan," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 127(4), pages 1755-1812.
4. Eva Vivalt, 2020. "How Much Can We Generalize From Impact Evaluations?," Journal of the European Economic Association, European Economic Association, vol. 18(6), pages 3045-3089.
    5. Brendon McConnell & Marcos Vera-Hernandez, 2015. "Going beyond simple sample size calculations: a practitioner's guide," IFS Working Papers W15/17, Institute for Fiscal Studies.
    6. Angrist, Joshua D & Krueger, Alan B, 1995. "Split-Sample Instrumental Variables Estimates of the Return to Schooling," Journal of Business & Economic Statistics, American Statistical Association, vol. 13(2), pages 225-235, April.
    7. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    8. Gerber, Alan & Malhotra, Neil, 2008. "Do Statistical Reporting Standards Affect What Is Published? Publication Bias in Two Leading Political Science Journals," Quarterly Journal of Political Science, now publishers, vol. 3(3), pages 313-326, October.
9. Joshua D. Angrist & Alan B. Krueger, 1993. "Split Sample Instrumental Variables," Working Papers 699, Princeton University, Department of Economics, Industrial Relations Section.
    10. Athey, Susan & Imbens, Guido W., 2015. "Machine Learning for Estimating Heterogeneous Causal Effects," Research Papers 3350, Stanford University, Graduate School of Business.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Hu, Juncheng, 2021. "Do facilitation payments affect earnings management? Evidence from China," Journal of Corporate Finance, Elsevier, vol. 68(C).
    2. Nicola Fuchs‐Schündeln & Paolo Masella & Hannah Paule‐Paludkiewicz, 2020. "Cultural Determinants of Household Saving Behavior," Journal of Money, Credit and Banking, Blackwell Publishing, vol. 52(5), pages 1035-1070, August.
    3. Nosek, Brian A. & Ebersole, Charles R. & DeHaven, Alexander Carl & Mellor, David Thomas, 2018. "The Preregistration Revolution," OSF Preprints 2dxu5, Center for Open Science.
    4. Bedoya Arguelles, Guadalupe & Bittarello, Luca & Davis, Jonathan Martin Villars & Mittag, Nikolas Karl, 2017. "Distributional impact analysis: toolkit and illustrations of impacts beyond the average treatment effect," Policy Research Working Paper Series 8139, The World Bank.
    5. Marcel Fafchamps & Julien Labonne, 2020. "Family Networks and Distributive Politics," Journal of the European Economic Association, European Economic Association, vol. 18(4), pages 1697-1725.
    6. Edward Miguel, 2021. "Evidence on Research Transparency in Economics," Journal of Economic Perspectives, American Economic Association, vol. 35(3), pages 193-214, Summer.
    7. Muhammad Haseeb & Kate Vyborny, 2016. "Imposing institutions: Evidence from cash transfer reform in Pakistan," CSAE Working Paper Series 2016-36, Centre for the Study of African Economies, University of Oxford.
    8. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Fafchamps, Marcel & Labonne, Julien, 2017. "Using Split Samples to Improve Inference on Causal Effects," Political Analysis, Cambridge University Press, vol. 25(4), pages 465-482, October.
    2. Fernando Hoces de la Guardia & Sean Grant & Edward Miguel, 2021. "A framework for open policy analysis," Science and Public Policy, Oxford University Press, vol. 48(2), pages 154-163.
    3. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2018. "Methods Matter: P-Hacking and Causal Inference in Economics," IZA Discussion Papers 11796, Institute of Labor Economics (IZA).
    4. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    5. Josephson, Anna & Michler, Jeffrey D., 2018. "Viewpoint: Beasts of the field? Ethics in agricultural and applied economics," Food Policy, Elsevier, vol. 79(C), pages 1-11.
    6. Brodeur, Abel & Cook, Nikolai & Hartley, Jonathan & Heyes, Anthony, 2022. "Do Pre-Registration and Pre-analysis Plans Reduce p-Hacking and Publication Bias?," MetaArXiv uxf39, Center for Open Science.
    7. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    8. Lenz, Gabriel & Sahn, Alexander, 2017. "Achieving Statistical Significance with Covariates and without Transparency," MetaArXiv s42ba, Center for Open Science.
    9. Jeffrey D. Michler & Anna Josephson, 2022. "Recent developments in inference: practicalities for applied economics," Chapters, in: A Modern Guide to Food Economics, chapter 11, pages 235-268, Edward Elgar Publishing.
    10. Igor Asanov & Christoph Buehren & Panagiota Zacharodimou, 2020. "The power of experiments: How big is your n?," MAGKS Papers on Economics 202032, Philipps-Universität Marburg, Faculty of Business Administration and Economics, Department of Economics (Volkswirtschaftliche Abteilung).
    11. Muhammad Haseeb & Kate Vyborny, 2016. "Imposing institutions: Evidence from cash transfer reform in Pakistan," CSAE Working Paper Series 2016-36, Centre for the Study of African Economies, University of Oxford.
    12. Alex Eble & Peter Boone & Diana Elbourne, 2017. "On Minimizing the Risk of Bias in Randomized Controlled Trials in Economics," The World Bank Economic Review, World Bank, vol. 31(3), pages 687-707.
    13. Michael L. Anderson & Jeremy Magruder, 2017. "Split-Sample Strategies for Avoiding False Discoveries," NBER Working Papers 23544, National Bureau of Economic Research, Inc.
    14. Brodeur, Abel & Cook, Nikolai M. & Hartley, Jonathan S. & Heyes, Anthony, 2023. "Do Pre-Registration and Pre-Analysis Plans Reduce p-Hacking and Publication Bias?: Evidence from 15,992 Test Statistics and Suggestions for Improvement," GLO Discussion Paper Series 1147 [pre.], Global Labor Organization (GLO).
    15. Stephan B. Bruns, 2016. "The Fragility of Meta-Regression Models in Observational Research," MAGKS Papers on Economics 201603, Philipps-Universität Marburg, Faculty of Business Administration and Economics, Department of Economics (Volkswirtschaftliche Abteilung).
    16. Bogdanoski, Aleksandar & Ofosu, George & Posner, Daniel N, 2019. "Pre-analysis Plans: A Stocktaking," MetaArXiv e4pum, Center for Open Science.
    17. Furukawa, Chishio, 2019. "Publication Bias under Aggregation Frictions: Theory, Evidence, and a New Correction Method," EconStor Preprints 194798, ZBW - Leibniz Information Centre for Economics.
    18. Davide Viviano & Jelena Bradic, 2019. "Synthetic learner: model-free inference on treatments over time," Papers 1904.01490, arXiv.org, revised Aug 2022.
    19. A. Belloni & D. Chen & V. Chernozhukov & C. Hansen, 2012. "Sparse Models and Methods for Optimal Instruments With an Application to Eminent Domain," Econometrica, Econometric Society, vol. 80(6), pages 2369-2429, November.
    20. Marco Manacorda & Andrea Tesei, 2020. "Liberation Technology: Mobile Phones and Political Mobilization in Africa," Econometrica, Econometric Society, vol. 88(2), pages 533-567, March.

    More about this item

    JEL classification:

    • C12 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Hypothesis Testing: General
    • C18 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Methodological Issues: General

    NEP fields

    This paper has been announced in the following NEP Reports:

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nbr:nberwo:21842. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact the person in charge (email available below). General contact details of provider: https://edirc.repec.org/data/nberrus.html

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.