
Why Experimenters Might Not Always Want to Randomize, and What They Could Do Instead


  • Kasy, Maximilian


This paper discusses experimental design for the case in which (i) we are given a distribution of covariates from a pre-selected random sample, and (ii) we are interested in the average treatment effect (ATE) of some binary treatment. We show that, in general, there is a unique optimal non-random treatment assignment when there are continuous covariates, and we argue that experimenters should choose this assignment. The optimal assignment minimizes the risk (e.g., expected squared error) of treatment effect estimators. We provide explicit expressions for this risk and discuss algorithms that minimize it. The objective of controlled trials is to have treatment groups that are similar a priori (balanced), so that we can ``compare apples with apples.'' The risk expressions derived in this paper operationalize this notion of balance. The intuition for our non-randomization result mirrors the argument against using randomized estimators: adding noise can never decrease risk. The formal setup we consider is decision-theoretic and nonparametric. In simulations and in an application to Project STAR, we find that optimal designs have mean squared errors up to 20% lower than randomized designs and up to 14% lower than stratified designs.
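The core idea in the abstract — pick the one deterministic assignment that minimizes a risk criterion, rather than drawing an assignment at random — can be sketched in a few lines. The sketch below is illustrative only: it uses a simple squared difference in covariate means as a stand-in balance/risk criterion (the paper derives exact risk expressions, which this does not reproduce), and it searches candidate assignments by sampling rather than by the paper's algorithms. The function names `imbalance` and `optimal_assignment` are our own, hypothetical.

```python
import numpy as np

def imbalance(X, d):
    """Illustrative risk proxy (NOT the paper's risk expression):
    squared difference in covariate means between the treatment
    group (d == 1) and the control group (d == 0)."""
    diff = X[d == 1].mean(axis=0) - X[d == 0].mean(axis=0)
    return float(diff @ diff)

def optimal_assignment(X, n_treat, n_draws=10_000, seed=0):
    """Search candidate assignments and keep the one minimizing the
    imbalance criterion. Once the covariates X are fixed, the
    returned assignment is deterministic -- a non-random design."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    best_d, best_val = None, np.inf
    for _ in range(n_draws):
        d = np.zeros(n, dtype=int)
        d[rng.choice(n, size=n_treat, replace=False)] = 1
        val = imbalance(X, d)
        if val < best_val:
            best_d, best_val = d, val
    return best_d, best_val
```

Because the minimum over a larger candidate set can only be weakly smaller, the selected design is at least as balanced as a single random draw — the discrete analogue of the abstract's point that adding randomization noise can never decrease risk.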

Suggested Citation

  • Kasy, Maximilian, "undated". "Why Experimenters Might Not Always Want to Randomize, and What They Could Do Instead," Working Paper 36154, Harvard University OpenScholar.
  • Handle: RePEc:qsh:wpaper:36154


    Cited by:

    1. Abhijit Banerjee & Sylvain Chassang & Erik Snowberg, 2016. "Decision Theoretic Approaches to Experiment Design and External Validity," NBER Working Papers 22167, National Bureau of Economic Research, Inc.
    2. Gerhard Riener & Sebastian Schneider & Valentin Wagner, 2020. "Addressing Validity and Generalizability Concerns in Field Experiments," Discussion Paper Series of the Max Planck Institute for Research on Collective Goods 2020_16, Max Planck Institute for Research on Collective Goods.
    3. Davide Viviano, 2020. "Experimental Design under Network Interference," Papers 2003.08421, revised Jul 2022.
    4. Sven Resnjanskij & Jens Ruhose & Simon Wiederhold & Ludger Woessmann & Katharina Wedel, 2024. "Can Mentoring Alleviate Family Disadvantage in Adolescence? A Field Experiment to Improve Labor Market Prospects," Journal of Political Economy, University of Chicago Press, vol. 132(3), pages 1013-1062.
    5. Pedro Carneiro & Sokbae Lee & Daniel Wilhelm, 2020. "Optimal data collection for randomized control trials [Microcredit impacts: Evidence from a randomized microcredit program placement experiment by Compartamos Banco]," The Econometrics Journal, Royal Economic Society, vol. 23(1), pages 1-31.
    6. Aufenanger, Tobias, 2017. "Machine learning to improve experimental design," FAU Discussion Papers in Economics 16/2017, Friedrich-Alexander University Erlangen-Nuremberg, Institute for Economics, revised 2017.
    7. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    8. Max Tabord-Meehan, 2018. "Stratification Trees for Adaptive Randomization in Randomized Controlled Trials," Papers 1806.05127, revised Jul 2022.
    9. Roland G. Fryer, Jr, 2013. "Information and Student Achievement: Evidence from a Cellular Phone Experiment," NBER Working Papers 19113, National Bureau of Economic Research, Inc.
    10. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    11. Yiping Lu & Jiajin Li & Lexing Ying & Jose Blanchet, 2022. "Synthetic Principal Component Design: Fast Covariate Balancing with Synthetic Controls," Papers 2211.15241.
    12. Justman, Moshe, 2018. "Randomized controlled trials informing public policy: Lessons from project STAR and class size reduction," European Journal of Political Economy, Elsevier, vol. 54(C), pages 167-174.
    13. Pedro Carneiro & Sokbae (Simon) Lee & Daniel Wilhelm, 2017. "Optimal data collection for randomized control trials," CeMMAP working papers 45/17, Institute for Fiscal Studies.
    14. Timothy B. Armstrong & Shu Shen, 2013. "Inference on Optimal Treatment Assignments," Cowles Foundation Discussion Papers 1927RR, Cowles Foundation for Research in Economics, Yale University, revised Apr 2015.
    15. Abhijit Banerjee & Sylvain Chassang & Sergio Montero & Erik Snowberg, 2017. "A Theory of Experimenters," CESifo Working Paper Series 6678, CESifo.
    16. Esposito Acosta,Bruno Nicola & Sautmann,Anja, 2022. "Adaptive Experiments for Policy Choice : Phone Calls for Home Reading in Kenya," Policy Research Working Paper Series 10098, The World Bank.
    17. Yusuke Narita, 2018. "Experiment-as-Market: Incorporating Welfare into Randomized Controlled Trials," Cowles Foundation Discussion Papers 2127r, Cowles Foundation for Research in Economics, Yale University, revised May 2019.
    18. Drazen, Allan & Dreber, Anna & Ozbay, Erkut Y. & Snowberg, Erik, 2021. "Journal-based replication of experiments: An application to “Being Chosen to Lead”," Journal of Public Economics, Elsevier, vol. 202(C).
    19. Moshe Justman, 2016. "Economic Research and Education Policy: Project STAR and Class Size Reduction," Melbourne Institute Working Paper Series wp2016n37, Melbourne Institute of Applied Economic and Social Research, The University of Melbourne.
    20. Bai, Yuehao, 2023. "Why randomize? Minimax optimality under permutation invariance," Journal of Econometrics, Elsevier, vol. 232(2), pages 565-575.
    21. Max Cytrynbaum, 2021. "Optimal Stratification of Survey Experiments," Papers 2111.08157, revised Aug 2023.
    22. Alfredo Di Tillio & Marco Ottaviani & Peter Norman Sørensen, 2021. "Strategic Sample Selection," Econometrica, Econometric Society, vol. 89(2), pages 911-953, March.
    23. Mogues, Tewodaj & Van Campenhout, Bjorn & Miehe, Caroline & Kabunga, Nassul, 2023. "The impact of community-based monitoring on public service delivery: A randomized control trial in Uganda," World Development, Elsevier, vol. 172(C).
    24. Fryer, Roland G., 2016. "Information, non-financial incentives, and student achievement: Evidence from a text messaging experiment," Journal of Public Economics, Elsevier, vol. 144(C), pages 109-121.
    25. Aufenanger, Tobias, 2018. "Treatment allocation for linear models," FAU Discussion Papers in Economics 14/2017, Friedrich-Alexander University Erlangen-Nuremberg, Institute for Economics, revised 2018.




    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.