
Why Experimenters Might Not Always Want to Randomize, and What They Could Do Instead

Author

  • Kasy, Maximilian

Abstract

Suppose that an experimenter has collected a sample as well as baseline information about the units in the sample. How should she allocate treatments to the units in this sample? We argue that the answer does not involve randomization if we think of experimental design as a statistical decision problem. If, for instance, the experimenter is interested in estimating the average treatment effect and evaluates an estimate in terms of the squared error, then she should minimize the expected mean squared error (MSE) through choice of a treatment assignment. We provide explicit expressions for the expected MSE that lead to easily implementable procedures for experimental design.
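The idea can be illustrated with a small sketch. The snippet below is a minimal illustration, not the paper's actual procedure: it assumes a Bayesian linear prior f(x) = x'beta with beta ~ N(0, prior_var * I) and homoskedastic noise, under which the expected MSE of the difference-in-means estimator reduces to a covariate-imbalance term plus a sampling-variance term. The helper names (`expected_mse`, `optimal_assignment`) are hypothetical.

```python
import itertools
import numpy as np

def expected_mse(d, X, prior_var=1.0, noise_var=1.0):
    """Expected MSE of the difference-in-means ATE estimator for assignment d.

    Illustrative assumption (not the paper's general nonparametric setup):
    outcomes follow f(x) = x'beta with beta ~ N(0, prior_var * I) and i.i.d.
    noise of variance noise_var. Under this prior, the expected MSE depends
    on the assignment only through group sizes and covariate imbalance.
    """
    n1 = int(d.sum())
    n0 = len(d) - n1
    # Imbalance in covariate means between treated and control groups.
    imbalance = X[d == 1].mean(axis=0) - X[d == 0].mean(axis=0)
    expected_sq_bias = prior_var * float(imbalance @ imbalance)
    sampling_variance = noise_var * (1.0 / n1 + 1.0 / n0)
    return expected_sq_bias + sampling_variance

def optimal_assignment(X, n_treated):
    """Deterministically pick the assignment minimizing expected MSE.

    Brute-force enumeration over all balanced assignments; feasible only
    for small samples.
    """
    n = X.shape[0]
    best_d, best_mse = None, np.inf
    for treated in itertools.combinations(range(n), n_treated):
        d = np.zeros(n, dtype=int)
        d[list(treated)] = 1
        mse = expected_mse(d, X)
        if mse < best_mse:
            best_d, best_mse = d, mse
    return best_d, best_mse

rng = np.random.default_rng(0)
X = rng.normal(size=(12, 2))   # baseline covariates for 12 units
d_star, mse_star = optimal_assignment(X, n_treated=6)
print("optimal assignment:", d_star)
print("expected MSE:", round(mse_star, 4))
```

Exhaustive enumeration is infeasible beyond small samples; the explicit expected-MSE expressions derived in the paper are what make the discrete optimization easily implementable in practice.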

Suggested Citation

  • Kasy, Maximilian, 2016. "Why Experimenters Might Not Always Want to Randomize, and What They Could Do Instead," Political Analysis, Cambridge University Press, vol. 24(3), pages 324-338, July.
  • Handle: RePEc:cup:polals:v:24:y:2016:i:03:p:324-338_01

    Download full text from publisher

    File URL: https://www.cambridge.org/core/product/identifier/S1047198700014066/type/journal_article
    File Function: link to article abstract page
    Download Restriction: no


    Citations

    Cited by:

    1. Pedro Carneiro & Sokbae Lee & Daniel Wilhelm, 2020. "Optimal data collection for randomized control trials," The Econometrics Journal, Royal Economic Society, vol. 23(1), pages 1-31.
    2. Timothy B. Armstrong & Shu Shen, 2013. "Inference on Optimal Treatment Assignments," Cowles Foundation Discussion Papers 1927R, Cowles Foundation for Research in Economics, Yale University, revised Apr 2014.
    3. Sven Resnjanskij & Jens Ruhose & Simon Wiederhold & Ludger Woessmann & Katharina Wedel, 2024. "Can Mentoring Alleviate Family Disadvantage in Adolescence? A Field Experiment to Improve Labor Market Prospects," Journal of Political Economy, University of Chicago Press, vol. 132(3), pages 1013-1062.
    4. Abhijit Banerjee & Sylvain Chassang & Erik Snowberg, 2016. "Decision Theoretic Approaches to Experiment Design and External Validity," NBER Working Papers 22167, National Bureau of Economic Research, Inc.
    5. Gerhard Riener & Sebastian Schneider & Valentin Wagner, 2020. "Addressing Validity and Generalizability Concerns in Field Experiments," Discussion Paper Series of the Max Planck Institute for Research on Collective Goods 2020_16, Max Planck Institute for Research on Collective Goods.
    6. Davide Viviano, 2020. "Experimental Design under Network Interference," Papers 2003.08421, arXiv.org, revised Jul 2022.
    7. Aufenanger, Tobias, 2017. "Machine learning to improve experimental design," FAU Discussion Papers in Economics 16/2017, Friedrich-Alexander University Erlangen-Nuremberg, Institute for Economics, revised 2017.
    8. Drazen, Allan & Dreber, Anna & Ozbay, Erkut Y. & Snowberg, Erik, 2021. "Journal-based replication of experiments: An application to “Being Chosen to Lead”," Journal of Public Economics, Elsevier, vol. 202(C).
    9. Yusuke Narita, 2018. "Experiment-as-Market: Incorporating Welfare into Randomized Controlled Trials," Cowles Foundation Discussion Papers 2127r, Cowles Foundation for Research in Economics, Yale University, revised May 2019.
    10. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    11. Max Tabord-Meehan, 2018. "Stratification Trees for Adaptive Randomization in Randomized Controlled Trials," Papers 1806.05127, arXiv.org, revised Jul 2022.
    12. Moshe Justman, 2016. "Economic Research and Education Policy: Project STAR and Class Size Reduction," Melbourne Institute Working Paper Series wp2016n37, Melbourne Institute of Applied Economic and Social Research, The University of Melbourne.
    13. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    14. Bai, Yuehao, 2023. "Why randomize? Minimax optimality under permutation invariance," Journal of Econometrics, Elsevier, vol. 232(2), pages 565-575.
    15. Roland G. Fryer, Jr, 2013. "Information and Student Achievement: Evidence from a Cellular Phone Experiment," NBER Working Papers 19113, National Bureau of Economic Research, Inc.
    16. Max Cytrynbaum, 2021. "Optimal Stratification of Survey Experiments," Papers 2111.08157, arXiv.org, revised Aug 2023.
    17. Yiping Lu & Jiajin Li & Lexing Ying & Jose Blanchet, 2022. "Synthetic Principal Component Design: Fast Covariate Balancing with Synthetic Controls," Papers 2211.15241, arXiv.org.
    18. Alfredo Di Tillio & Marco Ottaviani & Peter Norman Sørensen, 2021. "Strategic Sample Selection," Econometrica, Econometric Society, vol. 89(2), pages 911-953, March.
    19. Justman, Moshe, 2018. "Randomized controlled trials informing public policy: Lessons from project STAR and class size reduction," European Journal of Political Economy, Elsevier, vol. 54(C), pages 167-174.
    20. Pedro Carneiro & Sokbae (Simon) Lee & Daniel Wilhelm, 2017. "Optimal data collection for randomized control trials," CeMMAP working papers 45/17, Institute for Fiscal Studies.
    21. Mogues, Tewodaj & Van Campenhout, Bjorn & Miehe, Caroline & Kabunga, Nassul, 2023. "The impact of community-based monitoring on public service delivery: A randomized control trial in Uganda," World Development, Elsevier, vol. 172(C).
    22. Abhijit Banerjee & Sylvain Chassang & Sergio Montero & Erik Snowberg, 2017. "A Theory of Experimenters," CESifo Working Paper Series 6678, CESifo.
    23. Fryer, Roland G., 2016. "Information, non-financial incentives, and student achievement: Evidence from a text messaging experiment," Journal of Public Economics, Elsevier, vol. 144(C), pages 109-121.
    24. Esposito Acosta, Bruno Nicola & Sautmann, Anja, 2022. "Adaptive Experiments for Policy Choice: Phone Calls for Home Reading in Kenya," Policy Research Working Paper Series 10098, The World Bank.
    25. Aufenanger, Tobias, 2018. "Treatment allocation for linear models," FAU Discussion Papers in Economics 14/2017, Friedrich-Alexander University Erlangen-Nuremberg, Institute for Economics, revised 2018.
