Printed from https://ideas.repec.org/p/zbw/iwqwdp/142017.html

Treatment allocation for linear models

Author

Listed:
  • Aufenanger, Tobias

Abstract

Methods of systematically balanced treatment allocation for economic experiments, as an alternative to random allocation, have gained increasing attention in recent years. This paper analyzes the benefits and the limits of systematic treatment allocation within a linear model framework. Linear models do not require the treatment allocation to be random: since the variance of the treatment estimator in a linear model does not depend on the realization of the dependent variable, whenever covariate information is available before treatments are assigned, treatments can be allocated so as to minimize the variance of the treatment estimator. I show that every experiment satisfying the linear model assumptions admits at least one deterministic optimal design, i.e., a deterministic allocation of treatments that minimizes the variance of the treatment estimator over all alternative allocations. In finite samples, optimal design reduces the variance of the treatment estimator and increases statistical power relative to random allocation. For a given linear model with m covariates, optimal design reduces the sample size required to reach a predefined power by approximately m observations. Asymptotically, however, as the sample size goes to infinity, neither optimal design nor any alternative design offers any benefit over random allocation.
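The key observation in the abstract can be illustrated with a small simulation. In a linear model y = a + τ·d + Xβ + ε, the variance of the OLS estimator of τ is σ²·[(Z′Z)⁻¹]₁₁ with Z = [1, d, X], which depends on the allocation d and the covariates X but not on the outcomes y. The sketch below (not the paper's implementation; sample size, covariate distribution, and the brute-force search are illustrative assumptions) finds the deterministic allocation that minimizes this variance over all balanced assignments and compares it with random balanced allocation.

```python
# Illustrative sketch, not the paper's algorithm: with covariates known
# in advance, Var(tau_hat) = sigma^2 * [(Z'Z)^{-1}]_{11} can be minimized
# over allocations d before any outcome is observed.
import itertools
import numpy as np

def tau_variance(d, X, sigma2=1.0):
    """Variance of the OLS treatment estimator for allocation vector d."""
    n = len(d)
    Z = np.column_stack([np.ones(n), d, X])  # intercept, treatment, covariates
    return sigma2 * np.linalg.inv(Z.T @ Z)[1, 1]

rng = np.random.default_rng(0)
n, m = 12, 2                      # small n so exhaustive search is feasible
X = rng.normal(size=(n, m))

# Deterministic optimal design: brute-force over all balanced allocations.
best_v = np.inf
for treated in itertools.combinations(range(n), n // 2):
    d = np.zeros(n)
    d[list(treated)] = 1.0
    best_v = min(best_v, tau_variance(d, X))

# Benchmark: variance under random balanced allocation.
random_vs = []
for _ in range(200):
    d = np.zeros(n)
    d[rng.choice(n, n // 2, replace=False)] = 1.0
    random_vs.append(tau_variance(d, X))

print(best_v, np.mean(random_vs))
```

By construction the optimized allocation's variance is no larger than that of any random balanced draw; the gap is the finite-sample power gain the abstract describes, and it shrinks as n grows.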

Suggested Citation

  • Aufenanger, Tobias, 2018. "Treatment allocation for linear models," FAU Discussion Papers in Economics 14/2017, Friedrich-Alexander University Erlangen-Nuremberg, Institute for Economics, revised 2018.
  • Handle: RePEc:zbw:iwqwdp:142017

    Download full text from publisher

    File URL: https://www.econstor.eu/bitstream/10419/179521/1/14-2017-2.pdf
    Download Restriction: no

    References listed on IDEAS

    1. Uwe Saint-Mont, 2015. "Randomization Does Not Help Much, Comparability Does," PLOS ONE, Public Library of Science, vol. 10(7), pages 1-24, July.
    2. Jinyong Hahn & Keisuke Hirano & Dean Karlan, 2011. "Adaptive Experimental Design Using the Propensity Score," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 29(1), pages 96-108, January.
    3. Angus Deaton, 2010. "Instruments, Randomization, and Learning about Development," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 424-455, June.
    4. Lu, Bo & Greevy, Robert & Xu, Xinyi & Beck, Cole, 2011. "Optimal Nonbipartite Matching and Its Statistical Applications," The American Statistician, American Statistical Association, vol. 65(1), pages 21-30.
    5. Pedro Carneiro & Sokbae Lee & Daniel Wilhelm, 2020. "Optimal data collection for randomized control trials [Microcredit impacts: Evidence from a randomized microcredit program placement experiment by Compartamos Banco]," The Econometrics Journal, Royal Economic Society, vol. 23(1), pages 1-31.
    6. Manning, Willard G, et al, 1987. "Health Insurance and the Demand for Medical Care: Evidence from a Randomized Experiment," American Economic Review, American Economic Association, vol. 77(3), pages 251-277, June.
    7. John Horton & David Rand & Richard Zeckhauser, 2011. "The online laboratory: conducting experiments in a real labor market," Experimental Economics, Springer;Economic Science Association, vol. 14(3), pages 399-425, September.
    8. Potters, Jan & Stoop, Jan, 2016. "Do cheaters in the lab also cheat in the field?," European Economic Review, Elsevier, vol. 87(C), pages 26-33.
    9. John List & Sally Sadoff & Mathis Wagner, 2011. "So you want to run an experiment, now what? Some simple rules of thumb for optimal experimental design," Experimental Economics, Springer;Economic Science Association, vol. 14(4), pages 439-457, November.
    10. Kasy, Maximilian, 2016. "Why Experimenters Might Not Always Want to Randomize, and What They Could Do Instead," Political Analysis, Cambridge University Press, vol. 24(3), pages 324-338, July.
    11. Gary Kochenberger & Jin-Kao Hao & Fred Glover & Mark Lewis & Zhipeng Lü & Haibo Wang & Yang Wang, 2014. "The unconstrained binary quadratic programming problem: a survey," Journal of Combinatorial Optimization, Springer, vol. 28(1), pages 58-81, July.
    12. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    13. Duflo, Esther & Glennerster, Rachel & Kremer, Michael, 2008. "Using Randomization in Development Economics Research: A Toolkit," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 61, pages 3895-3962, Elsevier.
14. Angus Deaton & Nancy Cartwright, 2016. "Understanding and Misunderstanding Randomized Controlled Trials," Working Papers august_25.pdf, Princeton University, Woodrow Wilson School of Public and International Affairs, Research Program in Development Studies.
    15. Miriam Bruhn & David McKenzie, 2009. "In Pursuit of Balance: Randomization in Practice in Development Field Experiments," American Economic Journal: Applied Economics, American Economic Association, vol. 1(4), pages 200-232, October.
    16. Aigner, Dennis J & Balestra, Pietro, 1988. "Optimal Experimental Design for Error Components Models," Econometrica, Econometric Society, vol. 56(4), pages 955-971, July.
    17. Ziliak, Stephen T., 2014. "Balanced versus Randomized Field Experiments in Economics: Why W. S. Gosset aka "Student" Matters," Review of Behavioral Economics, now publishers, vol. 1(1-2), pages 167-208, January.
    18. Aigner, Dennis J., 1979. "A brief introduction to the methodology of optimal experimental design," Journal of Econometrics, Elsevier, vol. 11(1), pages 7-26, September.
    19. Lucifora, Claudio & Tonello, Marco, 2015. "Cheating and social interactions. Evidence from a randomized experiment in a national evaluation program," Journal of Economic Behavior & Organization, Elsevier, vol. 115(C), pages 45-66.
    20. Abhijit Banerjee & Sylvain Chassang & Erik Snowberg, 2016. "Decision Theoretic Approaches to Experiment Design and External Validity," NBER Working Papers 22167, National Bureau of Economic Research, Inc.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Aufenanger, Tobias, 2017. "Machine learning to improve experimental design," FAU Discussion Papers in Economics 16/2017, Friedrich-Alexander University Erlangen-Nuremberg, Institute for Economics, revised 2017.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    2. Pedro Carneiro & Sokbae Lee & Daniel Wilhelm, 2020. "Optimal data collection for randomized control trials [Microcredit impacts: Evidence from a randomized microcredit program placement experiment by Compartamos Banco]," The Econometrics Journal, Royal Economic Society, vol. 23(1), pages 1-31.
    3. Justman, Moshe, 2018. "Randomized controlled trials informing public policy: Lessons from project STAR and class size reduction," European Journal of Political Economy, Elsevier, vol. 54(C), pages 167-174.
    4. Moshe Justman, 2016. "Economic Research and Education Policy: Project STAR and Class Size Reduction," Melbourne Institute Working Paper Series wp2016n37, Melbourne Institute of Applied Economic and Social Research, The University of Melbourne.
    5. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    6. Pedro Carneiro & Sokbae (Simon) Lee & Daniel Wilhelm, 2017. "Optimal data collection for randomized control trials," CeMMAP working papers 45/17, Institute for Fiscal Studies.
    7. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," The World Bank Research Observer, World Bank, vol. 33(1), pages 34-64.
    8. Karthik Muralidharan & Mauricio Romero & Kaspar Wüthrich, 2019. "Factorial Designs, Model Selection, and (Incorrect) Inference in Randomized Experiments," NBER Working Papers 26562, National Bureau of Economic Research, Inc.
    9. Max Tabord-Meehan, 2018. "Stratification Trees for Adaptive Randomization in Randomized Controlled Trials," Papers 1806.05127, arXiv.org, revised Jul 2022.
    10. Florent Bédécarrats & Isabelle Guérin & François Roubaud, 2019. "All that Glitters is not Gold. The Political Economy of Randomized Evaluations in Development," Development and Change, International Institute of Social Studies, vol. 50(3), pages 735-762, May.
    11. Susan Athey & Guido Imbens, 2016. "The Econometrics of Randomized Experiments," Papers 1607.00698, arXiv.org.
    12. Aufenanger, Tobias, 2017. "Machine learning to improve experimental design," FAU Discussion Papers in Economics 16/2017, Friedrich-Alexander University Erlangen-Nuremberg, Institute for Economics, revised 2017.
    13. Martin, Will, 2021. "Tools for measuring the full impacts of agricultural interventions," IFPRI-MCC technical papers 2, International Food Policy Research Institute (IFPRI).
    14. Cristina Corduneanu-Huci & Michael T. Dorsch & Paul Maarek, 2017. "Learning to constrain: Political competition and randomized controlled trials in development," THEMA Working Papers 2017-24, THEMA (THéorie Economique, Modélisation et Applications), Université de Cergy-Pontoise.
    15. Yusuke Narita, 2018. "Experiment-as-Market: Incorporating Welfare into Randomized Controlled Trials," Cowles Foundation Discussion Papers 2127r, Cowles Foundation for Research in Economics, Yale University, revised May 2019.
    16. Bravo-Ureta, Boris E. & Higgins, Daniel & Arslan, Aslihan, 2020. "Irrigation infrastructure and farm productivity in the Philippines: A stochastic Meta-Frontier analysis," World Development, Elsevier, vol. 135(C).
    17. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    18. Víctor Casero-Alonso & Jesús López-Fidalgo, 2015. "Experimental designs in triangular simultaneous equations models," Statistical Papers, Springer, vol. 56(2), pages 273-290, May.
    19. Lota Tamini & Ibrahima Bocoum & Ghislain Auger & Kotchikpa Gabriel Lawin & Arahama Traoré, 2019. "Enhanced Microfinance Services and Agricultural Best Management Practices: What Benefits for Smallholders Farmers? An Evidence from Burkina Faso," CIRANO Working Papers 2019s-11, CIRANO.
    20. Pedro Carneiro & Sokbae (Simon) Lee & Daniel Wilhelm, 2016. "Optimal data collection for randomized control trials," CeMMAP working papers 15/16, Institute for Fiscal Studies.

    More about this item

    Keywords

    experiment design; treatment allocation

    JEL classification:

    • C90 - Mathematical and Quantitative Methods - - Design of Experiments - - - General
    • C61 - Mathematical and Quantitative Methods - - Mathematical Methods; Programming Models; Mathematical and Simulation Modeling - - - Optimization Techniques; Programming Models; Dynamic Analysis



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.