
A Theory of Experimenters

Author

Listed:
  • Abhijit Banerjee
  • Sylvain Chassang
  • Sergio Montero
  • Erik Snowberg

Abstract

This paper proposes a decision-theoretic framework for experiment design. We model experimenters as ambiguity-averse decision-makers who trade off subjective expected performance against robustness. This framework accounts for experimenters' preference for randomization, and clarifies the circumstances in which randomization is optimal: when the available sample size is large enough or robustness is an important concern. We illustrate the practical value of such a framework by studying the issue of rerandomization. Rerandomization creates a trade-off between subjective performance and robustness; however, the robustness loss grows very slowly with the number of times one randomizes. This argues for rerandomizing in most environments.
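The rerandomization logic sketched in the abstract can be made concrete with a small simulation: draw several candidate random assignments, score each by covariate balance, and keep the best one. The sketch below is only an illustration of that idea, not the paper's formal model; the function name, the Mahalanobis-style balance measure, and all parameter values are assumptions introduced for this example.

    import numpy as np

    def pick_best_balanced(X, n_treated, n_draws, seed=None):
        """Return the 0/1 assignment with the smallest covariate imbalance
        among `n_draws` independent candidate randomizations.

        Imbalance is measured as a Mahalanobis-style distance between
        treatment- and control-group covariate means (an illustrative
        choice, not the paper's formal criterion)."""
        rng = np.random.default_rng(seed)
        n, _ = X.shape
        S_inv = np.linalg.pinv(np.cov(X, rowvar=False))

        best_dist, best_assign = np.inf, None
        for _ in range(n_draws):
            # One candidate randomization: n_treated units assigned to treatment.
            assign = np.zeros(n, dtype=int)
            assign[rng.choice(n, size=n_treated, replace=False)] = 1
            gap = X[assign == 1].mean(axis=0) - X[assign == 0].mean(axis=0)
            dist = float(gap @ S_inv @ gap)
            if dist < best_dist:
                best_dist, best_assign = dist, assign
        return best_assign, best_dist

    # Compare pure randomization (one draw) with rerandomization (best of 20)
    # on simulated baseline covariates for 100 units.
    X = np.random.default_rng(0).normal(size=(100, 4))
    _, d1 = pick_best_balanced(X, n_treated=50, n_draws=1, seed=1)
    _, d20 = pick_best_balanced(X, n_treated=50, n_draws=20, seed=1)
    print(f"imbalance after a single draw: {d1:.3f}")
    print(f"imbalance after best of 20:    {d20:.3f}")

With more candidate draws the chosen assignment is typically better balanced on observed covariates, which is the sense in which rerandomization improves subjective performance; the paper's argument is that the accompanying robustness loss grows only slowly in the number of draws.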

Suggested Citation

  • Abhijit Banerjee & Sylvain Chassang & Sergio Montero & Erik Snowberg, 2017. "A Theory of Experimenters," NBER Working Papers 23867, National Bureau of Economic Research, Inc.
  • Handle: RePEc:nbr:nberwo:23867
    Note: DEV ED HC HE LS PE POL

    Download full text from publisher

    File URL: http://www.nber.org/papers/w23867.pdf
    Download Restriction: no


    References listed on IDEAS

    1. Federico A. Bugni & Ivan A. Canay & Azeem M. Shaikh, 2018. "Inference Under Covariate-Adaptive Randomization," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 113(524), pages 1784-1796, October.
    2. Humphreys, Macartan & Sanchez de la Sierra, Raul & van der Windt, Peter, 2013. "Fishing, Commitment, and Communication: A Proposal for Comprehensive Nonbinding Research Registration," Political Analysis, Cambridge University Press, vol. 21(1), pages 1-20, January.
    3. Toru Kitagawa & Aleksey Tetenov, 2018. "Who Should Be Treated? Empirical Welfare Maximization Methods for Treatment Choice," Econometrica, Econometric Society, vol. 86(2), pages 591-616, March.
    4. Tetenov, Aleksey, 2012. "Statistical treatment choice based on asymmetric minimax regret criteria," Journal of Econometrics, Elsevier, vol. 166(1), pages 157-165.
    5. Kasy, Maximilian, 2016. "Why Experimenters Might Not Always Want to Randomize, and What They Could Do Instead," Political Analysis, Cambridge University Press, vol. 24(3), pages 324-338, July.
    6. Miriam Bruhn & David McKenzie, 2009. "In Pursuit of Balance: Randomization in Practice in Development Field Experiments," American Economic Journal: Applied Economics, American Economic Association, vol. 1(4), pages 200-232, October.
    7. Edward Miguel & Michael Kremer, 2004. "Worms: Identifying Impacts on Education and Health in the Presence of Treatment Externalities," Econometrica, Econometric Society, vol. 72(1), pages 159-217, January.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Davide Viviano, 2020. "Experimental Design under Network Interference," Papers 2003.08421, arXiv.org, revised Jun 2020.
    2. Pedro Carneiro & Sokbae Lee & Daniel Wilhelm, 2020. "Optimal data collection for randomized control trials [Microcredit impacts: Evidence from a randomized microcredit program placement experiment by Compartamos Banco]," Econometrics Journal, Royal Economic Society, vol. 23(1), pages 1-31.
    3. Luc Behaghel & Karen Macours & Julie Subervie, 2018. "Can RCTs help improve the design of CAP?," Working Papers hal-01974425, HAL.
    4. Luc Behaghel & Karen Macours & Julie Subervie, 2019. "How can randomised controlled trials help improve the design of the common agricultural policy?," European Review of Agricultural Economics, Foundation for the European Review of Agricultural Economics, vol. 46(3), pages 473-493.
    5. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Pedro Carneiro & Sokbae Lee & Daniel Wilhelm, 2020. "Optimal data collection for randomized control trials [Microcredit impacts: Evidence from a randomized microcredit program placement experiment by Compartamos Banco]," Econometrics Journal, Royal Economic Society, vol. 23(1), pages 1-31.
    2. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    3. Abhijit Banerjee & Sylvain Chassang & Erik Snowberg, 2016. "Decision Theoretic Approaches to Experiment Design and External Validity," NBER Working Papers 22167, National Bureau of Economic Research, Inc.
    4. Aufenanger, Tobias, 2017. "Machine learning to improve experimental design," FAU Discussion Papers in Economics 16/2017, Friedrich-Alexander University Erlangen-Nuremberg, Institute for Economics.
    5. Annette N. Brown & Drew B. Cameron & Benjamin D. K. Wood, 2014. "Quality evidence for policymaking: I'll believe it when I see the replication," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 6(3), pages 215-235, September.
    6. Anders Bredahl Kock & Martin Thyrsgaard, 2017. "Optimal sequential treatment allocation," Papers 1705.09952, arXiv.org, revised Aug 2018.
    7. Marshall Burke & Lauren Falcao Bergquist & Edward Miguel, 2018. "Sell Low and Buy High: Arbitrage and Local Price Effects in Kenyan Markets," NBER Working Papers 24476, National Bureau of Economic Research, Inc.
    8. Timothy B. Armstrong & Shu Shen, 2013. "Inference on Optimal Treatment Assignments," Cowles Foundation Discussion Papers 1927, Cowles Foundation for Research in Economics, Yale University.
    9. Baldwin, Kate & Bhavnani, Rikhil R., 2013. "Ancillary Experiments: Opportunities and Challenges," WIDER Working Paper Series 024, World Institute for Development Economic Research (UNU-WIDER).
    10. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    11. Sven Resnjanskij & Jens Ruhose & Simon Wiederhold & Ludger Woessmann, 2021. "Can Mentoring Alleviate Family Disadvantage in Adolescence? A Field Experiment to Improve Labor-Market Prospects," CESifo Working Paper Series 8870, CESifo.
    12. Yusuke Narita, 2018. "Experiment-as-Market: Incorporating Welfare into Randomized Controlled Trials," Cowles Foundation Discussion Papers 2127r, Cowles Foundation for Research in Economics, Yale University, revised May 2019.
    13. Bold, Tessa & Kimenyi, Mwangi & Mwabu, Germano & Ng’ang’a, Alice & Sandefur, Justin, 2018. "Experimental evidence on scaling up education reforms in Kenya," Journal of Public Economics, Elsevier, vol. 168(C), pages 1-20.
    14. Isaiah Andrews & Jesse M. Shapiro, 2020. "A Model of Scientific Communication," NBER Working Papers 26824, National Bureau of Economic Research, Inc.
    15. Wang, H. & Guan, H. & Boswell, M., 2018. "Health Seeking Behavior among Rural Left-behind Children: Evidence from a Randomized Controlled Trial in China," 2018 Conference, July 28-August 2, 2018, Vancouver, British Columbia 276955, International Association of Agricultural Economists.
    16. Senne Vandevelde & Bjorn Van Campenhout & Wilberforce Walukano, 2018. "Spoiler alert! Spillovers in the context of a video intervention to maintain seed quality among Ugandan potato farmers," Working Papers of LICOS - Centre for Institutions and Economic Performance 634335, KU Leuven, Faculty of Economics and Business (FEB), LICOS - Centre for Institutions and Economic Performance.
    17. Debopam Bhattacharya & Pascaline Dupas & Shin Kanaya, 2013. "Estimating the Impact of Means-tested Subsidies under Treatment Externalities with Application to Anti-Malarial Bednets," CREATES Research Papers 2013-06, Department of Economics and Business Economics, Aarhus University.
    18. Callen, Michael & Long, James D., 2015. "Institutional corruption and election fraud: evidence from a field experiment in Afghanistan," LSE Research Online Documents on Economics 102931, London School of Economics and Political Science, LSE Library.
    19. Aufenanger, Tobias, 2018. "Treatment allocation for linear models," FAU Discussion Papers in Economics 14/2017, Friedrich-Alexander University Erlangen-Nuremberg, Institute for Economics.
    20. Susan Athey & Stefan Wager, 2017. "Policy Learning with Observational Data," Papers 1702.02896, arXiv.org, revised Sep 2020.

    More about this item

    JEL classification:

    • C90 - Mathematical and Quantitative Methods - - Design of Experiments - - - General
    • D81 - Microeconomics - - Information, Knowledge, and Uncertainty - - - Criteria for Decision-Making under Risk and Uncertainty
