Printed from https://ideas.repec.org/p/jgu/wpaper/2019.html

Addressing Validity and Generalizability Concerns in Field Experiments

Authors

  • Gerhard Riener

    (University of Düsseldorf)

  • Sebastian O. Schneider

    (Max Planck Institute)

  • Valentin Wagner

    (Johannes Gutenberg University Mainz)

Abstract

In this paper, we systematically analyze the empirical importance of standard conditions for the validity and generalizability of field experiments: the internal and external overlap and unconfoundedness conditions. We experimentally varied the degree of overlap in disjoint sub-samples from a recruitment experiment with more than 3,000 public schools, mimicking small-scale field experiments. We did so by using different techniques for treatment assignment: standard methods such as pure randomization, and the novel minMSE treatment assignment method, which is designed to improve overlap by balancing covariate dependencies and variances rather than individual mean values alone. We assess the relevance of the overlap condition by linking the estimation precision in the disjoint sub-samples to measures of overlap and balance in general. Unconfoundedness is addressed with a rich set of administrative data on institution and municipality characteristics, which we use to study potential self-selection. We find no evidence of a violation of unconfoundedness, and we establish that the improved overlap and balance achieved by the minMSE method reduce the bias of the treatment effect estimate by more than 35% compared to pure randomization. This illustrates the importance of addressing overlap in (field) experiments and suggests a practical way to do so.
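The abstract describes treatment assignment that balances covariate variances and dependencies in addition to means. The sketch below is only a conceptual illustration of that idea, in the spirit of rerandomization against a balance criterion; it is not the authors' minMSE algorithm, and the criterion and function names are illustrative assumptions:

```python
import numpy as np

def balance_criterion(X, assign):
    """Discrepancy in group means, variances, and covariance matrices.

    A simplified stand-in for a criterion that, like minMSE, penalizes
    imbalance beyond first moments. X: (n, k) covariate matrix,
    assign: length-n array of 0/1 group labels.
    """
    g0, g1 = X[assign == 0], X[assign == 1]
    mean_gap = np.sum((g0.mean(axis=0) - g1.mean(axis=0)) ** 2)
    var_gap = np.sum((g0.var(axis=0) - g1.var(axis=0)) ** 2)
    cov_gap = np.sum((np.cov(g0.T) - np.cov(g1.T)) ** 2)
    return mean_gap + var_gap + cov_gap

def assign_min_criterion(X, n_draws=500, seed=0):
    """Draw random half-splits and keep the most balanced one.

    Assumes an even sample size. Returns the best assignment found
    and its criterion value.
    """
    rng = np.random.default_rng(seed)
    n = len(X)
    best_assign, best_val = None, np.inf
    for _ in range(n_draws):
        assign = rng.permutation(np.repeat([0, 1], n // 2))
        val = balance_criterion(X, assign)
        if val < best_val:
            best_val, best_assign = val, assign
    return best_assign, best_val
```

Compared with pure randomization (a single random draw), searching over many draws for a low criterion value mimics how balance-oriented assignment can tighten overlap between treatment groups in small samples.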

Suggested Citation

  • Gerhard Riener & Sebastian O. Schneider & Valentin Wagner, 2020. "Addressing Validity and Generalizability Concerns in Field Experiments," Working Papers 2019, Gutenberg School of Management and Economics, Johannes Gutenberg-Universität Mainz.
  • Handle: RePEc:jgu:wpaper:2019

    Download full text from publisher

    File URL: https://download.uni-mainz.de/RePEc/pdf/Discussion_Paper_2019.pdf
    File Function: First version, 2020
    Download Restriction: no


    References listed on IDEAS

    1. Joseph G. Altonji & Todd E. Elder & Christopher R. Taber, 2005. "Selection on Observed and Unobserved Variables: Assessing the Effectiveness of Catholic Schools," Journal of Political Economy, University of Chicago Press, vol. 113(1), pages 151-184, February.
    2. Asako Ohinata & Jan C. van Ours, 2013. "How Immigrant Children Affect the Academic Achievement of Native Dutch Children," Economic Journal, Royal Economic Society, vol. 0, pages 308-331, August.
    3. Francesco Avvisati & Marc Gurgand & Nina Guyon & Eric Maurin, 2014. "Getting Parents Involved: A Field Experiment in Deprived Schools," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 81(1), pages 57-83.
    4. Bruno Crépon & Esther Duflo & Marc Gurgand & Roland Rathelot & Philippe Zamora, 2013. "Do Labor Market Policies have Displacement Effects? Evidence from a Clustered Randomized Experiment," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 128(2), pages 531-580.
    5. Johannes Abeler & Daniele Nosenzo, 2015. "Self-selection into laboratory experiments: pro-social motives versus monetary incentives," Experimental Economics, Springer;Economic Science Association, vol. 18(2), pages 195-214, June.
    6. Edward P. Lazear & Ulrike Malmendier & Roberto A. Weber, 2012. "Sorting in Experiments with Application to Social Preferences," American Economic Journal: Applied Economics, American Economic Association, vol. 4(1), pages 136-163, January.
    7. Omar Al-Ubaydli & John List & Dana Suskind, 2019. "The science of using science: Towards an understanding of the threats to scaling experiments," Artefactual Field Experiments 00670, The Field Experiments Website.
    8. Kraft, Matthew A. & Rogers, Todd, 2015. "The underutilized potential of teacher-to-parent communication: Evidence from a field experiment," Economics of Education Review, Elsevier, vol. 47(C), pages 49-63.
    9. Jonathan Schulz & Uwe Sunde & Petra Thiemann & Christian Thoeni, 2019. "Selection into Experiments: Evidence from a Population of Students," Discussion Papers 2019-09, The Centre for Decision Research and Experimental Economics, School of Economics, University of Nottingham.
    10. Miriam Bruhn & David McKenzie, 2009. "In Pursuit of Balance: Randomization in Practice in Development Field Experiments," American Economic Journal: Applied Economics, American Economic Association, vol. 1(4), pages 200-232, October.
    11. Christoph Rothe, 2017. "Robust Confidence Intervals for Average Treatment Effects Under Limited Overlap," Econometrica, Econometric Society, vol. 85, pages 645-660, March.
    12. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    13. Ohinata, Asako & van Ours, Jan C., 2011. "How Immigrant Children Affect the Academic Achievement of Native Dutch Children," IZA Discussion Papers 6212, Institute of Labor Economics (IZA).
    14. Fischer, Mira & Wagner, Valentin, 2019. "Effects of Timing and Reference Frame of Feedback," Rationality and Competition Discussion Paper Series 150, CRC TRR 190 Rationality and Competition.
    15. Kasy, Maximilian, 2016. "Why Experimenters Might Not Always Want to Randomize, and What They Could Do Instead," Political Analysis, Cambridge University Press, vol. 24(3), pages 324-338, July.
    16. Gerhard Riener & Valentin Wagner, 2019. "On the design of non-monetary incentives in schools," Education Economics, Taylor & Francis Journals, vol. 27(3), pages 223-240, May.
    17. Fischer, Mira & Wagner, Valentin, 2018. "Effects of timing and reference frame of feedback: Evidence from a field experiment," Discussion Papers, Research Unit: Market Behavior SP II 2018-206, WZB Berlin Social Science Center.
    18. Joseph Hotz, V. & Imbens, Guido W. & Mortimer, Julie H., 2005. "Predicting the efficacy of future training programs using past experiences at other locations," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 241-270.
    19. Wagner, Valentin & Riener, Gerhard, 2015. "Peers or parents? On non-monetary incentives in schools," DICE Discussion Papers 203, Heinrich Heine University Düsseldorf, Düsseldorf Institute for Competition Economics (DICE).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Damgaard, Mette Trier & Nielsen, Helena Skyt, 2018. "Nudging in education," Economics of Education Review, Elsevier, vol. 64(C), pages 313-342.
    2. Jeffrey Smith & Arthur Sweetman, 2016. "Viewpoint: Estimating the causal effects of policies and programs," Canadian Journal of Economics, Canadian Economics Association, vol. 49(3), pages 871-905, August.
    3. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    4. Thiemann, Petra & Schulz, Jonathan & Sunde, Uwe & Thöni, Christian, 2022. "Selection into experiments: New evidence on the role of preferences, cognition, and recruitment protocols," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 98(C).
    5. Hunt Allcott, 2012. "Site Selection Bias in Program Evaluation," NBER Working Papers 18373, National Bureau of Economic Research, Inc.
    6. Abhijit Banerjee & Sylvain Chassang & Erik Snowberg, 2016. "Decision Theoretic Approaches to Experiment Design and External Validity," NBER Working Papers 22167, National Bureau of Economic Research, Inc.
    7. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    8. Jason Fletcher & Jinho Kim & Jenna Nobles & Stephen Ross & Irina Shaorshadze, 2021. "The Effects of Foreign-Born Peers in US High Schools and Middle Schools," Journal of Human Capital, University of Chicago Press, vol. 15(3), pages 432-468.
    9. John List, 2020. "Non est Disputandum de Generalizability? A Glimpse into The External Validity Trial," Artefactual Field Experiments 00711, The Field Experiments Website.
    10. Cattaneo, Maria Alejandra & Wolter, Stefan C., 2012. "Migration Policy Can Boost PISA Results: Findings from a Natural Experiment," IZA Discussion Papers 6300, Institute of Labor Economics (IZA).
    11. Girum Abebe & A Stefano Caria & Marcel Fafchamps & Paolo Falco & Simon Franklin & Simon Quinn, 2021. "Anonymity or Distance? Job Search and Labour Market Exclusion in a Growing African City [Endogenous Stratification in Randomized Experiments]," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 88(3), pages 1279-1310.
    12. Escarce, José J. & Rocco, Lorenzo, 2018. "Immigration and the Health of Older Natives in Western Europe," GLO Discussion Paper Series 228, Global Labor Organization (GLO).
    13. Ohinata, A. & van Ours, J.C., 2013. "Spillover Effects of Studying with Immigrant Students : A Quantile Regression Approach," Discussion Paper 2013-058, Tilburg University, Center for Economic Research.
    14. Björn NILSSON, 2019. "Education and migration: insights for policymakers," Working Paper 23ca9c54-061a-4d60-967c-f, Agence française de développement.
    15. David Figlio & Umut Özek, 2019. "Unwelcome Guests? The Effects of Refugees on the Educational Outcomes of Incumbent Students," Journal of Labor Economics, University of Chicago Press, vol. 37(4), pages 1061-1096.
    16. Agostinelli, Francesco & Avitabile, Ciro & Bobba, Matteo, 2021. "Enhancing Human Capital in Children: A Case Study on Scaling," TSE Working Papers 21-1196, Toulouse School of Economics (TSE), revised Oct 2023.
    17. Escarce, José J. & Rocco, Lorenzo, 2021. "Effect of immigration on depression among older natives in Western Europe," The Journal of the Economics of Ageing, Elsevier, vol. 20(C).
    18. Joana Elisa Maldonado & Kristof De Witte & Koen Declercq, 2022. "The effects of parental involvement in homework: two randomised controlled trials in financial education," Empirical Economics, Springer, vol. 62(3), pages 1439-1464, March.
    19. Pedro Carneiro & Sokbae Lee & Daniel Wilhelm, 2020. "Optimal data collection for randomized control trials [Microcredit impacts: Evidence from a randomized microcredit program placement experiment by Compartamos Banco]," The Econometrics Journal, Royal Economic Society, vol. 23(1), pages 1-31.
    20. Lars Behlen & Oliver Himmler & Robert Jäckle, 2023. "Defaults and effortful tasks," Experimental Economics, Springer;Economic Science Association, vol. 26(5), pages 1022-1059, November.

    More about this item

    Keywords

External validity; field experiments; generalizability; treatment effect; overlap; balance; precision; treatment assignment; unconfoundedness; self-selection bias; site-selection bias

    JEL classification:

    • C9 - Mathematical and Quantitative Methods - - Design of Experiments
    • C90 - Mathematical and Quantitative Methods - - Design of Experiments - - - General
    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments
    • D04 - Microeconomics - - General - - - Microeconomic Policy: Formulation; Implementation; Evaluation



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.