Printed from https://ideas.repec.org/p/nbr/nberwo/24834.html

How to Examine External Validity Within an Experiment

Author

Listed:
  • Amanda E. Kowalski

Abstract

A fundamental concern for researchers who analyze and design experiments is that the estimate obtained from the experiment might not be externally valid for other policies of interest. Researchers often attempt to assess external validity by comparing data from an experiment to external data. In this paper, I discuss approaches from the treatment effects literature that researchers can use to begin the examination of external validity internally, within the data from a single experiment. I focus on presenting the approaches simply using stylized examples.
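One family of within-experiment comparisons the abstract alludes to contrasts groups defined by treatment take-up. The sketch below is purely illustrative and not the paper's own code: it assumes a hypothetical randomized trial with one-sided non-compliance and synthetic data, and checks whether untreated never-takers and untreated compliers have the same mean outcome. All variable names and the data-generating process are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Randomized offer (instrument) with one-sided non-compliance:
# only those offered can take treatment; compliers take it, never-takers do not.
z = rng.integers(0, 2, n)                # random offer
complier = rng.random(n) < 0.6           # latent type (unobserved in practice)
d = z * complier                         # treatment take-up

# Untreated outcomes differ by type, i.e. selection into treatment on levels.
y0 = np.where(complier, 1.0, 2.0) + rng.normal(0, 1, n)
y = y0 + 0.5 * d                         # constant treatment effect of 0.5

# Observable quantities only from here on.
p_nt = 1 - d[z == 1].mean()              # never-taker share, from the offered arm
mean_nt = y[(z == 1) & (d == 0)].mean()  # never-takers' untreated mean outcome
mean_ctrl = y[z == 0].mean()             # control arm: mix of both types, untreated

# Back out the untreated compliers' mean outcome from the control-arm mixture.
mean_c_untreated = (mean_ctrl - p_nt * mean_nt) / (1 - p_nt)

print(f"never-takers: {mean_nt:.2f}, untreated compliers: {mean_c_untreated:.2f}")
```

In this synthetic example a gap between the two untreated means flags selection into treatment: the experimental estimate identified for compliers need not extrapolate to never-takers, which is the kind of internal evidence on external validity the abstract describes.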

Suggested Citation

  • Amanda E. Kowalski, 2018. "How to Examine External Validity Within an Experiment," NBER Working Papers 24834, National Bureau of Economic Research, Inc.
  • Handle: RePEc:nbr:nberwo:24834
    Note: DEV EH LS PE TWP

    Download full text from publisher

    File URL: http://www.nber.org/papers/w24834.pdf
    Download Restriction: no

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Tarek Azzam & Michael Bates & David Fairris, 2019. "Do Learning Communities Increase First Year College Retention? Testing Sample Selection and External Validity of Randomized Control Trials," Working Papers 202002, University of California at Riverside, Department of Economics.
    2. Andrew Dustan & Juan Manuel Hernandez-Agramonte & Stanislao Maldonado, 2018. "Motivating bureaucrats with non-monetary incentives when state capacity is weak: Evidence from large-scale field experiments in Peru," Natural Field Experiments 00664, The Field Experiments Website.
    3. Black, Dan A. & Joo, Joonhwi & LaLonde, Robert & Smith, Jeffrey A. & Taylor, Evan J., 2022. "Simple Tests for Selection: Learning More from Instrumental Variables," Labour Economics, Elsevier, vol. 79(C).
    4. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    5. Azzam, Tarek & Bates, Michael D. & Fairris, David, 2022. "Do learning communities increase first year college retention? Evidence from a randomized control trial," Economics of Education Review, Elsevier, vol. 89(C).
    6. Dustan, Andrew & Hernandez-Agramonte, Juan Manuel & Maldonado, Stanislao, 2023. "Motivating bureaucrats with behavioral insights when state capacity is weak: Evidence from large-scale field experiments in Peru," Journal of Development Economics, Elsevier, vol. 160(C).
    7. Amanda E. Kowalski, 2023. "Behaviour within a Clinical Trial and Implications for Mammography Guidelines," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 90(1), pages 432-462.
    8. Alexander Ruder, 2019. "What Works at Scale? A Framework to Scale Up Workforce Development Programs," FRB Atlanta Community and Economic Development Discussion Paper 2019-1, Federal Reserve Bank of Atlanta.
    9. Paul Hunermund & Elias Bareinboim, 2019. "Causal Inference and Data Fusion in Econometrics," Papers 1912.09104, arXiv.org, revised Mar 2023.
    10. Dustan, Andrew & Maldonado, Stanislao & Hernandez-Agramonte, Juan Manuel, 2018. "Motivating bureaucrats with non-monetary incentives when state capacity is weak: Evidence from large-scale field experiments in Peru," MPRA Paper 90952, University Library of Munich, Germany.
    11. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," The World Bank Research Observer, World Bank, vol. 33(1), pages 34-64.
    12. Williams, Martin J., 2020. "Beyond ‘context matters’: Context and external validity in impact evaluation," World Development, Elsevier, vol. 127(C).
    13. Stephen Coussens & Jann Spiess, 2021. "Improving Inference from Simple Instruments through Compliance Estimation," Papers 2108.03726, arXiv.org.
    14. Yu-Chang Chen & Haitian Xie, 2022. "Personalized Subsidy Rules," Papers 2202.13545, arXiv.org, revised Mar 2022.

    More about this item

    JEL classification:

    • C9 - Mathematical and Quantitative Methods - - Design of Experiments
    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments
    • H0 - Public Economics - - General


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.