IDEAS home Printed from https://ideas.repec.org/p/osf/osfxxx/2s8w5.html

The Necessity of Construct and External Validity for Generalized Causal Claims

Author

Listed:
  • Esterling, Kevin

    (UC Riverside)

  • Brady, David
  • Schwitzgebel, Eric

Abstract

The credibility revolution has facilitated tremendous progress in the social sciences by advancing design-based strategies that rely on internal validity to deductively identify causal effects. We demonstrate that prioritizing internal validity while neglecting construct and external validity prevents causal generalization and misleadingly converts a deductive claim of causality into a claim based on speculation and exploration, undermining the very goals of the credibility revolution. We develop a formal framework of causal specification to demonstrate that internal, external, and construct validity are jointly necessary for generalized claims regarding a causal effect. If one lacks construct validity, one cannot assign meaningful labels to the cause or to the outcome. If one lacks external validity, one cannot make statements about the conditions required for the cause to occur. Re-balancing considerations of internal, construct, and external validity via causal specification preserves and advances the intent of the credibility revolution to understand causal effects.

Suggested Citation

  • Esterling, Kevin & Brady, David & Schwitzgebel, Eric, 2021. "The Necessity of Construct and External Validity for Generalized Causal Claims," OSF Preprints 2s8w5, Center for Open Science.
  • Handle: RePEc:osf:osfxxx:2s8w5
    DOI: 10.31219/osf.io/2s8w5

    Download full text from publisher

    File URL: https://osf.io/download/6011b774dd222501f35923a6/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/2s8w5?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
    ---><---

    References listed on IDEAS

    1. Keele, Luke & Minozzi, William, 2013. "How Much Is Minnesota Like Wisconsin? Assumptions and Counterfactuals in Causal Inference with Observational Data," Political Analysis, Cambridge University Press, vol. 21(2), pages 193-216, April.
    2. Guido W. Imbens, 2020. "Potential Outcome and Directed Acyclic Graph Approaches to Causality: Relevance for Empirical Practice in Economics," Journal of Economic Literature, American Economic Association, vol. 58(4), pages 1129-1179, December.
    3. Joshua D. Angrist & Jörn-Steffen Pischke, 2010. "The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics," Journal of Economic Perspectives, American Economic Association, vol. 24(2), pages 3-30, Spring.
    4. Angus Deaton, 2010. "Instruments, Randomization, and Learning about Development," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 424-455, June.
    5. Robert B. Olsen & Larry L. Orr & Stephen H. Bell & Elizabeth A. Stuart, 2013. "External Validity in Policy Evaluations That Choose Sites Purposively," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 32(1), pages 107-121, January.
    6. Eva Vivalt, 2020. "How Much Can We Generalize From Impact Evaluations?," Journal of the European Economic Association, European Economic Association, vol. 18(6), pages 3045-3089.
    7. Andrew Gelman & Guido Imbens, 2013. "Why ask Why? Forward Causal Inference and Reverse Causal Questions," NBER Working Papers 19614, National Bureau of Economic Research, Inc.
    8. Dafoe, Allan & Zhang, Baobao & Caughey, Devin, 2018. "Information Equivalence in Survey Experiments," Political Analysis, Cambridge University Press, vol. 26(4), pages 399-416, October.
    9. Thomas D. Cook, 2014. "Generalizing Causal Knowledge In The Policy Sciences: External Validity As A Task Of Both Multiattribute Representation And Multiattribute Extrapolation," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 33(2), pages 527-536, March.
    10. Joseph Henrich & Steve J. Heine & Ara Norenzayan, 2010. "The Weirdest People in the World?," RatSWD Working Papers 139, German Data Forum (RatSWD).
    11. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," The World Bank Research Observer, World Bank, vol. 33(1), pages 34-64.
    12. Hunt Allcott, 2015. "Site Selection Bias in Program Evaluation," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 130(3), pages 1117-1165.
    13. Gerber, Alan S. & Green, Donald P. & Larimer, Christopher W., 2008. "Social Pressure and Voter Turnout: Evidence from a Large-Scale Field Experiment," American Political Science Review, Cambridge University Press, vol. 102(1), pages 33-48, February.
    14. Abhijit V. Banerjee & Esther Duflo, 2009. "The Experimental Approach to Development Economics," Annual Review of Economics, Annual Reviews, vol. 1(1), pages 151-178, May.
    16. Seán M. Muller, 2015. "Causal Interaction and External Validity: Obstacles to the Policy Relevance of Randomized Evaluations," The World Bank Economic Review, World Bank, vol. 29(suppl_1), pages 217-225.
    17. Maureen A. Pirog, 2014. "Internal Versus External Validity: Where Are Policy Analysts Going?," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 33(2), pages 548-550, March.
    18. Humphreys, Macartan & Scacco, Alexandra, 2020. "The aggregation challenge," World Development, Elsevier, vol. 127(C).
    19. Adcock, Robert & Collier, David, 2001. "Measurement Validity: A Shared Standard for Qualitative and Quantitative Research," American Political Science Review, Cambridge University Press, vol. 95(3), pages 529-546, September.
    20. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    22. Gerber, Alan S. & Green, Donald P., 2000. "The Effects of Canvassing, Telephone Calls, and Direct Mail on Voter Turnout: A Field Experiment," American Political Science Review, Cambridge University Press, vol. 94(3), pages 653-663, September.
    23. Guala,Francesco, 2005. "The Methodology of Experimental Economics," Cambridge Books, Cambridge University Press, number 9780521618618.
    24. Lant Pritchett & Justin Sandefur, 2013. "Context Matters for Size: Why External Validity Claims and Development Practice Don't Mix," Working Papers 336, Center for Global Development.
    25. Lant Pritchett & Justin Sandefur, 2015. "Learning from Experiments When Context Matters," American Economic Review, American Economic Association, vol. 105(5), pages 471-475, May.
    26. Michael J. Weiss & Howard S. Bloom & Thomas Brock, 2014. "A Conceptual Framework For Studying The Sources Of Variation In Program Effects," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 33(3), pages 778-808, June.
    28. Imbens,Guido W. & Rubin,Donald B., 2015. "Causal Inference for Statistics, Social, and Biomedical Sciences," Cambridge Books, Cambridge University Press, number 9780521885881.
    29. Elizabeth A. Stuart & Stephen R. Cole & Catherine P. Bradshaw & Philip J. Leaf, 2011. "The use of propensity scores to assess the generalizability of results from randomized trials," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 174(2), pages 369-386, April.
    30. Joshua D. Angrist & Jörn-Steffen Pischke, 2015. "The path from cause to effect: mastering 'metrics," CentrePiece - The magazine for economic performance 442, Centre for Economic Performance, LSE.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Esterling, Kevin M. & Brady, David & Schwitzgebel, Eric, 2023. "The Necessity of Construct and External Validity for Generalized Causal Claims," I4R Discussion Paper Series 18, The Institute for Replication (I4R).
    2. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," The World Bank Research Observer, World Bank, vol. 33(1), pages 34-64.
    3. Denis Fougère & Nicolas Jacquemet, 2020. "Policy Evaluation Using Causal Inference Methods," SciencePo Working papers Main hal-03455978, HAL.
    4. Florent Bédécarrats & Isabelle Guérin & François Roubaud, 2019. "All that Glitters is not Gold. The Political Economy of Randomized Evaluations in Development," Development and Change, International Institute of Social Studies, vol. 50(3), pages 735-762, May.
    5. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    6. Ashis Das & Jed Friedman & Eeshani Kandpal, 2018. "Does involvement of local NGOs enhance public service delivery? Cautionary evidence from a malaria‐prevention program in India," Health Economics, John Wiley & Sons, Ltd., vol. 27(1), pages 172-188, January.
    7. Abhijit Banerjee & Rukmini Banerji & James Berry & Esther Duflo & Harini Kannan & Shobhini Mukerji & Marc Shotland & Michael Walton, 2017. "From Proof of Concept to Scalable Policies: Challenges and Solutions, with an Application," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 73-102, Fall.
    8. Cristina Corduneanu-Huci & Michael T. Dorsch & Paul Maarek, 2017. "Learning to constrain: Political competition and randomized controlled trials in development," THEMA Working Papers 2017-24, THEMA (THéorie Economique, Modélisation et Applications), Université de Cergy-Pontoise.
    9. Guido W. Imbens, 2020. "Potential Outcome and Directed Acyclic Graph Approaches to Causality: Relevance for Empirical Practice in Economics," Journal of Economic Literature, American Economic Association, vol. 58(4), pages 1129-1179, December.
    10. Faraz Usmani & Marc Jeuland & Subhrendu K. Pattanayak, 2018. "NGOs and the effectiveness of interventions," WIDER Working Paper Series wp-2018-59, World Institute for Development Economic Research (UNU-WIDER).
    11. Corduneanu-Huci, Cristina & Dorsch, Michael T. & Maarek, Paul, 2021. "The politics of experimentation: Political competition and randomized controlled trials," Journal of Comparative Economics, Elsevier, vol. 49(1), pages 1-21.
    12. Andor, Mark A. & Gerster, Andreas & Peters, Jörg & Schmidt, Christoph M., 2020. "Social Norms and Energy Conservation Beyond the US," Journal of Environmental Economics and Management, Elsevier, vol. 103(C).
    13. Andrews, Isaiah & Oster, Emily, 2019. "A simple approximation for evaluating external validity bias," Economics Letters, Elsevier, vol. 178(C), pages 58-62.
    14. Alex Eble & Peter Boone & Diana Elbourne, 2017. "On Minimizing the Risk of Bias in Randomized Controlled Trials in Economics," The World Bank Economic Review, World Bank, vol. 31(3), pages 687-707.
    15. Naoki Egami & Erin Hartman, 2021. "Covariate selection for generalizing experimental results: Application to a large‐scale development program in Uganda," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(4), pages 1524-1548, October.
    16. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    17. Stefano DellaVigna & Elizabeth Linos, 2022. "RCTs to Scale: Comprehensive Evidence From Two Nudge Units," Econometrica, Econometric Society, vol. 90(1), pages 81-116, January.
    18. Bayer, Patrick & Kennedy, Ryan & Yang, Joonseok & Urpelainen, Johannes, 2020. "The need for impact evaluation in electricity access research," Energy Policy, Elsevier, vol. 137(C).
    19. Jeffrey Smith & Arthur Sweetman, 2016. "Viewpoint: Estimating the causal effects of policies and programs," Canadian Journal of Economics, Canadian Economics Association, vol. 49(3), pages 871-905, August.
    20. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.