
Mechanism Experiments and Policy Evaluations

Authors
  • Jens Ludwig
  • Jeffrey R. Kling
  • Sendhil Mullainathan

Abstract

Randomized controlled trials are increasingly used to evaluate policies. How can we make these experiments as useful as possible for policy purposes? We argue that greater use should be made of experiments that identify behavioral mechanisms central to clearly specified policy questions, which we call "mechanism experiments." These experiments can be of great policy value even if the intervention that is tested (or its setting) does not correspond exactly to any realistic policy option.

Suggested Citation

  • Jens Ludwig & Jeffrey R. Kling & Sendhil Mullainathan, 2011. "Mechanism Experiments and Policy Evaluations," NBER Working Papers 17062, National Bureau of Economic Research, Inc.
  • Handle: RePEc:nbr:nberwo:17062
    Note: POL

    Download full text from publisher

    File URL: http://www.nber.org/papers/w17062.pdf
    Download Restriction: no


    References listed on IDEAS

    1. Joshua D. Angrist & Jörn-Steffen Pischke, 2010. "The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics," Journal of Economic Perspectives, American Economic Association, vol. 24(2), pages 3-30, Spring.
    2. Angus Deaton, 2010. "Instruments, Randomization, and Learning about Development," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 424-455, June.
    3. Bisakha Sen & Stephen Mennemeyer & Lisa C. Gary, 2009. "The Relationship Between Neighborhood Quality and Obesity Among Children," NBER Working Papers 14985, National Bureau of Economic Research, Inc.
    4. Alan B. Krueger, 1999. "Experimental Estimates of Education Production Functions," The Quarterly Journal of Economics, Oxford University Press, vol. 114(2), pages 497-532.
    5. James J. Heckman, 2010. "Building Bridges between Structural and Program Evaluation Approaches to Evaluating Policy," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 356-398, June.
    6. Glenn W. Harrison & John A. List, 2004. "Field Experiments," Journal of Economic Literature, American Economic Association, vol. 42(4), pages 1009-1055, December.
    7. Lisa Sanbonmatsu & Jeffrey R. Kling & Greg J. Duncan & Jeanne Brooks-Gunn, 2006. "Neighborhoods and Academic Achievement: Results from the Moving to Opportunity Experiment," Journal of Human Resources, University of Wisconsin Press, vol. 41(4).
    8. Angrist, Joshua D, 1990. "Lifetime Earnings and the Vietnam Era Draft Lottery: Evidence from Social Security Administrative Records," American Economic Review, American Economic Association, vol. 80(3), pages 313-336, June.
    9. Jane G. Fortson & Lisa Sanbonmatsu, 2010. "Child Health and Neighborhood Conditions: Results from a Randomized Housing Voucher Experiment," Journal of Human Resources, University of Wisconsin Press, vol. 45(4), pages 840-864.
    10. DiNardo, John & Lee, David S., 2011. "Program Evaluation and Research Designs," Handbook of Labor Economics, in: O. Ashenfelter & D. Card (ed.), Handbook of Labor Economics, edition 1, volume 4, chapter 5, pages 463-536, Elsevier.
    11. Jeffrey R. Kling & Jens Ludwig & Lawrence F. Katz, 2005. "Neighborhood Effects on Crime for Female and Male Youth: Evidence from a Randomized Housing Voucher Experiment," The Quarterly Journal of Economics, Oxford University Press, vol. 120(1), pages 87-130.
    12. David M. Cutler & Edward L. Glaeser & Jesse M. Shapiro, 2003. "Why Have Americans Become More Obese?," Journal of Economic Perspectives, American Economic Association, vol. 17(3), pages 93-118, Summer.
    13. Esther Duflo & Emmanuel Saez, 2003. "The Role of Information and Social Interactions in Retirement Plan Decisions: Evidence from a Randomized Experiment," The Quarterly Journal of Economics, Oxford University Press, vol. 118(3), pages 815-842.
    14. Katz, Lawrence & Duncan, Greg J. & Kling, Jeffrey R. & Kessler, Ronald C. & Ludwig, Jens & Sanbonmatsu, Lisa & Liebman, Jeffrey B., 2008. "What Can We Learn about Neighborhood Effects from the Moving to Opportunity Experiment?," Scholarly Articles 2766959, Harvard University Department of Economics.
    15. Justine S. Hastings & Jeffrey M. Weinstein, 2008. "Information, School Choice, and Academic Achievement: Evidence from Two Experiments," The Quarterly Journal of Economics, Oxford University Press, vol. 123(4), pages 1373-1414.
    16. Jeffrey R Kling & Jeffrey B Liebman & Lawrence F Katz, 2007. "Experimental Analysis of Neighborhood Effects," Econometrica, Econometric Society, vol. 75(1), pages 83-119, January.
    17. Edward P. Lazear, 2001. "Educational Production," The Quarterly Journal of Economics, Oxford University Press, vol. 116(3), pages 777-803.
    18. Christopher Jepsen & Steven Rivkin, 2009. "Class Size Reduction and Student Achievement: The Potential Tradeoff between Teacher Quality and Class Size," Journal of Human Resources, University of Wisconsin Press, vol. 44(1).
    19. Abhijit V. Banerjee & Esther Duflo, 2009. "The Experimental Approach to Development Economics," Annual Review of Economics, Annual Reviews, vol. 1(1), pages 151-178, May.
    20. Meyer, Bruce D, 1995. "Natural and Quasi-experiments in Economics," Journal of Business & Economic Statistics, American Statistical Association, vol. 13(2), pages 151-161, April.
    21. Kling, Jeffrey R., 2007. "Methodological Frontiers of Public Finance Field Experiments," National Tax Journal, National Tax Association;National Tax Journal, vol. 60(1), pages 109-127, March.
    22. Guido W. Imbens, 2010. "Better LATE Than Nothing: Some Comments on Deaton (2009) and Heckman and Urzua (2009)," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 399-423, June.
    23. Justine S. Hastings & Lydia Tejeda-Ashton, 2008. "Financial Literacy, Information, and Demand Elasticity: Survey and Experimental Evidence from Mexico," NBER Working Papers 14538, National Bureau of Economic Research, Inc.
    24. Elizabeth A. Stuart & Stephen R. Cole & Catherine P. Bradshaw & Philip J. Leaf, 2011. "The use of propensity scores to assess the generalizability of results from randomized trials," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 174(2), pages 369-386, April.
    25. Joseph Hotz, V. & Imbens, Guido W. & Mortimer, Julie H., 2005. "Predicting the efficacy of future training programs using past experiences at other locations," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 241-270.
    26. Petra E. Todd & Kenneth I. Wolpin, 2008. "Ex Ante Evaluation of Social Programs," Annals of Economics and Statistics, GENES, issue 91-92, pages 263-291.
    27. Jeffrey R. Kling, 2006. "Incarceration Length, Employment, and Earnings," American Economic Review, American Economic Association, vol. 96(3), pages 863-876, June.
    28. Kenneth I. Wolpin, 2007. "Ex Ante Policy Evaluation, Structural Estimation and Model Selection," American Economic Review, American Economic Association, vol. 97(2), pages 48-52, May.
    29. Angrist, Joshua D, 1990. "Lifetime Earnings and the Vietnam Era Draft Lottery: Evidence from Social Security Administrative Records: Errata," American Economic Review, American Economic Association, vol. 80(5), pages 1284-1286, December.
    30. repec:adr:anecst:y:2008:i:91-92:p:13 is not listed on IDEAS
    31. Philip Oreopoulos & Kjell G. Salvanes, 2009. "How large are returns to schooling? Hint: Money isn't everything," NBER Working Papers 15339, National Bureau of Economic Research, Inc.
    32. repec:mpr:mprres:7081 is not listed on IDEAS
    33. John DiNardo & David S. Lee, 2010. "Program Evaluation and Research Designs," Working Papers 1228, Princeton University, Department of Economics, Industrial Relations Section.
    34. Anup Malani, 2006. "Identifying Placebo Effects with Data from Clinical Trials," Journal of Political Economy, University of Chicago Press, vol. 114(2), pages 236-256, April.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Kugler Franziska & Schwerdt Guido & Wößmann Ludger, 2014. "Ökonometrische Methoden zur Evaluierung kausaler Effekte der Wirtschaftspolitik" [Econometric Methods for Evaluating Causal Effects of Economic Policy], Perspektiven der Wirtschaftspolitik, De Gruyter, vol. 15(2), pages 105-132, June.
    2. Maibom, Jonas, 2021. "The Danish Labor Market Experiments: Methods and Findings," Nationaløkonomisk tidsskrift, Nationaløkonomisk Forening, vol. 2021(1), pages 1-21.
    3. Committee, Nobel Prize, 2021. "Answering causal questions using observational data," Nobel Prize in Economics documents 2021-2, Nobel Prize Committee.
    4. Joshua D. Angrist & Jörn-Steffen Pischke, 2010. "The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics," Journal of Economic Perspectives, American Economic Association, vol. 24(2), pages 3-30, Spring.
    5. Duo Qin & Yanqun Zhang, 2013. "A History of Polyvalent Structural Parameters: the Case of Instrument Variable Estimators," Working Papers 183, Department of Economics, SOAS, University of London, UK.
    6. Dionissi Aliprantis, 2011. "Assessing the evidence on neighborhood effects from moving to opportunity," Working Papers (Old Series) 1101, Federal Reserve Bank of Cleveland.
    7. Guido W. Imbens, 2010. "Better LATE Than Nothing: Some Comments on Deaton (2009) and Heckman and Urzua (2009)," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 399-423, June.
    8. Jennifer Gippel & Tom Smith & Yushu Zhu, 2015. "Endogeneity in Accounting and Finance Research: Natural Experiments as a State-of-the-Art Solution," Abacus, Accounting Foundation, University of Sydney, vol. 51(2), pages 143-168, June.
    9. Dionissi Aliprantis, 2013. "Covariates and causal effects: the problem of context," Working Papers (Old Series) 1310, Federal Reserve Bank of Cleveland.
    10. Baldwin, Kate & Bhavnani, Rikhil R., 2013. "Ancillary Experiments: Opportunities and Challenges," WIDER Working Paper Series 024, World Institute for Development Economic Research (UNU-WIDER).
    11. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    12. Fougère, Denis & Jacquemet, Nicolas, 2020. "Policy Evaluation Using Causal Inference Methods," IZA Discussion Papers 12922, Institute of Labor Economics (IZA).
    13. Ludwig, Jens & Duncan, Greg J. & Katz, Lawrence F. & Kessler, Ronald & Kling, Jeffrey R. & Gennetian, Lisa A. & Sanbonmatsu, Lisa, 2012. "Neighborhood Effects on the Long-Term Well-Being of Low-Income Adults," Scholarly Articles 11870359, Harvard University Department of Economics.
    14. Guido W. Imbens, 2020. "Potential Outcome and Directed Acyclic Graph Approaches to Causality: Relevance for Empirical Practice in Economics," Journal of Economic Literature, American Economic Association, vol. 58(4), pages 1129-1179, December.
    15. Jeffrey Smith & Arthur Sweetman, 2016. "Viewpoint: Estimating the causal effects of policies and programs," Canadian Journal of Economics, Canadian Economics Association, vol. 49(3), pages 871-905, August.
    16. Duflo, Esther & Glennerster, Rachel & Kremer, Michael, 2008. "Using Randomization in Development Economics Research: A Toolkit," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 61, pages 3895-3962, Elsevier.
    17. Kate Baldwin & Rikhil R. Bhavnani, 2013. "Ancillary Experiments: Opportunities and Challenges," WIDER Working Paper Series wp-2013-024, World Institute for Development Economic Research (UNU-WIDER).
    18. Angus Deaton, 2010. "Instruments, Randomization, and Learning about Development," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 424-455, June.
    19. Moshe Justman, 2016. "Economic Research and Education Policy: Project STAR and Class Size Reduction," Melbourne Institute Working Paper Series wp2016n37, Melbourne Institute of Applied Economic and Social Research, The University of Melbourne.
    20. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.

    More about this item

    JEL classification:

    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nbr:nberwo:17062. See general information about how to correct material in RePEc.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact the provider. General contact details of provider: https://edirc.repec.org/data/nberrus.html.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.


    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis. RePEc uses bibliographic data supplied by the respective publishers.