Mechanism Experiments and Policy Evaluations

Authors

  • Jens Ludwig
  • Jeffrey R. Kling
  • Sendhil Mullainathan

Abstract

Randomized controlled trials are increasingly used to evaluate policies. How can we make these experiments as useful as possible for policy purposes? We argue that greater use should be made of experiments that identify behavioral mechanisms central to clearly specified policy questions, what we call "mechanism experiments." These experiments can be of great policy value even if the intervention that is tested (or its setting) does not correspond exactly to any realistic policy option.

Suggested Citation

  • Jens Ludwig & Jeffrey R. Kling & Sendhil Mullainathan, 2011. "Mechanism Experiments and Policy Evaluations," NBER Working Papers 17062, National Bureau of Economic Research, Inc.
  • Handle: RePEc:nbr:nberwo:17062

Download full text from publisher

  File URL: http://www.nber.org/papers/w17062.pdf
  Download Restriction: no

More about this item

JEL classification:

  • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments
