Printed from https://ideas.repec.org/

Citations for "Evaluating Program Evaluations: New Evidence on Commonly Used Nonexperimental Methods"

by Friedlander, Daniel & Robins, Philip K.


  1. Wooldridge, Jeffrey M. & Imbens, Guido, 2009. "Recent Developments in the Econometrics of Program Evaluation," Scholarly Articles 3043416, Harvard University Department of Economics.
  2. Lechner, Michael & Wunsch, Conny, 2013. "Sensitivity of matching-based program evaluations to the availability of control variables," Labour Economics, Elsevier, vol. 21(C), pages 111-121.
  3. Grasdal, A., 2001. "The Performance of Sample Selection Estimators to Control for Attrition Bias," Working Paper 225, Department of Economics, University of Bergen, Norway.
  4. Sudhanshu Handa & John Maluccio, 2008. "Matching the gold standard: Comparing experimental and non-experimental evaluation techniques for a geographically targeted program," Middlebury College Working Paper Series 0813, Middlebury College, Department of Economics.
  5. World Bank, 2001. "Brazil : Eradicating Child Labor in Brazil," World Bank Other Operational Studies 15465, The World Bank.
  6. Peter R. Mueser & Kenneth Troske & Alexey Gorislavsky, 2003. "Using State Administrative Data to Measure Program Performance," Working Papers 0309, Department of Economics, University of Missouri.
  7. Carlos A. Flores & Oscar A. Mitnik, 2009. "Evaluating Nonexperimental Estimators for Multiple Treatments: Evidence from Experimental Data," Working Papers 2010-9, University of Miami, Department of Economics.
  8. Arpino, Bruno & Mealli, Fabrizia, 2008. "The specification of the propensity score in multilevel observational studies," MPRA Paper 17407, University Library of Munich, Germany.
  9. Dolton, Peter & Smith, Jeffrey A., 2011. "The Impact of the UK New Deal for Lone Parents on Benefit Receipt," IZA Discussion Papers 5491, Institute for the Study of Labor (IZA).
  10. Smith, Jeffrey A. & Todd, Petra E., 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
  11. Richard P. Nathan, 2008. "The role of random assignment in social policy research," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 27(2), pages 401-415.
  12. Elizabeth Ty Wilde & Robinson Hollister, 2007. "How close is close enough? Evaluating propensity score matching using data from a class size reduction experiment," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 26(3), pages 455-477.
  13. Milu Muyanga & John Olwande & Esther Mueni & Stella Wambugu, 2010. "Free Primary Education in Kenya: An Impact Evaluation Using Propensity Score Methods," Working Papers PMMA 2010-08, PEP-PMMA.
  14. Priya Nanda, 1999. "Women's participation in rural credit programmes in Bangladesh and their demand for formal health care: is there a positive impact?," Health Economics, John Wiley & Sons, Ltd., vol. 8(5), pages 415-428.
  15. Robert J. LaLonde, 2003. "Employment and Training Programs," NBER Chapters, in: Means-Tested Transfer Programs in the United States, pages 517-586, National Bureau of Economic Research, Inc.
  16. Carlos A. Flores & Oscar A. Mitnik, 2013. "Comparing Treatments across Labor Markets: An Assessment of Nonexperimental Multiple-Treatment Strategies," The Review of Economics and Statistics, MIT Press, vol. 95(5), pages 1691-1707, December.
  17. Fitzenberger, Bernd & Prey, Hedwig, 1996. "Training in East Germany: An evaluation of the effects on employment and wages," Discussion Papers 36, University of Konstanz, Center for International Labor Economics (CILE).
  18. Martin Valdivia, 2010. "Contracting the Road to Development: Early Impacts of a Rural Roads Program," Working Papers PMMA 2010-18, PEP-PMMA.
  19. de Crombrugghe, Denis & Espinoza, Henry & Heijke, Hans, 2010. "Job-training programmes with low completion rates: The case of Projoven-Peru," ROA Research Memorandum 004, Maastricht University, Research Centre for Education and the Labour Market (ROA).
  20. Fortson, Kenneth & Gleason, Philip & Kopa, Emma & Verbitsky-Savitz, Natalya, 2015. "Horseshoes, hand grenades, and treatment effects? Reassessing whether nonexperimental estimators are biased," Economics of Education Review, Elsevier, vol. 44(C), pages 100-113.
  21. V. Joseph Hotz & Guido W. Imbens & Jacob A. Klerman, 2006. "Evaluating the Differential Effects of Alternative Welfare-to-Work Training Components: A Reanalysis of the California GAIN Program," Journal of Labor Economics, University of Chicago Press, vol. 24(3), pages 521-566, July.
  22. Heckman, James J. & Lalonde, Robert J. & Smith, Jeffrey A., 1999. "The economics and econometrics of active labor market programs," Handbook of Labor Economics, in: O. Ashenfelter & D. Card (ed.), Handbook of Labor Economics, edition 1, volume 3, chapter 31, pages 1865-2097, Elsevier.
  23. Paxson, Christina & Schady, Norbert, 1999. "Do school facilities matter? The case of the Peruvian Social Fund (FONCODES)," Policy Research Working Paper Series 2229, The World Bank.
  24. Antonio Trujillo & Jorge Portillo & John Vernon, 2005. "The Impact of Subsidized Health Insurance for the Poor: Evaluating the Colombian Experience Using Propensity Score Matching," International Journal of Health Care Finance and Economics, Springer, vol. 5(3), pages 211-239, September.
This information is provided to you by IDEAS at the Research Division of the Federal Reserve Bank of St. Louis using RePEc data.