
Citations for "Evaluating Program Evaluations: New Evidence on Commonly Used Nonexperimental Methods"

by Friedlander, Daniel & Robins, Philip K.


  1. Astrid Grasdal, 2001. "The performance of sample selection estimators to control for attrition bias," Health Economics, John Wiley & Sons, Ltd., vol. 10(5), pages 385-398.
  2. Imbens, Guido W. & Wooldridge, Jeffrey M., 2008. "Recent Developments in the Econometrics of Program Evaluation," IZA Discussion Papers 3640, Institute for the Study of Labor (IZA).
  3. Fitzenberger, Bernd & Prey, Hedwig, 1996. "Training in East Germany: An evaluation of the effects on employment and wages," Discussion Papers 36, University of Konstanz, Center for International Labor Economics (CILE).
  4. Flores, Carlos A. & Mitnik, Oscar A., 2009. "Evaluating Nonexperimental Estimators for Multiple Treatments: Evidence from Experimental Data," IZA Discussion Papers 4451, Institute for the Study of Labor (IZA).
  5. Arpino, Bruno & Mealli, Fabrizia, 2011. "The specification of the propensity score in multilevel observational studies," Computational Statistics & Data Analysis, Elsevier, vol. 55(4), pages 1770-1780, April.
  6. Robert J. LaLonde, 2003. "Employment and Training Programs," NBER Chapters, in: Means-Tested Transfer Programs in the United States, pages 517-586, National Bureau of Economic Research, Inc.
  7. Priya Nanda, 1999. "Women's participation in rural credit programmes in Bangladesh and their demand for formal health care: is there a positive impact?," Health Economics, John Wiley & Sons, Ltd., vol. 8(5), pages 415-428.
  8. Carlos A. Flores & Oscar A. Mitnik, 2011. "Comparing Treatments across Labor Markets: An Assessment of Nonexperimental Multiple-Treatment Strategies," Working Papers 2011-10, University of Miami, Department of Economics.
  9. Lechner, Michael & Wunsch, Conny, 2011. "Sensitivity of matching-based program evaluations to the availability of control variables," CEPR Discussion Papers 8294, C.E.P.R. Discussion Papers.
  10. V. Joseph Hotz & Guido W. Imbens & Jacob A. Klerman, 2006. "Evaluating the Differential Effects of Alternative Welfare-to-Work Training Components: A Reanalysis of the California GAIN Program," Journal of Labor Economics, University of Chicago Press, vol. 24(3), pages 521-566, July.
  11. Jeffrey Smith & Petra Todd, 2003. "Does Matching Overcome LaLonde's Critique of Nonexperimental Estimators?," University of Western Ontario, CIBC Centre for Human Capital and Productivity Working Papers 20035, University of Western Ontario, CIBC Centre for Human Capital and Productivity.
  12. Heckman, James J. & Lalonde, Robert J. & Smith, Jeffrey A., 1999. "The economics and econometrics of active labor market programs," Handbook of Labor Economics, in: O. Ashenfelter & D. Card (ed.), Handbook of Labor Economics, edition 1, volume 3, chapter 31, pages 1865-2097, Elsevier.
  13. World Bank, 2001. "Brazil: Eradicating Child Labor in Brazil," World Bank Other Operational Studies 15465, The World Bank.
  14. Antonio Trujillo & Jorge Portillo & John Vernon, 2005. "The Impact of Subsidized Health Insurance for the Poor: Evaluating the Colombian Experience Using Propensity Score Matching," International Journal of Health Care Finance and Economics, Springer, vol. 5(3), pages 211-239, September.
  15. Sudhanshu Handa & John Maluccio, 2008. "Matching the gold standard: Comparing experimental and non-experimental evaluation techniques for a geographically targeted program," Middlebury College Working Paper Series 0813, Middlebury College, Department of Economics.
  16. Richard P. Nathan, 2008. "The role of random assignment in social policy research," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 27(2), pages 401-415.
  17. Paxson, Christina & Schady, Norbert, 1999. "Do school facilities matter? The case of the Peruvian Social Fund (FONCODES)," Policy Research Working Paper Series 2229, The World Bank.
  18. Dolton, Peter & Smith, Jeffrey A., 2011. "The Impact of the UK New Deal for Lone Parents on Benefit Receipt," IZA Discussion Papers 5491, Institute for the Study of Labor (IZA).
  19. de Crombrugghe, Denis & Espinoza, Henry & Heijke, Hans, 2010. "Job-training programmes with low completion rates: The case of Projoven-Peru," ROA Research Memorandum 004, Maastricht University, Research Centre for Education and the Labour Market (ROA).
  20. Fortson, Kenneth & Gleason, Philip & Kopa, Emma & Verbitsky-Savitz, Natalya, 2015. "Horseshoes, hand grenades, and treatment effects? Reassessing whether nonexperimental estimators are biased," Economics of Education Review, Elsevier, vol. 44(C), pages 100-113.
  21. Elizabeth Ty Wilde & Robinson Hollister, 2007. "How close is close enough? Evaluating propensity score matching using data from a class size reduction experiment," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 26(3), pages 455-477.
  22. Peter R. Mueser & Kenneth R. Troske & Alexey Gorislavsky, 2007. "Using State Administrative Data to Measure Program Performance," The Review of Economics and Statistics, MIT Press, vol. 89(4), pages 761-783, November.
  23. Martin Valdivia, 2010. "Contracting the Road to Development: Early Impacts of a Rural Roads Program," Working Papers PMMA 2010-18, PEP-PMMA.
  24. Milu Muyanga & John Olwande & Esther Mueni & Stella Wambugu, 2010. "Free Primary Education in Kenya: An Impact Evaluation Using Propensity Score Methods," Working Papers PMMA 2010-08, PEP-PMMA.
This information is provided to you by IDEAS at the Research Division of the Federal Reserve Bank of St. Louis using RePEc data.