
Daniel Friedlander & Philip K. Robins, 1995. "Evaluating Program Evaluations: New Evidence on Commonly Used Nonexperimental Methods," American Economic Review, American Economic Association, vol. 85(4), pages 923-937.

Citations

Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


Cited by:

  1. Fitzenberger, Bernd & Prey, Hedwig, 1996. "Training in East Germany: An evaluation of the effects on employment and wages," Discussion Papers 36, University of Konstanz, Center for International Labor Economics (CILE).
  2. Astrid Grasdal, 2001. "The performance of sample selection estimators to control for attrition bias," Health Economics, John Wiley & Sons, Ltd., vol. 10(5), pages 385-398.
  3. Carlos A. Flores & Oscar A. Mitnik, 2013. "Comparing Treatments across Labor Markets: An Assessment of Nonexperimental Multiple-Treatment Strategies," The Review of Economics and Statistics, MIT Press, vol. 95(5), pages 1691-1707, December.
  4. World Bank, 2001. "Brazil : Eradicating Child Labor in Brazil," World Bank Other Operational Studies 15465, The World Bank.
  5. Fortson, Kenneth & Gleason, Philip & Kopa, Emma & Verbitsky-Savitz, Natalya, 2015. "Horseshoes, hand grenades, and treatment effects? Reassessing whether nonexperimental estimators are biased," Economics of Education Review, Elsevier, vol. 44(C), pages 100-113.
  6. Dolton, Peter & Smith, Jeffrey A., 2011. "The Impact of the UK New Deal for Lone Parents on Benefit Receipt," IZA Discussion Papers 5491, Institute for the Study of Labor (IZA).
  7. Arpino, Bruno & Mealli, Fabrizia, 2011. "The specification of the propensity score in multilevel observational studies," Computational Statistics & Data Analysis, Elsevier, pages 1770-1780.
  8. Robert J. LaLonde, 2003. "Employment and Training Programs," NBER Chapters, in: Means-Tested Transfer Programs in the United States, pages 517-586, National Bureau of Economic Research, Inc.
  9. Flores, Carlos A. & Mitnik, Oscar A., 2009. "Evaluating Nonexperimental Estimators for Multiple Treatments: Evidence from Experimental Data," IZA Discussion Papers 4451, Institute for the Study of Labor (IZA).
  10. Milu Muyanga & John Olwande & Esther Mueni & Stella Wambugu, 2010. "Free Primary Education in Kenya: An Impact Evaluation Using Propensity Score Methods," Working Papers PMMA 2010-08, PEP-PMMA.
  11. Michael J. Camasso, 2004. "Isolating the Family Cap Effect on Fertility Behavior: Evidence From New Jersey's Family Development Program Experiment," Contemporary Economic Policy, Western Economic Association International, vol. 22(4), pages 453-467, October.
  12. Hervé Cardot & Antonio Musolesi, 2017. "Modeling temporal treatment effects with zero inflated semi-parametric regression models: the case of local development policies in France," SEEDS Working Papers 0317, SEEDS, Sustainability Environmental Economics and Dynamics Studies, revised Aug 2017.
  13. Lechner, Michael & Wunsch, Conny, 2013. "Sensitivity of matching-based program evaluations to the availability of control variables," Labour Economics, Elsevier, vol. 21(C), pages 111-121.
  14. Peter R. Mueser & Kenneth R. Troske & Alexey Gorislavsky, 2007. "Using State Administrative Data to Measure Program Performance," The Review of Economics and Statistics, MIT Press, vol. 89(4), pages 761-783, November.
  15. Heckman, James J. & Lalonde, Robert J. & Smith, Jeffrey A., 1999. "The economics and econometrics of active labor market programs," Handbook of Labor Economics, in: O. Ashenfelter & D. Card (ed.), Handbook of Labor Economics, edition 1, volume 3, chapter 31, pages 1865-2097, Elsevier.
  16. Burda, Martin & Harding, Matthew & Hausman, Jerry, 2008. "A Bayesian mixed logit-probit model for multinomial choice," Journal of Econometrics, Elsevier, pages 232-246.
  17. Crombrugghe, Denis de & Espinoza, Henry & Heijke, Hans, 2010. "Job-training programmes with low completion rates: The case of Projoven-Peru," ROA Research Memorandum 004, Maastricht University, Research Centre for Education and the Labour Market (ROA).
  18. V. Joseph Hotz & Guido W. Imbens & Jacob A. Klerman, 2006. "Evaluating the Differential Effects of Alternative Welfare-to-Work Training Components: A Reanalysis of the California GAIN Program," Journal of Labor Economics, University of Chicago Press, vol. 24(3), pages 521-566, July.
  19. Priya Nanda, 1999. "Women's participation in rural credit programmes in Bangladesh and their demand for formal health care: is there a positive impact?," Health Economics, John Wiley & Sons, Ltd., vol. 8(5), pages 415-428.
  20. Valdivia, Martín, 2009. "Contracting the road to development: early impacts of a rural roads program," Socioeconomic research working papers 203, CAF Development Bank Of Latinamerica.
  21. Antonio Trujillo & Jorge Portillo & John Vernon, 2005. "The Impact of Subsidized Health Insurance for the Poor: Evaluating the Colombian Experience Using Propensity Score Matching," International Journal of Health Economics and Management, Springer, vol. 5(3), pages 211-239, September.
  22. Paxson, Christina & Schady, Norbert, 1999. "Do school facilities matter? The case of the Peruvian Social Fund (FONCODES)," Policy Research Working Paper Series 2229, The World Bank.
  23. Smith, Jeffrey A. & Todd, Petra E., 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
  24. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, pages 5-86.
  25. Elizabeth Ty Wilde & Robinson Hollister, 2007. "How close is close enough? Evaluating propensity score matching using data from a class size reduction experiment," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 26(3), pages 455-477.
  26. Sudhanshu Handa & John A. Maluccio, 2010. "Matching the Gold Standard: Comparing Experimental and Nonexperimental Evaluation Techniques for a Geographically Targeted Program," Economic Development and Cultural Change, University of Chicago Press, pages 415-447.
  27. Richard P. Nathan, 2008. "The role of random assignment in social policy research," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 27(2), pages 401-415.