
Citations for "Learning about Treatment Effects from Experiments with Random Assignment of Treatments"

by Charles F. Manski


  1. Guido W. Imbens, 2010. "Better LATE Than Nothing: Some Comments on Deaton (2009) and Heckman and Urzua (2009)," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 399-423, June.
  2. Peter R. Mueser & Kenneth R. Troske & Alexey Gorislavsky, 2006. "Using State Administrative Data to Measure Program Performance," Working Papers 0702, Department of Economics, University of Missouri.
  3. John V. Pepper, 1999. "What Do Welfare-to-Work Demonstrations Reveal to Welfare Reformers?," Virginia Economics Online Papers 317, University of Virginia, Department of Economics.
  4. Angus Deaton, 2009. "Instruments of development: Randomization in the tropics, and the search for the elusive keys to economic development," Working Papers 1128, Princeton University, Woodrow Wilson School of Public and International Affairs, Center for Health and Wellbeing.
  5. Eric D. Gould & Victor Lavy & M. Daniele Paserman, 2004. "Immigrating to Opportunity: Estimating the Effect of School Quality Using a Natural Experiment on Ethiopians in Israel," The Quarterly Journal of Economics, Oxford University Press, vol. 119(2), pages 489-526.
  6. Manski, Charles F., 2002. "Identification of decision rules in experiments on simple games of proposal and response," European Economic Review, Elsevier, vol. 46(4-5), pages 880-891, May.
  7. Daniel A. Ackerberg & Matilde P. Machado & Michael H. Riordan, 2001. "Measuring the Relative Performance of Providers of a Health Service," NBER Working Papers 8385, National Bureau of Economic Research, Inc.
  8. Bronchetti, Erin Todd, 2012. "Workers' compensation and consumption smoothing," Journal of Public Economics, Elsevier, vol. 96(5), pages 495-508.
  9. Valdés, Nieves, 2008. "Did PROGRESA send drop-outs back to school?," UC3M Working papers. Economics we085926, Universidad Carlos III de Madrid. Departamento de Economía.
  10. Charles F. Manski & John V. Pepper, 1998. "Monotone Instrumental Variables: With an Application to the Returns to Schooling," Virginia Economics Online Papers 308, University of Virginia, Department of Economics.
  11. Angus Deaton, 2010. "Instruments, Randomization, and Learning about Development," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 424-455, June.
  12. Fabio Soares & Yuri Suarez Dillon Soares, 2005. "The Socio-Economic Impact of Favela-Bairro: What do the Data Say?," OVE Working Papers 0805, Inter-American Development Bank, Office of Evaluation and Oversight (OVE).
  13. Jens Otto Ludwig & Greg Duncan & Joshua C. Pinkston, 2000. "Neighborhood Effects on Economic Self-Sufficiency: Evidence from a Randomized Housing-Mobility Experiment," JCPR Working Papers 159, Northwestern University/University of Chicago Joint Center for Poverty Research.
  14. Barbara Sianesi, 2014. "Dealing with randomisation bias in a social experiment: the case of ERA," IFS Working Papers W14/10, Institute for Fiscal Studies.
  15. Robert A. Pollak, 1998. "Notes on How Economists Think...," JCPR Working Papers 35, Northwestern University/University of Chicago Joint Center for Poverty Research.
  16. Cristian Aedo & Sergio Nuñez, 2004. "The Impact of Training Policies in Latin America and the Caribbean: The Case of Programa Joven," Research Department Publications 3175, Inter-American Development Bank, Research Department.
  17. Cristian Aedo & Sergio Nuñez, 2004. "Efectos de las políticas de capacitación en América Latina y el Caribe: el caso del Programa Joven" [Effects of Training Policies in Latin America and the Caribbean: The Case of Programa Joven], Research Department Publications 3176, Inter-American Development Bank, Research Department.
  18. Barbara Sianesi, 2013. "Dealing with randomisation bias in a social experiment exploiting the randomisation itself: the case of ERA," IFS Working Papers W13/15, Institute for Fiscal Studies.
  19. King, Elizabeth M. & Behrman, Jere R., 2008. "Timing and duration of exposure in evaluations of social programs," Policy Research Working Paper Series 4686, The World Bank.
  20. Bronchetti, Erin Todd, 2014. "Public insurance expansions and the health of immigrant and native children," Journal of Public Economics, Elsevier, vol. 120(C), pages 205-219.
  21. Greg Duncan & Stephen W. Raudenbush, 1998. "Neighborhoods and Adolescent Development: How Can We Determine the Links?," JCPR Working Papers 59, Northwestern University/University of Chicago Joint Center for Poverty Research.
  22. Hunt Allcott, 2012. "Site Selection Bias in Program Evaluation," NBER Working Papers 18373, National Bureau of Economic Research, Inc.
  23. Manski, Charles F., 2000. "Identification problems and decisions under ambiguity: Empirical analysis of treatment response and normative analysis of treatment choice," Journal of Econometrics, Elsevier, vol. 95(2), pages 415-442, April.
  24. Asensio, Omar Isaac & Delmas, Magali A., 2016. "The dynamics of behavior change: Evidence from energy conservation," Journal of Economic Behavior & Organization, Elsevier, vol. 126(PA), pages 196-212.
  25. Andreas Hildenbrand & Rainer Kühl & Anne Piper, 2016. "On the Credibility Determinants of a Quality Label: a Quasi-Natural Experiment Using the Example of Stiftung Warentest," Journal of Consumer Policy, Springer, vol. 39(3), pages 307-325, September.
  26. John V. Pepper, 2003. "Using Experiments to Evaluate Performance Standards: What Do Welfare-to-Work Demonstrations Reveal to Welfare Reformers?," Journal of Human Resources, University of Wisconsin Press, vol. 38(4).
This information is provided to you by IDEAS at the Research Division of the Federal Reserve Bank of St. Louis using RePEc data.