Printed from https://ideas.repec.org/p/feb/natura/00256.html

Retrospective vs. prospective analyses of school inputs: The case of flip charts in Kenya

Author

Listed:
  • Paul Glewwe
  • Michael Kremer
  • Sylvie Moulin
  • Eric Zitzewitz

Abstract

This paper compares retrospective and prospective analyses of the effect of flip charts on test scores in rural Kenyan schools. Retrospective estimates that focus on subjects for which flip charts are used suggest that flip charts raise test scores by up to 20 percent of a standard deviation. Controlling for other educational inputs does not reduce this estimate. In contrast, prospective estimators based on a study of 178 schools, half of which were randomly selected to receive charts, provide no evidence that flip charts increase test scores. One interpretation is that the retrospective results were subject to omitted variable bias despite the inclusion of control variables. If the direction of omitted variable bias were similar in other retrospective analyses of educational inputs in developing countries, the effects of inputs may be even more modest than retrospective studies suggest. Bias appears to be reduced by a differences-in-differences estimator that examines the impact of flip charts on the relative performance of students in flip chart and other subjects across schools with and without flip charts, but it is not clear that this approach is applicable more generally.
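The differences-in-differences estimator described in the abstract compares the gap between flip-chart and non-flip-chart subjects across schools that did and did not receive charts. A minimal sketch of that comparison, using simulated school-level scores (all numbers are invented for illustration and are not the study's data; the group sizes merely echo the 178-school design):

```python
# Illustrative differences-in-differences comparison of the kind the
# abstract describes: the flip-chart-subject vs. other-subject score gap,
# contrasted across schools with and without flip charts.
# All scores below are simulated; this is not the authors' data.
import random

random.seed(0)

def mean(xs):
    return sum(xs) / len(xs)

n = 89  # half of the 178 study schools in each arm (for illustration)

# Simulated school-level mean scores in standard-deviation units,
# drawn with no true treatment effect.
treated_flip  = [random.gauss(0.0, 1.0) for _ in range(n)]
treated_other = [random.gauss(0.0, 1.0) for _ in range(n)]
control_flip  = [random.gauss(0.0, 1.0) for _ in range(n)]
control_other = [random.gauss(0.0, 1.0) for _ in range(n)]

# Comparing within-school subject gaps across treatment arms nets out
# school-wide factors that affect all subjects equally.
did = (mean(treated_flip) - mean(treated_other)) \
    - (mean(control_flip) - mean(control_other))
print(f"DiD estimate: {did:.3f}")
```

With no true effect built in, the estimate should hover near zero, up to sampling noise; the estimator's appeal is that any school-level confounder common to all subjects cancels out of the within-school difference.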

Suggested Citation

  • Paul Glewwe & Michael Kremer & Sylvie Moulin & Eric Zitzewitz, 2004. "Retrospective vs. prospective analyses of school inputs: The case of flip charts in Kenya," Natural Field Experiments 00256, The Field Experiments Website.
  • Handle: RePEc:feb:natura:00256

    Download full text from publisher

    File URL: http://s3.amazonaws.com/fieldexperiments-papers2/papers/00256.pdf
    Download Restriction: no


    References listed on IDEAS

    1. Robert J. LaLonde, 1984. "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," Working Papers 563, Princeton University, Department of Economics, Industrial Relations Section.
    2. Hanushek, Eric A, 1995. "Interpreting Recent Research on Schooling in Developing Countries," The World Bank Research Observer, World Bank, vol. 10(2), pages 227-246, August.
    3. Krueger, Alan B & Whitmore, Diane M, 2001. "The Effect of Attending a Small Class in the Early Grades on College-Test Taking and Middle School Test Results: Evidence from Project STAR," Economic Journal, Royal Economic Society, vol. 111(468), pages 1-28, January.
    4. LaLonde, Robert J, 1986. "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," American Economic Review, American Economic Association, vol. 76(4), pages 604-620, September.
    5. Eric A. Hanushek & Steven G. Rivkin, 1996. "Understanding the 20th Century Growth in U.S. School Spending," NBER Working Papers 5547, National Bureau of Economic Research, Inc.
    6. Hanushek, Eric A, 1986. "The Economics of Schooling: Production and Efficiency in Public Schools," Journal of Economic Literature, American Economic Association, vol. 24(3), pages 1141-1177, September.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Abhijit V. Banerjee & Shawn Cole & Esther Duflo & Leigh Linden, 2007. "Remedying Education: Evidence from Two Randomized Experiments in India," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 122(3), pages 1235-1264.
    2. Joshua D. Angrist & Jörn-Steffen Pischke, 2010. "The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics," Journal of Economic Perspectives, American Economic Association, vol. 24(2), pages 3-30, Spring.
    3. Peter Hull & Michal Kolesár & Christopher Walters, 2022. "Labor by design: contributions of David Card, Joshua Angrist, and Guido Imbens," Scandinavian Journal of Economics, Wiley Blackwell, vol. 124(3), pages 603-645, July.
    4. Wößmann, Ludger, 2001. "New Evidence on the Missing Resource-Performance Link in Education," Kiel Working Papers 1051, Kiel Institute for the World Economy (IfW Kiel).
    5. Kim, Jooseop, 1998. "Two case studies on impact evaluation of education projects," ISU General Staff Papers 1998010108000012622, Iowa State University, Department of Economics.
    6. Joshua D. Angrist, 2022. "Empirical Strategies in Economics: Illuminating the Path From Cause to Effect," Econometrica, Econometric Society, vol. 90(6), pages 2509-2539, November.
    7. Charles T. Clotfelter & Helen F. Ladd & Jacob L. Vigdor, 2006. "Teacher-Student Matching and the Assessment of Teacher Effectiveness," Journal of Human Resources, University of Wisconsin Press, vol. 41(4).
    8. Larru, Jose Maria, 2007. "La evaluación de impacto: qué es, cómo se mide y qué está aportando en la cooperación al desarrollo [Impact Assessment and Evaluation: What it is it, how can it be measured and what it is adding to," MPRA Paper 6928, University Library of Munich, Germany.
    9. Thomas D. Cook, 2003. "Why have Educational Evaluators Chosen Not to Do Randomized Experiments?," The ANNALS of the American Academy of Political and Social Science, , vol. 589(1), pages 114-149, September.
    10. Tarek Azzam & Michael Bates & David Fairris, 2019. "Do Learning Communities Increase First Year College Retention? Testing Sample Selection and External Validity of Randomized Control Trials," Working Papers 202002, University of California at Riverside, Department of Economics.
    11. Wößmann, Ludger & West, Martin, 2006. "Class-size effects in school systems around the world: Evidence from between-grade variation in TIMSS," European Economic Review, Elsevier, vol. 50(3), pages 695-736, April.
    12. Baird, Matthew D. & Engberg, John & Gutierrez, Italo A., 2022. "RCT evidence on differential impact of US job training programmes by pre-training employment status," Labour Economics, Elsevier, vol. 75(C).
    13. Dettmann, E. & Becker, C. & Schmeißer, C., 2011. "Distance functions for matching in small samples," Computational Statistics & Data Analysis, Elsevier, vol. 55(5), pages 1942-1960, May.
    14. David Card, 2022. "Design-Based Research in Empirical Microeconomics," Working Papers 654, Princeton University, Department of Economics, Industrial Relations Section.
    15. James J. Heckman, 1991. "Randomization and Social Policy Evaluation Revisited," NBER Technical Working Papers 0107, National Bureau of Economic Research, Inc.
    16. Jeffrey Smith, 2000. "A Critical Survey of Empirical Methods for Evaluating Active Labor Market Policies," Swiss Journal of Economics and Statistics (SJES), Swiss Society of Economics and Statistics (SSES), vol. 136(III), pages 247-268, September.
    17. Sant’Anna, Pedro H.C. & Zhao, Jun, 2020. "Doubly robust difference-in-differences estimators," Journal of Econometrics, Elsevier, vol. 219(1), pages 101-122.
    18. Kitagawa, Toru & Muris, Chris, 2016. "Model averaging in semiparametric estimation of treatment effects," Journal of Econometrics, Elsevier, vol. 193(1), pages 271-289.
    19. Rajeev Dehejia, 2013. "The Porous Dialectic: Experimental and Non-Experimental Methods in Development Economics," WIDER Working Paper Series wp-2013-011, World Institute for Development Economic Research (UNU-WIDER).
    20. Michael Gerfin & Michael Lechner, 2002. "A Microeconometric Evaluation of the Active Labour Market Policy in Switzerland," Economic Journal, Royal Economic Society, vol. 112(482), pages 854-893, October.

    More about this item

    JEL classification:

    • I21 - Health, Education, and Welfare - - Education - - - Analysis of Education
    • N37 - Economic History - - Labor and Consumers, Demography, Education, Health, Welfare, Income, Wealth, Religion, and Philanthropy - - - Africa; Oceania


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:feb:natura:00256. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Francesca Pagnotta (email available below). General contact details of provider: http://www.fieldexperiments.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.