
It's All about MeE: Using Structured Experiential Learning ("e") to Crawl the Design Space

  • Pritchett, Lant

    (Center for Global Development)

  • Samji, Salimah

    (Center for International Development, Harvard University)

  • Hammer, Jeffrey

    (Princeton University)

There is an inherent tension between implementing organizations--which have specific objectives and narrow missions and mandates--and executive organizations--which provide resources to multiple implementing organizations. Ministries of finance/planning/budgeting allocate across ministries and across projects/programs within ministries; development organizations allocate across sectors (and countries); foundations and philanthropies allocate across programs and grantees. Implementing organizations typically try to do the best they can with the funds they have and to attract more resources, while executive organizations have to decide what and whom to fund. Monitoring and Evaluation (M&E) has always been an element of the accountability of implementing organizations to their funders. There has been a recent trend towards much greater rigor in evaluations to isolate the causal impacts of projects and programs, and towards more 'evidence-based' approaches to accountability and budget allocation. Here we extend the basic idea of rigorous impact evaluation--the use of a valid counterfactual to make judgments about causality--to emphasize that the techniques of impact evaluation can be directly useful to implementing organizations (as opposed to impact evaluation being seen by implementing organizations only as an external threat to their funding). We introduce structured experiential learning (which we add to M&E to get MeE), which allows implementing agencies to actively and rigorously search across alternative project designs using monitoring data that provide real-time performance information with direct feedback into the decision loops of project design and implementation. Our argument is that within-project variations in design can serve as their own counterfactual; this dramatically reduces the incremental cost of evaluation and increases the direct usefulness of evaluation to implementing agencies. The right combination of M, e, and E provides the right space for innovation and organizational capability building while at the same time providing accountability and an evidence base for funding agencies.
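As a rough, hypothetical illustration of the structured experiential learning ("e") idea (not taken from the paper itself), the Python sketch below simulates randomly assigning project sites to alternative design variants and then comparing their monitored outcomes, so that each variant serves as the counterfactual for the others; the variant names, effect sizes, and site counts are invented for the example.

    # Illustrative sketch only: "crawling the design space" by randomly
    # assigning sites to alternative project design variants and comparing
    # outcomes from routine monitoring data. All names and numbers are
    # hypothetical, not drawn from the paper.
    import random
    import statistics

    random.seed(0)

    DESIGN_VARIANTS = ["A", "B", "C"]   # e.g. three delivery modalities
    N_SITES = 300                       # project sites (villages, schools, ...)

    # Step 1: randomly assign each site to one design variant at rollout.
    assignment = {site: random.choice(DESIGN_VARIANTS) for site in range(N_SITES)}

    # Step 2: routine monitoring ("M") yields an outcome per site
    # (simulated here; in practice this comes from the project's own data).
    def simulated_outcome(variant: str) -> float:
        true_effect = {"A": 0.0, "B": 0.15, "C": 0.05}[variant]  # invented effects
        return true_effect + random.gauss(0, 0.5)

    outcomes = {site: simulated_outcome(v) for site, v in assignment.items()}

    # Step 3: because assignment was random, differences in mean outcomes
    # across variants estimate the relative impact of one design vs. another,
    # feeding directly back into design decisions.
    for variant in DESIGN_VARIANTS:
        values = [outcomes[s] for s, v in assignment.items() if v == variant]
        mean = statistics.mean(values)
        se = statistics.stdev(values) / len(values) ** 0.5
        print(f"variant {variant}: mean outcome {mean:.3f} (SE {se:.3f}), n={len(values)}")

The point of the sketch is only that within-project variation, randomized at rollout, lets the monitoring data itself support causal comparisons across designs at little incremental cost.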


File URL: https://research.hks.harvard.edu/publications/getFile.aspx?Id=931


Download Restriction: no

Paper provided by Harvard University, John F. Kennedy School of Government in its series Working Paper Series with number rwp13-012.


Date of creation: May 2013
Handle: RePEc:ecl:harjfk:rwp13-012
Contact details of provider: Postal: 79 JFK Street, Cambridge, MA 02138
Fax: 617-496-2554
Web page: http://www.ksg.harvard.edu/research/working_papers/index.htm

References listed on IDEAS

  1. Deon Filmer & Jeffrey S. Hammer & Lant H. Pritchett, 2002. "Weak Links in the Chain II: A Prescription for Health Policy in Poor Countries," World Bank Research Observer, World Bank Group, vol. 17(1), pages 47-66.
  2. Pritchett, Lant & Woolcock, Michael, 2004. "Solutions When the Solution is the Problem: Arraying the Disarray in Development," World Development, Elsevier, vol. 32(2), pages 191-212, February.
  3. Hausmann, Ricardo, 2008. "The Other Hand: High Bandwidth Development Policy," Working Paper Series rwp08-060, Harvard University, John F. Kennedy School of Government.
  4. Benjamin A. Olken, 2007. "Monitoring Corruption: Evidence from a Field Experiment in Indonesia," Journal of Political Economy, University of Chicago Press, vol. 115, pages 200-249.
  5. Marianne Bertrand & Dean Karlan & Sendhil Mullainathan & Eldar Shafir & Jonathan Zinman, 2010. "What's Advertising Content Worth? Evidence from a Consumer Credit Marketing Field Experiment," The Quarterly Journal of Economics, MIT Press, vol. 125(1), pages 263-305, February.
  6. Joshua Angrist & Ivan Fernandez-Val, 2010. "ExtrapoLATE-ing: External Validity and Overidentification in the LATE Framework," NBER Working Papers 16566, National Bureau of Economic Research, Inc.
  7. Marianne Bertrand & Dean S. Karlan & Sendhil Mullainathan & Eldar Shafir & Jonathan Zinman, 2005. "What's Psychology Worth? A Field Experiment in the Consumer Credit Market," Working Papers 918, Economic Growth Center, Yale University.
  8. Abhijit Banerjee & Raghabendra Chattopadhyay & Esther Duflo & Daniel Keniston & Nina Singh, 2012. "Can Institutions be Reformed from Within? Evidence from a Randomized Experiment with the Rajasthan Police," Working Papers id:4813, eSocialSciences.
  9. Pritchett, Lant & Andrews, Matthew R. & Woolcock, Michael J., 2012. "Escaping Capability Traps through Problem-Driven Iterative Adaptation (PDIA)," Scholarly Articles 9403175, Harvard Kennedy School of Government.
  10. Ostrom, Elinor, 1996. "Crossing the great divide: Coproduction, synergy, and development," World Development, Elsevier, vol. 24(6), pages 1073-1087, June.
  11. Kraay, Aart & Murrell, Peter, 2013. "Misunderestimating corruption," Policy Research Working Paper Series 6488, The World Bank.
  12. Pritchett, Lant & Woolcock, Michael & Andrews, Matt, 2012. "Looking Like a State: Techniques of Persistent Failure in State Capability for Implementation," UNU-WIDER Research Paper, World Institute for Development Economic Research (UNU-WIDER).
  13. Elizabeth M. King & Jere R. Behrman, 2009. "Timing and Duration of Exposure in Evaluations of Social Programs," World Bank Research Observer, World Bank Group, vol. 24(1), pages 55-82, February.
  14. Daron Acemoglu & Simon Johnson & James Robinson, 2004. "Institutions as the Fundamental Cause of Long-Run Growth," NBER Working Papers 10481, National Bureau of Economic Research, Inc.
  15. Abhijit Banerjee & Raghabendra Chattopadhyay & Esther Duflo & Daniel Keniston & Nina Singh, 2012. "Improving Police Performance in Rajasthan, India: Experimental Evidence on Incentives, Managerial Autonomy and Training," NBER Working Papers 17912, National Bureau of Economic Research, Inc.
  16. Robert Jensen, 2010. "The (Perceived) Returns to Education and the Demand for Schooling," The Quarterly Journal of Economics, MIT Press, vol. 125(2), pages 515-548, May.
  17. Ravallion, Martin, 2011. "On the implications of essential heterogeneity for estimating causal impacts using social experiments," Policy Research Working Paper Series 5804, The World Bank.
  18. Filmer, Deon & Hammer, Jeffrey S & Pritchett, Lant H, 2000. "Weak Links in the Chain: A Diagnosis of Health Policy in Poor Countries," World Bank Research Observer, World Bank Group, vol. 15(2), pages 199-224, August.
  19. Lant Pritchett & Michael Woolcock & Matt Andrews, 2010. "Capability Traps? The Mechanisms of Persistent Implementation Failure - Working Paper 234," Working Papers 234, Center for Global Development.
  20. Denizer, Cevdet & Kaufmann, Daniel & Kraay, Aart, 2013. "Good countries or good projects? Macro and micro correlates of World Bank project performance," Journal of Development Economics, Elsevier, vol. 105(C), pages 288-302.
  21. White, Howard, 2006. "Impact evaluation: the experience of the Independent Evaluation Group of the World Bank," MPRA Paper 1111, University Library of Munich, Germany.
  22. Das, Jishnu & Hammer, Jeffrey, 2007. "Money for nothing: The dire straits of medical practice in Delhi, India," Journal of Development Economics, Elsevier, vol. 83(1), pages 1-36, May.
  23. Jessica Cohen & Pascaline Dupas, 2010. "Free Distribution or Cost-Sharing? Evidence from a Randomized Malaria Prevention Experiment," The Quarterly Journal of Economics, MIT Press, vol. 125(1), pages 1-45, February.
  24. Hammer, Jeffrey S., 1996. "Economic analysis for health projects," Policy Research Working Paper Series 1611, The World Bank.
  25. Dani Rodrik, 2007. "Introduction to One Economics, Many Recipes: Globalization, Institutions, and Economic Growth," Introductory Chapters, Princeton University Press.
  26. Joshua Angrist & Eric Bettinger & Erik Bloom & Elizabeth King & Michael Kremer, 2002. "Vouchers for Private Schooling in Colombia: Evidence from a Randomized Natural Experiment," American Economic Review, American Economic Association, vol. 92(5), pages 1535-1558, December.
  27. Barrera-Osorio, Felipe & Filmer, Deon, 2013. "Incentivizing schooling for learning : evidence on the impact of alternative targeting approaches," Policy Research Working Paper Series 6541, The World Bank.
  28. Devarajan, Shantayanan & Squire, Lyn & Suthiwart-Narueput, Sethaput, 1997. "Beyond Rate of Return: Reorienting Project Appraisal," World Bank Research Observer, World Bank Group, vol. 12(1), pages 35-46, February.
  29. Sanjeev Khagram & Craig Thomas & Catrina Lucero & Subarna Mathes, 2009. "Evidence for development effectiveness," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 1(3), pages 247-270.
  30. Baird, Sarah & McIntosh, Craig & Ozler, Berk, 2009. "Designing cost-effective cash transfer programs to boost schooling among young women in Sub-Saharan Africa," Policy Research Working Paper Series 5090, The World Bank.
  31. Rodrik, Dani, 2008. "The New Development Economics: We Shall Experiment, but How Shall We Learn?," Working Paper Series rwp08-055, Harvard University, John F. Kennedy School of Government.
  32. Lant Pritchett, 2002. "It pays to be ignorant: A simple political economy of rigorous program evaluation," Journal of Economic Policy Reform, Taylor & Francis Journals, vol. 5(4), pages 251-269.
