
It's All about MeE: Using Structured Experiential Learning ("e") to Crawl the Design Space

Author

  • Pritchett, Lant (Center for Global Development)
  • Samji, Salimah (Center for International Development, Harvard University)
  • Hammer, Jeffrey (Princeton University)

Abstract

There is an inherent tension between implementing organizations--which have specific objectives and narrow missions and mandates--and executive organizations--which provide resources to multiple implementing organizations. Ministries of finance/planning/budgeting allocate across ministries and projects/programs within ministries, development organizations allocate across sectors (and countries), foundations or philanthropies allocate across programs/grantees. Implementing organizations typically try to do the best they can with the funds they have and attract more resources, while executive organizations have to decide what and who to fund. Monitoring and Evaluation (M&E) has always been an element of the accountability of implementing organizations to their funders. There has been a recent trend towards much greater rigor in evaluations to isolate causal impacts of projects and programs and more 'evidence-based' approaches to accountability and budget allocations. Here we extend the basic idea of rigorous impact evaluation--the use of a valid counterfactual to make judgments about causality--to emphasize that the techniques of impact evaluation can be directly useful to implementing organizations (as opposed to impact evaluation being seen by implementing organizations as only an external threat to their funding). We introduce structured experiential learning (which we add to M&E to get MeE) which allows implementing agencies to actively and rigorously search across alternative project designs using the monitoring data that provides real-time performance information with direct feedback into the decision loops of project design and implementation. Our argument is that within-project variations in design can serve as their own counterfactual and this dramatically reduces the incremental cost of evaluation and increases the direct usefulness of evaluation to implementing agencies. The right combination of M, e, and E provides the right space for innovation and organizational capability building while at the same time providing accountability and an evidence base for funding agencies.
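
The paper gives no algorithm, but the abstract's core mechanism -- treating within-project design variants as each other's counterfactual, with monitoring data feeding directly back into design decisions -- can be illustrated with a small adaptive-allocation sketch. The sketch below is not the authors' method: the design variants, their "true" success rates, and the Thompson-sampling allocation rule are purely illustrative assumptions.

```python
# Illustrative sketch only: adaptive search across hypothetical project design
# variants, where each variant's monitoring data serves as the comparison for
# the others. Nothing here comes from the paper; names and numbers are made up.
import random

DESIGNS = ["design_A", "design_B", "design_C"]   # hypothetical design variants
TRUE_RATES = {"design_A": 0.30, "design_B": 0.45, "design_C": 0.40}  # unknown to the implementer

# Beta(1, 1) priors over each design's success rate.
posterior = {d: {"s": 1, "f": 1} for d in DESIGNS}

def monitor_outcome(design):
    """Stand-in for real-time monitoring data (the 'M' in MeE)."""
    return random.random() < TRUE_RATES[design]

def choose_design():
    """Thompson sampling: draw from each posterior and pick the best draw."""
    draws = {d: random.betavariate(p["s"], p["f"]) for d, p in posterior.items()}
    return max(draws, key=draws.get)

for _ in range(1000):                  # successive implementation decisions
    d = choose_design()                # the 'e': structured search over designs
    if monitor_outcome(d):
        posterior[d]["s"] += 1
    else:
        posterior[d]["f"] += 1

for d in DESIGNS:
    trials = posterior[d]["s"] + posterior[d]["f"] - 2
    mean = posterior[d]["s"] / (posterior[d]["s"] + posterior[d]["f"])
    print(f"{d}: tried {trials} times, posterior mean success rate {mean:.2f}")
```

Run over many implementation rounds, the loop shifts effort toward the better-performing variant while the accumulated monitoring data double as the cross-variant comparison -- the sense in which the "e" in MeE crawls the design space.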

Suggested Citation

  • Pritchett, Lant & Samji, Salimah & Hammer, Jeffrey, 2013. "It's All about MeE: Using Structured Experiential Learning ("e") to Crawl the Design Space," Working Paper Series rwp13-012, Harvard University, John F. Kennedy School of Government.
  • Handle: RePEc:ecl:harjfk:rwp13-012

    Download full text from publisher

    File URL: https://research.hks.harvard.edu/publications/workingpapers/citation.aspx?PubId=8975&type=WPN
    Download Restriction: no

    References listed on IDEAS

    1. Hammer, Jeffrey S, 1997. "Economic Analysis for Health Projects," World Bank Research Observer, World Bank Group, vol. 12(1), pages 47-71, February.
    2. Baird, Sarah & McIntosh, Craig & Ozler, Berk, 2009. "Designing cost-effective cash transfer programs to boost schooling among young women in Sub-Saharan Africa," Policy Research Working Paper Series 5090, The World Bank.
    3. Ravallion, Martin, 2015. "On the Implications of Essential Heterogeneity for Estimating Causal Impacts Using Social Experiments," Journal of Econometric Methods, De Gruyter, vol. 4(1), pages 1-7, January.
    4. Abhijit Banerjee & Raghabendra Chattopadhyay & Esther Duflo & Daniel Keniston & Nina Singh, 2012. "Can Institutions be Reformed from Within? Evidence from a Randomized Experiment with the Rajasthan Police," Working Papers id:4813, eSocialSciences.
    5. Spears, Dean, 2013. "Policy Lessons from the Implementation of India’s Total Sanitation Campaign," India Policy Forum, National Council of Applied Economic Research, vol. 9(1), pages 63-104.
    6. Andrews, Matt & Pritchett, Lant & Woolcock, Michael, 2013. "Escaping Capability Traps Through Problem Driven Iterative Adaptation (PDIA)," World Development, Elsevier, vol. 51(C), pages 234-244.
    7. Elizabeth M. King & Jere R. Behrman, 2009. "Timing and Duration of Exposure in Evaluations of Social Programs," World Bank Research Observer, World Bank Group, vol. 24(1), pages 55-82, February.
    8. Sanjeev Khagram & Craig Thomas & Catrina Lucero & Subarna Mathes, 2009. "Evidence for development effectiveness," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 1(3), pages 247-270.
    9. Marianne Bertrand & Dean Karlan & Sendhil Mullainathan & Eldar Shafir & Jonathan Zinman, 2010. "What's Advertising Content Worth? Evidence from a Consumer Credit Marketing Field Experiment," The Quarterly Journal of Economics, Oxford University Press, vol. 125(1), pages 263-306.
    10. White, Howard, 2006. "Impact evaluation: the experience of the Independent Evaluation Group of the World Bank," MPRA Paper 1111, University Library of Munich, Germany.
    11. Joshua Angrist & Eric Bettinger & Erik Bloom & Elizabeth King & Michael Kremer, 2002. "Vouchers for Private Schooling in Colombia: Evidence from a Randomized Natural Experiment," American Economic Review, American Economic Association, vol. 92(5), pages 1535-1558, December.
    12. Lant Pritchett & Michael Woolcock & Matt Andrews, 2013. "Looking Like a State: Techniques of Persistent Failure in State Capability for Implementation," Journal of Development Studies, Taylor & Francis Journals, vol. 49(1), pages 1-18, January.
    13. Abhijit Banerjee & Raghabendra Chattopadhyay & Esther Duflo & Daniel Keniston & Nina Singh, 2021. "Improving Police Performance in Rajasthan, India: Experimental Evidence on Incentives, Managerial Autonomy, and Training," American Economic Journal: Economic Policy, American Economic Association, vol. 13(1), pages 36-66, February.
    14. Henry Mintzberg & James A. Waters, 1985. "Of strategies, deliberate and emergent," Strategic Management Journal, Wiley Blackwell, vol. 6(3), pages 257-272, July.
    15. Jessica Cohen & Pascaline Dupas, 2010. "Free Distribution or Cost-Sharing? Evidence from a Randomized Malaria Prevention Experiment," The Quarterly Journal of Economics, Oxford University Press, vol. 125(1), pages 1-45.
    16. Acemoglu, Daron & Johnson, Simon & Robinson, James A., 2005. "Institutions as a Fundamental Cause of Long-Run Growth," Handbook of Economic Growth, in: Philippe Aghion & Steven Durlauf (ed.), Handbook of Economic Growth, edition 1, volume 1, chapter 6, pages 385-472, Elsevier.
    17. Benjamin A. Olken, 2007. "Monitoring Corruption: Evidence from a Field Experiment in Indonesia," Journal of Political Economy, University of Chicago Press, vol. 115, pages 200-249.
    18. Pritchett, Lant & Woolcock, Michael, 2004. "Solutions When the Solution is the Problem: Arraying the Disarray in Development," World Development, Elsevier, vol. 32(2), pages 191-212, February.
    19. Denizer, Cevdet & Kaufmann, Daniel & Kraay, Aart, 2013. "Good countries or good projects? Macro and micro correlates of World Bank project performance," Journal of Development Economics, Elsevier, vol. 105(C), pages 288-302.
    20. Das, Jishnu & Hammer, Jeffrey, 2007. "Money for nothing: The dire straits of medical practice in Delhi, India," Journal of Development Economics, Elsevier, vol. 83(1), pages 1-36, May.
    21. Dani Rodrik, 2007. "Introduction to One Economics, Many Recipes: Globalization, Institutions, and Economic Growth," Introductory Chapters, in: One Economics, Many Recipes: Globalization, Institutions, and Economic Growth, Princeton University Press.
    22. Babalwa Zani & Elizabeth D Pienaar & Joy Oliver & Nandi Siegfried, 2011. "Randomized Controlled Trials of HIV/AIDS Prevention and Treatment in Africa: Results from the Cochrane HIV/AIDS Specialized Register," PLOS ONE, Public Library of Science, vol. 6(12), pages 1-9, December.
    23. Felipe Barrera-Osorio & Deon Filmer, 2016. "Incentivizing Schooling for Learning: Evidence on the Impact of Alternative Targeting Approaches," Journal of Human Resources, University of Wisconsin Press, vol. 51(2), pages 461-499.
    24. Ricardo Hausmann, 2008. "The Other Hand: High Bandwidth Development Policy," CID Working Papers 179, Center for International Development at Harvard University.
    25. Lant Pritchett, 2002. "It pays to be ignorant: A simple political economy of rigorous program evaluation," Journal of Economic Policy Reform, Taylor & Francis Journals, vol. 5(4), pages 251-269.
    26. repec:unu:wpaper:wp2012-63 is not listed on IDEAS
    27. Filmer, Deon & Hammer, Jeffrey S & Pritchett, Lant H, 2000. "Weak Links in the Chain: A Diagnosis of Health Policy in Poor Countries," World Bank Research Observer, World Bank Group, vol. 15(2), pages 199-224, August.
    28. Alaka Holla & Michael Kremer, 2009. "Lessons from Randomized Evaluations in Education and Health," Working Papers 158, Center for Global Development.
    29. Joshua Angrist & Ivan Fernandez-Val, 2010. "ExtrapoLATE-ing: External Validity and Overidentification in the LATE Framework," NBER Working Papers 16566, National Bureau of Economic Research, Inc.
    30. Deon Filmer & Jeffrey S. Hammer & Lant H. Pritchett, 2002. "Weak Links in the Chain II: A Prescription for Health Policy in Poor Countries," World Bank Research Observer, World Bank Group, vol. 17(1), pages 47-66.
    31. Miguel Szekely, 2011. "Toward Results-Based Social Policy Design and Implementation - Working Paper 249," Working Papers 249, Center for Global Development.
    32. Ostrom, Elinor, 1996. "Crossing the great divide: Coproduction, synergy, and development," World Development, Elsevier, vol. 24(6), pages 1073-1087, June.
    33. Devarajan, Shantayanan & Squire, Lyn & Suthiwart-Narueput, Sethaput, 1997. "Beyond Rate of Return: Reorienting Project Appraisal," World Bank Research Observer, World Bank Group, vol. 12(1), pages 35-46, February.
    34. Robert Jensen, 2010. "The (Perceived) Returns to Education and the Demand for Schooling," The Quarterly Journal of Economics, Oxford University Press, vol. 125(2), pages 515-548.
    35. Rodrik, Dani, 2008. "The New Development Economics: We Shall Experiment, but How Shall We Learn?," Working Paper Series rwp08-055, Harvard University, John F. Kennedy School of Government.
    36. Lant Pritchett & Michael Woolcock & Matt Andrews, 2010. "Capability Traps? The Mechanisms of Persistent Implementation Failure - Working Paper 234," Working Papers 234, Center for Global Development.
    37. Aart Kraay & Peter Murrell, 2016. "Misunderestimating Corruption," The Review of Economics and Statistics, MIT Press, vol. 98(3), pages 455-466, July.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Florent Bédécarrats & Isabelle Guérin & François Roubaud, 2019. "All that Glitters is not Gold. The Political Economy of Randomized Evaluations in Development," Development and Change, International Institute of Social Studies, vol. 50(3), pages 735-762, May.
    2. Arkedis, Jean & Creighton, Jessica & Dixit, Akshay & Fung, Archon & Kosack, Stephen & Levy, Dan & Tolmie, Courtney, 2021. "Can transparency and accountability programs improve health? Experimental evidence from Indonesia and Tanzania," World Development, Elsevier, vol. 142(C).
    3. Hammer, Jeffrey & Spears, Dean, 2013. "Village sanitation and children's human capital : evidence from a randomized experiment by the Maharashtra government," Policy Research Working Paper Series 6580, The World Bank.
    4. Florent Bédécarrats & Isabelle Guérin & François Roubaud, 2017. "L'étalon-or des évaluations randomisées : économie politique des expérimentations aléatoires dans le domaine du développement," [The gold standard of randomized evaluations: the political economy of randomized experiments in development] Working Paper 753120cd-506f-4c5f-80ed-7, Agence française de développement.
    5. Woolcock, Michael, 2013. "Using Case Studies to Explore the External Validity of 'Complex' Development Interventions," Working Paper Series rwp13-048, Harvard University, John F. Kennedy School of Government.
    6. repec:pri:cheawb:tscjeff2013%20paper is not listed on IDEAS
    7. Holvoet, Nathalie & Van Esbroeck, Dirk & Inberg, Liesbeth & Popelier, Lisa & Peeters, Bob & Verhofstadt, Ellen, 2018. "To evaluate or not: Evaluability study of 40 interventions of Belgian development cooperation," Evaluation and Program Planning, Elsevier, vol. 67(C), pages 189-199.
    8. Nadel, Sara & Pritchett, Lant, 2016. "Searching for the Devil in the Details: Learning about Development Program Design," Working Paper Series rwp16-041, Harvard University, John F. Kennedy School of Government.
    9. Cameron, Lisa & Olivia, Susan & Shah, Manisha, 2019. "Scaling up sanitation: Evidence from an RCT in Indonesia," Journal of Development Economics, Elsevier, vol. 138(C), pages 1-16.
    10. Jorge Cornick & Alberto Trejos, 2016. "Building Public Capabilities for Productive Development Policies: Costa Rican Case Studies," IDB Publications (Working Papers) 96956, Inter-American Development Bank.
    11. Cornick, Jorge & Trejos, Alberto, 2018. "Building Public Capabilities for Productive Development Policies: Costa Rican Case Studies," IDB Publications (Working Papers) 8017, Inter-American Development Bank.
    12. Michael Clemens & Gabriel Demombynes, 2013. "The New Transparency in Development Economics: Lessons from the Millennium Villages Controversy," Working Papers 342, Center for Global Development.
    13. Janssen, Matthijs J., 2019. "What bangs for your buck? Assessing the design and impact of Dutch transformative policy," Technological Forecasting and Social Change, Elsevier, vol. 138(C), pages 78-94.
    14. Matthijs Janssen, 2016. "What bangs for your bucks? Assessing the design and impact of transformative policy," Innovation Studies Utrecht (ISU) working paper series 16-05, Utrecht University, Department of Innovation Studies, revised Dec 2016.
    15. Arkedis, Jean & Creighton, Jessica & Dixit, Akshay & Fung, Archon & Kosack, Stephen & Levy, Dan & Tolmie, Courtney, 2019. "Can Transparency and Accountability Programs Improve Health? Experimental Evidence from Indonesia and Tanzania," Working Paper Series rwp19-020, Harvard University, John F. Kennedy School of Government.
    16. Lant Pritchett & Justin Sandefur, 2013. "Context Matters for Size: Why External Validity Claims and Development Practice Don't Mix-Working Paper 336," Working Papers 336, Center for Global Development.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Lant Pritchett & Salimah Samji & Jeffrey Hammer, 2012. "It’s All About MeE: Using Structured Experiential Learning (‘e’) to Crawl the Design Space," CID Working Papers 249, Center for International Development at Harvard University.
    2. repec:pri:rpdevs:hammer_its_all_about_me is not listed on IDEAS
    3. Andrews, Matt & Pritchett, Lant & Woolcock, Michael, 2017. "Building State Capability: Evidence, Analysis, Action," OUP Catalogue, Oxford University Press, number 9780198747482.
    4. William Easterly, 2009. "Can the West Save Africa?," Journal of Economic Literature, American Economic Association, vol. 47(2), pages 373-447, June.
    5. Andrews, Matt & Pritchett, Lant & Woolcock, Michael, 2013. "Escaping Capability Traps Through Problem Driven Iterative Adaptation (PDIA)," World Development, Elsevier, vol. 51(C), pages 234-244.
    6. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," World Bank Research Observer, World Bank Group, vol. 33(1), pages 34-64.
    7. Andrews, Matt & Pritchett, Lant & Woolcock, Michael, 2015. "The Challenge of Building (Real) State Capability," Working Paper Series 15-074, Harvard University, John F. Kennedy School of Government.
    8. Larson, Greg & Ajak, Peter Biar & Pritchett, Lant, 2013. "South Sudan's Capability Trap: Building a State with Disruptive Innovation," Working Paper Series rwp13-041, Harvard University, John F. Kennedy School of Government.
    9. Lant Pritchett & Michael Woolcock & Matt Andrews, 2013. "Looking Like a State: Techniques of Persistent Failure in State Capability for Implementation," Journal of Development Studies, Taylor & Francis Journals, vol. 49(1), pages 1-18, January.
    10. Andrews, Matt & Pritchett, Lant & Woolcock, Michael, 2015. "Doing Problem Driven Work," Working Paper Series 15-073, Harvard University, John F. Kennedy School of Government.
    11. Peters, Jörg & Langbein, Jörg & Roberts, Gareth, 2016. "Policy evaluation, randomized controlled trials, and external validity—A systematic review," Economics Letters, Elsevier, vol. 147(C), pages 51-54.
    12. Temple, Jonathan R.W., 2010. "Aid and Conditionality," Handbook of Development Economics, in: Dani Rodrik & Mark Rosenzweig (ed.), Handbook of Development Economics, edition 1, volume 5, chapter 0, pages 4415-4523, Elsevier.
    13. Nadel, Sara & Pritchett, Lant, 2016. "Searching for the Devil in the Details: Learning about Development Program Design," Working Paper Series rwp16-041, Harvard University, John F. Kennedy School of Government.
    14. Matt Andrews & Lant Pritchett & Michael Woolcock, 2016. "The Big Stuck in State Capability for Policy Implementation," CID Working Papers 318, Center for International Development at Harvard University.
    15. Abhijit V. Banerjee & Esther Duflo, 2009. "The Experimental Approach to Development Economics," Annual Review of Economics, Annual Reviews, vol. 1(1), pages 151-178, May.
    16. Karthik Muralidharan & Mauricio Romero & Kaspar Wüthrich, 2019. "Factorial Designs, Model Selection, and (Incorrect) Inference in Randomized Experiments," NBER Working Papers 26562, National Bureau of Economic Research, Inc.
    17. Rebecca Dizon-Ross & Pascaline Dupas & Jonathan Robinson, 2015. "Governance and the Effectiveness of Public Health Subsidies," NBER Working Papers 21324, National Bureau of Economic Research, Inc.
    18. Lant Pritchett & Justin Sandefur, 2013. "Context Matters for Size: Why External Validity Claims and Development Practice Don't Mix-Working Paper 336," Working Papers 336, Center for Global Development.
    19. Alejandro J. Ganimian & Richard J. Murnane, 2014. "Improving Educational Outcomes in Developing Countries: Lessons from Rigorous Impact Evaluations," NBER Working Papers 20284, National Bureau of Economic Research, Inc.
    20. Yanagihara, Toru, 2016. "User-Centered Approach to Service Quality and Outcome: Rationales, Accomplishments and Challenges," Working Papers 123, JICA Research Institute.

    More about this item

    JEL classification:

    • H43 - Public Economics - - Publicly Provided Goods - - - Project Evaluation; Social Discount Rate
    • L30 - Industrial Organization - - Nonprofit Organizations and Public Enterprise - - - General
    • O20 - Economic Development, Innovation, Technological Change, and Growth - - Development Planning and Policy - - - General


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:ecl:harjfk:rwp13-012. See general information about how to correct material in RePEc.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact the provider. General contact details of provider: https://edirc.repec.org/data/ksharus.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis. RePEc uses bibliographic data supplied by the respective publishers.