Printed from https://ideas.repec.org/a/eee/wdevel/v99y2017icp173-185.html

How to Find out What’s Really Going On: Understanding Impact through Participatory Process Evaluation

Author

Listed:
  • Cornwall, Andrea
  • Aghajanian, Alia

Abstract

This article considers the contribution participatory process evaluation can make to impact assessment, drawing on a case study of an evaluation of how a Kenyan nutrition education program brought about change in the nutritional status of children and in children's and parents' understanding and practices. Using Bhola's three dimensions of impact—"impact by design", "impact by interaction", and "impact by emergence"—the article focuses not just on what changes as an intended result of an intervention, but on how change happens and how positive changes can be sustained. Its principal focus is methodological: it describes in some detail the development of a sequence of participatory visualization and discussion methods and their application with a range of stakeholders, from program staff in the headquarters of the implementing agency, to local government officials, front-line program workers, and beneficiaries. It suggests that a participatory approach can enable researchers and evaluators to gain a fuller picture of incidental and unintended outcomes arising from interventions, making participatory process evaluation a valuable complement to other impact assessment methodologies.

Suggested Citation

  • Cornwall, Andrea & Aghajanian, Alia, 2017. "How to Find out What’s Really Going On: Understanding Impact through Participatory Process Evaluation," World Development, Elsevier, vol. 99(C), pages 173-185.
  • Handle: RePEc:eee:wdevel:v:99:y:2017:i:c:p:173-185
    DOI: 10.1016/j.worlddev.2017.07.010

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0305750X15304435
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.worlddev.2017.07.010?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Angus Deaton, 2009. "Instruments of development: Randomization in the tropics, and the search for the elusive keys to economic development," Working Papers 1128, Princeton University, Woodrow Wilson School of Public and International Affairs, Center for Health and Wellbeing.
    2. Emanuela Galasso & Nithin Umapathi, 2009. "Improving nutritional status through behavioural change: lessons from Madagascar," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 1(1), pages 60-85.
    3. Michael Woolcock, 2009. "Toward a plurality of methods in project evaluation: a contextualised approach to understanding impact trajectories and efficacy," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 1(1), pages 1-14.
    4. Pascaline Dupas, 2011. "Do Teenagers Respond to HIV Risk Information? Evidence from a Field Experiment in Kenya," American Economic Journal: Applied Economics, American Economic Association, vol. 3(1), pages 1-34, January.
    5. Sebastian Galiani & Paul Gertler & Ernesto Schargrodsky, 2005. "Water for Life: The Impact of the Privatization of Water Services on Child Mortality," Journal of Political Economy, University of Chicago Press, vol. 113(1), pages 83-120, February.
    6. Howard White & Edoardo Masset, 2007. "Assessing interventions to improve child nutrition: a theory-based impact evaluation of the Bangladesh Integrated Nutrition Project," Journal of International Development, John Wiley & Sons, Ltd., vol. 19(5), pages 627-652.
    7. Abhijit V. Banerjee & Esther Duflo, 2009. "The Experimental Approach to Development Economics," Annual Review of Economics, Annual Reviews, vol. 1(1), pages 151-178, May.
    8. Howard White, 2009. "Theory-based impact evaluation: principles and practice," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 1(3), pages 271-284.
    9. Elizabeth Harrison, 2015. "Anthropology and impact evaluation: a critical commentary," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 7(2), pages 146-159, June.
    10. Linda Mayoux & Robert Chambers, 2005. "Reversing the paradigm: quantification, participatory methods and pro-poor impact assessment," Journal of International Development, John Wiley & Sons, Ltd., vol. 17(2), pages 271-298.
    11. Ravallion, Martin, 2009. "Should the Randomistas Rule?," The Economists' Voice, De Gruyter, vol. 6(2), pages 1-5, February.
    12. Carvalho, S. & White, H., 1997. "Combining the Quantitative and Qualitative Approaches to Poverty Measurement and Analysis: The Practice and the Potential," Papers 366, World Bank - Technical Papers.
    13. White, Howard, 2009. "Theory-Based Impact Evaluation," 3ie Publications 2009-3, International Initiative for Impact Evaluation (3ie).
    14. Vaessen, Jos, 2010. "Challenges in impact evaluation of development interventions: opportunities and limitations for randomized experiments," IOB Discussion Papers 2010.01, Universiteit Antwerpen, Institute of Development Policy (IOB).
    15. Howard White, 2011. "Achieving high-quality impact evaluation design through mixed methods: the case of infrastructure," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 3(1), pages 131-144.
    16. Paul J. Gertler & Sebastian Martinez & Patrick Premand & Laura B. Rawlings & Christel M. J. Vermeersch. "Impact Evaluation in Practice, First Edition [La evaluación de impacto en la práctica]," World Bank Publications, The World Bank, number 2550, September.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Purkus, Alexandra & Lüdtke, Jan, 2020. "A systemic evaluation framework for a multi-actor, forest-based bioeconomy governance process: The German Charter for Wood 2.0 as a case study," Forest Policy and Economics, Elsevier, vol. 113(C).
    2. De Marinis, Pietro & Sali, Guido, 2020. "Participatory analytic hierarchy process for resource allocation in agricultural development projects," Evaluation and Program Planning, Elsevier, vol. 80(C).
    3. Alessia Spada & Mariantonietta Fiore & Umberto Monarca & Nicola Faccilongo, 2019. "R&D Expenditure for New Technology in Livestock Farming: Impact on GHG Reduction in Developing Countries," Sustainability, MDPI, vol. 11(24), pages 1-12, December.
    4. Dunne, Máiréad & Humphreys, Sara, 2022. "The edu-workscape: Re-conceptualizing the relationship between work and education in rural children’s lives in Sub-Saharan Africa," World Development Perspectives, Elsevier, vol. 27(C).
    5. Mieke Snijder & Rosie Steege & Michelle Callander & Michel Wahome & M. Feisal Rahman & Marina Apgar & Sally Theobald & Louise J. Bracken & Laura Dean & Bintu Mansaray & Prasanna Saligram & Surekha Gar, 2023. "How are Research for Development Programmes Implementing and Evaluating Equitable Partnerships to Address Power Asymmetries?," The European Journal of Development Research, Palgrave Macmillan;European Association of Development Research and Training Institutes (EADI), vol. 35(2), pages 351-379, April.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Gwenolé Le Velly & Céline Dutilly, 2016. "Evaluating Payments for Environmental Services: Methodological Challenges," PLOS ONE, Public Library of Science, vol. 11(2), pages 1-20, February.
    2. Guido W. Imbens, 2010. "Better LATE Than Nothing: Some Comments on Deaton (2009) and Heckman and Urzua (2009)," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 399-423, June.
    3. William Easterly, 2009. "Can the West Save Africa?," Journal of Economic Literature, American Economic Association, vol. 47(2), pages 373-447, June.
    4. Marian Meller & Stephan Litschig, 2014. "Saving Lives: Evidence from a Conditional Food Supplementation Program," Journal of Human Resources, University of Wisconsin Press, vol. 49(4), pages 1014-1052.
    5. Howard White, 2013. "An introduction to the use of randomised control trials to evaluate development interventions," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 5(1), pages 30-49, March.
    6. Sara Nadel & Lant Pritchett, 2016. "Searching for the Devil in the Details: Learning about Development Program Design," Working Papers 434, Center for Global Development.
    7. Henrik Hansen & Ole Winckler Andersen & Howard White, 2011. "Impact evaluation of infrastructure interventions," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 3(1), pages 1-8.
    8. Olofsgård, Anders, 2012. "The Politics of Aid Effectiveness: Why Better Tools can Make for Worse Outcomes," SITE Working Paper Series 16, Stockholm School of Economics, Stockholm Institute of Transition Economics.
    9. Bamberger, Michael & Tarsilla, Michele & Hesse-Biber, Sharlene, 2016. "Why so many “rigorous” evaluations fail to identify unintended consequences of development programs: How mixed methods can contribute," Evaluation and Program Planning, Elsevier, vol. 55(C), pages 155-162.
    10. Sophie Webber, 2015. "Randomising Development: Geography, Economics and the Search for Scientific Rigour," Tijdschrift voor Economische en Sociale Geografie, Royal Dutch Geographical Society KNAG, vol. 106(1), pages 36-52, February.
    11. Johnson, Nancy L. & Atherstone, Christine & Grace, Delia, 2015. "The potential of farm-level technologies and practices to contribute to reducing consumer exposure to aflatoxins: A theory of change analysis:," IFPRI discussion papers 1452, International Food Policy Research Institute (IFPRI).
    12. Gisselquist, Rachel & Niño-Zarazúa, Miguel, 2013. "What can experiments tell us about how to improve governance?," MPRA Paper 49300, University Library of Munich, Germany.
    13. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," The World Bank Research Observer, World Bank, vol. 33(1), pages 34-64.
    14. Mesnard, Alice & Vera-Hernández, Marcos & Fitzsimons, Emla & Malde, Bansi, 2012. "Household Responses to Information on Child Nutrition: Experimental Evidence from Malawi," CEPR Discussion Papers 8915, C.E.P.R. Discussion Papers.
    15. Sara Rafael Almeida & Joana Sousa Lourenco & Francois J. Dessart & Emanuele Ciriolo, 2017. "Insights from behavioural sciences to prevent and combat violence against women. Literature review," JRC Research Reports JRC103975, Joint Research Centre.
    16. Quentin Ssossé & Johanna Wagner & Carina Hopper, 2021. "Assessing the Impact of ESD: Methods, Challenges, Results," Sustainability, MDPI, vol. 13(5), pages 1-26, March.
    17. Joana Silva Afonso, 2020. "Impact evaluation, social performance assessment and standardisation: reflections from microfinance evaluations in Pakistan and Zimbabwe," Working Papers in Economics & Finance 2020-14, University of Portsmouth, Portsmouth Business School, Economics and Finance Subject Group.
    18. Judith Favereau & Nicolas Brisset, 2016. "Randomization of What? Moving from Libertarian to "Democratic Paternalism"," GREDEG Working Papers 2016-34, Groupe de REcherche en Droit, Economie, Gestion (GREDEG CNRS), Université Côte d'Azur, France.
    19. Malek, Mohammad Abdul & Saha, Ratnajit & Chowdhury, Priyanka & Khan, Tahsina & Mohammad, Ikhtiar, 2015. "Water quality information, WATSAN-agriculture hygiene messages and water testing with school students: Experimental evidence for behavioral changes in Bangladesh," 2015 Conference, August 9-14, 2015, Milan, Italy 211681, International Association of Agricultural Economists.
    20. Calina-Ana Butiu, 2017. "Evidence based practice in academic dropout policy. The pro-integra model," Journal of Community Positive Practices, Catalactica NGO, issue 1, pages 3-12.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:wdevel:v:99:y:2017:i:c:p:173-185. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/worlddev.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.