Ten steps to making evaluation matter
Abstract
This paper proposes ten steps to make evaluations matter. The steps combine usual recommended practices, such as developing program theory and implementing rigorous evaluation designs, with a stronger focus on less conventional steps, including developing learning frameworks, exploring pathways of evaluation influence, and assessing spread and sustainability. Considering these steps can lead to a focused dialogue between program planners and evaluators and can result in more rigorously planned programs. The ten steps can also help in developing and implementing evaluation designs that have greater potential for policy and programmatic influence. The paper argues that there is a need to go beyond a formulaic approach to program evaluation design, which often fails to address the complexity of programs; the complexity of a program needs to inform the design of its evaluation. The ten steps described in this paper are heavily informed by a Realist approach to evaluation, which attempts to understand what it is about a program that makes it work.
Bibliographic Info
Article provided by Elsevier in its journal Evaluation and Program Planning.
Volume (Year): 34 (2011)
Issue (Month): 2 (May)
Web page: http://www.elsevier.com/locate/evalprogplan
Keywords: Evaluation design; Program theory; Pathways of influence; Learning frameworks; Design; Learning; Spread