
Teaching programme evaluation: A problem of knowledge

Author

  • Arbour, Ghislain

Abstract

This article conceptualises the problem of selecting teaching content that supports the practice of programme evaluation. Knowledge for evaluation practice falls within one of three categories, defined by the different roles they play in supporting practice. First, core knowledge relates to the defining activity of evaluation practice, i.e., it informs the intellectual task of determining a programme’s value. Second, accessory knowledge informs activities that support and facilitate the concretisation of that defining activity in a delivery context (e.g., stakeholder participation, evaluation use, project management). Third and finally, supplementary knowledge informs activities that may, on occasion, occur during evaluation practice but do not relate to the determination of value, either inherently or in a support role. The selection of knowledge for the teaching of evaluation must match the knowledge needed for effective evaluation practice across these three categories. The specifics of these needs ultimately depend on the characteristics of a given practice. The selection of content for the teaching of evaluation should ideally address these specific needs with the best knowledge available, regardless of its disciplinary origins.

Suggested Citation

  • Arbour, Ghislain, 2020. "Teaching programme evaluation: A problem of knowledge," Evaluation and Program Planning, Elsevier, vol. 83(C).
  • Handle: RePEc:eee:epplan:v:83:y:2020:i:c:s0149718920301762
    DOI: 10.1016/j.evalprogplan.2020.101872

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0149718920301762
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.evalprogplan.2020.101872?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Gullickson, Amy M., 2020. "The whole elephant: Defining evaluation," Evaluation and Program Planning, Elsevier, vol. 79(C).
    2. Patton, Michael Quinn & Horton, Douglas, 2008. "Utilization-focused evaluation for agricultural innovation," ILAC Briefs 52533, Institutional Learning and Change (ILAC) Initiative.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Downes, Jenni & Gullickson, Amy M., 2022. "What does it mean for an evaluation to be ‘valid’? A critical synthesis of evaluation literature," Evaluation and Program Planning, Elsevier, vol. 91(C).
    2. Zhi Yang & Susan Whatman, 2025. "Development and validation of standards for evaluating the quality of qualitative research on Olympics breakdance," Humanities and Social Sciences Communications, Palgrave Macmillan, vol. 12(1), pages 1-14, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Abdul-Manan, Amir F.N. & Baharuddin, Azizan & Chang, Lee Wei, 2015. "Application of theory-based evaluation for the critical analysis of national biofuel policy: A case study in Malaysia," Evaluation and Program Planning, Elsevier, vol. 52(C), pages 39-49.
    2. Romero-Gutierrez, Miguel & Jimenez-Liso, M. Rut & Martinez-Chico, Maria, 2016. "SWOT analysis to evaluate the programme of a joint online/onsite master's degree in environmental education through the students’ perceptions," Evaluation and Program Planning, Elsevier, vol. 54(C), pages 41-49.
    3. Ofek, Yuval, 2017. "Evaluating social exclusion interventions in university-community partnerships," Evaluation and Program Planning, Elsevier, vol. 60(C), pages 46-55.
    4. Martens, Krystin S.R., 2018. "How program evaluators use and learn to use rubrics to make evaluative reasoning explicit," Evaluation and Program Planning, Elsevier, vol. 69(C), pages 25-32.
    5. Telch, Fabian, 2025. "Understanding how national development planning (NDP) shapes public institutions and procedures for development: the case of Colombia," World Development Perspectives, Elsevier, vol. 39(C).
    6. Bourgeois, Isabelle & Whynot, Jane, 2018. "The influence of evaluation recommendations on instrumental and conceptual uses: A preliminary analysis," Evaluation and Program Planning, Elsevier, vol. 68(C), pages 13-18.
    7. Lifshitz, Chen Chana, 2017. "Fostering employability among youth at-risk in a multi-cultural context: Insights from a pilot intervention program," Children and Youth Services Review, Elsevier, vol. 76(C), pages 20-34.
    8. LaVelle, John M. & Davies, Randall, 2021. "Seeking consensus: Defining foundational concepts for a graduate level introductory program evaluation course," Evaluation and Program Planning, Elsevier, vol. 88(C).
    9. Melz, Heidi & Fromknecht, Anne E. & Masters, Loren D. & Richards, Tammy & Sun, Jing, 2023. "Incorporating multiple data sources to assess changes in organizational capacity in child welfare systems," Evaluation and Program Planning, Elsevier, vol. 97(C).
    10. Healy, John & Hughes, Jeffrey & Donnelly-Cox, Gemma & Shantz, Amanda, 2024. "A long and winding road: The hard graft of scaling social change in complex systems," Journal of Business Venturing Insights, Elsevier, vol. 21(C).
    11. Wingate, Lori A. & Smith, Nick L. & Perk, Emma, 2018. "The project vita: A dynamic knowledge management tool," Evaluation and Program Planning, Elsevier, vol. 71(C), pages 22-27.
    12. Metta, Matteo & Ciliberti, Stefano & Obi, Chinedu & Bartolini, Fabio & Klerkx, Laurens & Brunori, Gianluca, 2022. "An integrated socio-cyber-physical system framework to assess responsible digitalisation in agriculture: A first application with Living Labs in Europe," Agricultural Systems, Elsevier, vol. 203(C).
    13. Mikko V. Pohjola & Pasi Pohjola & Marko Tainio & Jouni T. Tuomisto, 2013. "Perspectives to Performance of Environment and Health Assessments and Models—From Outputs to Outcomes?," IJERPH, MDPI, vol. 10(7), pages 1-22, June.
    14. Peterson, Christina & Skolits, Gary, 2020. "Value for money: A utilization-focused approach to extending the foundation and contribution of economic evaluation," Evaluation and Program Planning, Elsevier, vol. 80(C).
    15. Sharma, Bhanu & Robinson, Jackie & Arhen, Benjamin B. & Timmons, Brian W. & Heal, Bryan & Warner, Marika, 2025. "Evaluating sport-for-development outcome measures used in a living lab setting: Process, improvements, and insights," Evaluation and Program Planning, Elsevier, vol. 112(C).
    16. Kalpazidou Schmidt, Evanthia & Graversen, Ebbe Krogh, 2020. "Developing a conceptual evaluation framework for gender equality interventions in research and innovation," Evaluation and Program Planning, Elsevier, vol. 79(C).
    17. Hudon, Catherine & Chouinard, Maud-Christine & Brousselle, Astrid & Bisson, Mathieu & Danish, Alya, 2020. "Evaluating complex interventions in real context: Logic analysis of a case management program for frequent users of healthcare services," Evaluation and Program Planning, Elsevier, vol. 79(C).
    18. Jabeen, Sumera, 2018. "Unintended outcomes evaluation approach: A plausible way to evaluate unintended outcomes of social development programmes," Evaluation and Program Planning, Elsevier, vol. 68(C), pages 262-274.
    19. Jan Činčera & Grzegorz Mikusiński & Bohuslav Binka & Luis Calafate & Cristina Calheiros & Alexandra Cardoso & Marcus Hedblom & Michael Jones & Alex Koutsouris & Clara Vasconcelos & Katarzyna Iwińska, 2019. "Managing Diversity: The Challenges of Inter-University Cooperation in Sustainability Education," Sustainability, MDPI, vol. 11(20), pages 1-16, October.
    20. Daigneault, Pierre-Marc, 2014. "Taking stock of four decades of quantitative research on stakeholder participation and evaluation use: A systematic map," Evaluation and Program Planning, Elsevier, vol. 45(C), pages 171-181.



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.