IDEAS home Printed from https://ideas.repec.org/a/eee/epplan/v59y2016icp109-118.html

Interfacing theories of program with theories of evaluation for advancing evaluation practice: Reductionism, systems thinking, and pragmatic synthesis

Author

Listed:
  • Chen, Huey T.

Abstract

Theories of program and theories of evaluation form the foundation of program evaluation theories. Theories of program reflect assumptions on how to conceptualize an intervention program for evaluation purposes, while theories of evaluation reflect assumptions on how to design useful evaluations. These two types of theories are related, but often discussed separately. This paper uses three theoretical perspectives (reductionism, systems thinking, and pragmatic synthesis) to interface them and discusses the implications for evaluation practice. Reductionism proposes that an intervention program can be broken into crucial components for rigorous analysis; systems thinking views an intervention program as dynamic and complex, requiring a holistic examination. In spite of their contributions, reductionism and systems thinking represent the extreme ends of a theoretical spectrum; many real-world programs, however, may fall in the middle. Pragmatic synthesis is being developed to serve these moderate-complexity programs. These three theoretical perspectives have their own strengths and challenges. Knowledge of these three perspectives and their evaluation implications can better guide the design of fruitful evaluations, improve the quality of evaluation practice, inform potential areas for developing cutting-edge evaluation approaches, and contribute to advancing program evaluation toward a mature applied science.

Suggested Citation

  • Chen, Huey T., 2016. "Interfacing theories of program with theories of evaluation for advancing evaluation practice: Reductionism, systems thinking, and pragmatic synthesis," Evaluation and Program Planning, Elsevier, vol. 59(C), pages 109-118.
  • Handle: RePEc:eee:epplan:v:59:y:2016:i:c:p:109-118
    DOI: 10.1016/j.evalprogplan.2016.05.012

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0149718916301008
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.evalprogplan.2016.05.012?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Glasgow, R.E. & Lichtenstein, E. & Marcus, A.C., 2003. "Why Don't We See More Translation of Health Promotion Research to Practice? Rethinking the Efficacy-to-Effectiveness Transition," American Journal of Public Health, American Public Health Association, vol. 93(8), pages 1261-1267.
    2. Chen, Huey T. & Yip, Fuyuen & Lavonas, Eric J. & Iqbal, Shahed & Turner, Nannette & Cobb, Bobby & Garbe, Paul, 2014. "Using the exhibited generalization approach to evaluate a carbon monoxide alarm ordinance," Evaluation and Program Planning, Elsevier, vol. 47(C), pages 35-44.
    3. Mingers, John & White, Leroy, 2010. "A review of the recent contribution of systems thinking to operational research and management science," European Journal of Operational Research, Elsevier, vol. 207(3), pages 1147-1161, December.
    4. Midgley, G., 2006. "Systemic intervention for public health," American Journal of Public Health, American Public Health Association, vol. 96(3), pages 466-472.
    5. Chen, Huey T., 2010. "The bottom-up approach to integrative validity: A new perspective for program evaluation," Evaluation and Program Planning, Elsevier, vol. 33(3), pages 205-214, August.
    6. Cabrera, Derek & Colosi, Laura & Lobdell, Claire, 2008. "Systems thinking," Evaluation and Program Planning, Elsevier, vol. 31(3), pages 299-310, August.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.

    Cited by:

    1. Ann Svensson, 2020. "How to Evaluate Collaboration within Research and Innovation," Journal of International Business Research and Marketing, Inovatus Services Ltd., vol. 5(3), pages 21-25, March.
    2. Omid Ali Kharazmi & Amirali Kharazmi, 2022. "A pathological analysis of challenges related to systems thinking studies in Iran," Systems Research and Behavioral Science, Wiley Blackwell, vol. 39(2), pages 241-257, March.
    3. Ga‐Young So, 2024. "How does diversity affect the effectiveness of capacity building training? Evidence from the Republic of Korea," Development Policy Review, Overseas Development Institute, vol. 42(3), May.
    4. Goodier, Sarah & Field, Carren & Goodman, Suki, 2018. "The need for theory evaluation in global citizenship programmes: The case of the GCSA programme," Evaluation and Program Planning, Elsevier, vol. 66(C), pages 7-19.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Chen, Huey T. & Yip, Fuyuen & Lavonas, Eric J. & Iqbal, Shahed & Turner, Nannette & Cobb, Bobby & Garbe, Paul, 2014. "Using the exhibited generalization approach to evaluate a carbon monoxide alarm ordinance," Evaluation and Program Planning, Elsevier, vol. 47(C), pages 35-44.
    2. Céline Bérard & L.M., Cloutier & Luc Cassivi, 2017. "The effects of using system dynamics-based decision support models: testing policy-makers’ boundaries in a complex situation," Post-Print hal-02128255, HAL.
    3. Sambou, Césarine & Decroix, Charlotte & Martin-Fernandez, Judith & Cambon, Linda & Alla, François, 2025. "Uses of the viable validity concept: A systematic scoping review," Evaluation and Program Planning, Elsevier, vol. 108(C).
    4. Céline Bérard & Martin Cloutier L. & Luc Cassivi, 2017. "The effects of using system dynamics-based decision support models: testing policy-makers’ boundaries in a complex situation," Post-Print halshs-01666605, HAL.
    5. Gerald Midgley & Erik Lindhult, 2021. "A systems perspective on systemic innovation," Systems Research and Behavioral Science, Wiley Blackwell, vol. 38(5), pages 635-670, October.
    6. Karen Setty & Ryan Cronk & Shannan George & Darcy Anderson & Għanja O’Flaherty & Jamie Bartram, 2019. "Adapting Translational Research Methods to Water, Sanitation, and Hygiene," IJERPH, MDPI, vol. 16(20), pages 1-31, October.
    7. Diego Damásio Lima & Daniel Pacheco Lacerda & Miguel Afonso Sellitto, 2017. "Systemic Analysis of the Brazilian Production Chain of Semiconductors: Graphic Representation and Leverage Points," Systemic Practice and Action Research, Springer, vol. 30(3), pages 295-316, June.
    8. Gonot-Schoupinsky, Freda N. & Garip, Gulcan, 2019. "A flexible framework for planning and evaluating early-stage health interventions: FRAME-IT," Evaluation and Program Planning, Elsevier, vol. 77(C).
    9. Abuabara, Leila & Paucar-Caceres, Alberto, 2021. "Surveying applications of Strategic Options Development and Analysis (SODA) from 1989 to 2018," European Journal of Operational Research, Elsevier, vol. 292(3), pages 1051-1065.
    10. Emmanuel Njeuhmeli & Melissa Schnure & Andrea Vazzano & Elizabeth Gold & Peter Stegman & Katharine Kripke & Michel Tchuenche & Lori Bollinger & Steven Forsythe & Catherine Hankins, 2019. "Using mathematical modeling to inform health policy: A case study from voluntary medical male circumcision scale-up in eastern and southern Africa and proposed framework for success," PLOS ONE, Public Library of Science, vol. 14(3), pages 1-15, March.
    11. Finch, Caroline F & Day, Lesley & Donaldson, Alex & Segal, Leonie & Harrison, James E, 2009. "Determining policy-relevant formats for the presentation of falls research evidence," Health Policy, Elsevier, vol. 93(2-3), pages 207-213, December.
    12. Cerezo, M. Angeles & Dasi, Carmen & Ruiz, Juan Carlos, 2013. "Supporting parenting of infants: Evaluating outcomes for parents and children in a community-based program," Evaluation and Program Planning, Elsevier, vol. 37(C), pages 12-20.
    13. Smith, Chris M. & Shaw, Duncan, 2019. "The characteristics of problem structuring methods: A literature review," European Journal of Operational Research, Elsevier, vol. 274(2), pages 403-416.
    14. Ning Zhang & Aziz Kemal Konyalıoğlu & Huabo Duan & Haibo Feng & Huanyu Li, 2024. "The impact of innovative technologies in construction activities on concrete debris recycling in China: a system dynamics-based analysis," Environment, Development and Sustainability: A Multidisciplinary Approach to the Theory and Practice of Sustainable Development, Springer, vol. 26(6), pages 14039-14064, June.
    15. Jacob Høgaard Christensen, 2022. "Enhancing mixed methods pragmatism with systems theory: Perspectives from educational research," Systems Research and Behavioral Science, Wiley Blackwell, vol. 39(1), pages 104-115, January.
    16. Saria Hassan & Alexis Cooke & Haneefa Saleem & Dorothy Mushi & Jessie Mbwambo & Barrot H. Lambdin, 2019. "Evaluating the Integrated Methadone and Anti-Retroviral Therapy Strategy in Tanzania Using the RE-AIM Framework," IJERPH, MDPI, vol. 16(5), pages 1-15, February.
    17. Melinda Craike & Bojana Klepac & Amy Mowle & Therese Riley, 2023. "Theory of systems change: An initial, middle-range theory of public health research impact," Research Evaluation, Oxford University Press, vol. 32(3), pages 603-621.
    18. Masoud Khakdaman & Wout Dullaert & Dirk Inghels & Marieke van Keeken & Pascal Wissink, 2024. "A System Dynamics Supply Chain Analysis for the Sustainability Transition of European Rolled Aluminum Products," Sustainability, MDPI, vol. 16(20), pages 1-27, October.
    19. Mingers, John, 2015. "Helping business schools engage with real problems: The contribution of critical realism and systems thinking," European Journal of Operational Research, Elsevier, vol. 242(1), pages 316-331.
    20. Archibald, Thomas, 2015. "“They Just Know”: The epistemological politics of “evidence-based” non-formal education," Evaluation and Program Planning, Elsevier, vol. 48(C), pages 137-148.

    More about this item


    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:epplan:v:59:y:2016:i:c:p:109-118. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/evalprogplan .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.