
Interfacing theories of program with theories of evaluation for advancing evaluation practice: Reductionism, systems thinking, and pragmatic synthesis

Author

Listed:
  • Chen, Huey T.

Abstract

Theories of program and theories of evaluation form the foundation of program evaluation theories. Theories of program reflect assumptions about how to conceptualize an intervention program for evaluation purposes, while theories of evaluation reflect assumptions about how to design useful evaluations. These two types of theories are related but are often discussed separately. This paper uses three theoretical perspectives (reductionism, systems thinking, and pragmatic synthesis) to interface them and discusses the implications for evaluation practice. Reductionism proposes that an intervention program can be broken into crucial components for rigorous analyses; systems thinking views an intervention program as dynamic and complex, requiring a holistic examination. In spite of their contributions, reductionism and systems thinking represent the extreme ends of a theoretical spectrum; many real-world programs, however, may fall in the middle. Pragmatic synthesis is being developed to serve these moderate-complexity programs. These three theoretical perspectives have their own strengths and challenges. Knowledge of these three perspectives and their evaluation implications can provide a better guide for designing fruitful evaluations, improving the quality of evaluation practice, informing potential areas for developing cutting-edge evaluation approaches, and contributing to advancing program evaluation toward a mature applied science.

Suggested Citation

  • Chen, Huey T., 2016. "Interfacing theories of program with theories of evaluation for advancing evaluation practice: Reductionism, systems thinking, and pragmatic synthesis," Evaluation and Program Planning, Elsevier, vol. 59(C), pages 109-118.
  • Handle: RePEc:eee:epplan:v:59:y:2016:i:c:p:109-118
    DOI: 10.1016/j.evalprogplan.2016.05.012

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0149718916301008
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.evalprogplan.2016.05.012?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Chen, Huey T., 2010. "The bottom-up approach to integrative validity: A new perspective for program evaluation," Evaluation and Program Planning, Elsevier, vol. 33(3), pages 205-214, August.
    2. Cabrera, Derek & Colosi, Laura & Lobdell, Claire, 2008. "Systems thinking," Evaluation and Program Planning, Elsevier, vol. 31(3), pages 299-310, August.
    3. Mingers, John & White, Leroy, 2010. "A review of the recent contribution of systems thinking to operational research and management science," European Journal of Operational Research, Elsevier, vol. 207(3), pages 1147-1161, December.
    4. Midgley, G., 2006. "Systemic intervention for public health," American Journal of Public Health, American Public Health Association, vol. 96(3), pages 466-472.
    5. Chen, Huey T. & Yip, Fuyuen & Lavonas, Eric J. & Iqbal, Shahed & Turner, Nannette & Cobb, Bobby & Garbe, Paul, 2014. "Using the exhibited generalization approach to evaluate a carbon monoxide alarm ordinance," Evaluation and Program Planning, Elsevier, vol. 47(C), pages 35-44.
    6. Glasgow, R.E. & Lichtenstein, E. & Marcus, A.C., 2003. "Why Don't We See More Translation of Health Promotion Research to Practice? Rethinking the Efficacy-to-Effectiveness Transition," American Journal of Public Health, American Public Health Association, vol. 93(8), pages 1261-1267.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Ann Svensson, 2020. "How to Evaluate Collaboration within Research and Innovation," Journal of International Business Research and Marketing, Inovatus Services Ltd., vol. 5(3), pages 21-25, March.
    2. Omid Ali Kharazmi & Amirali Kharazmi, 2022. "A pathological analysis of challenges related to systems thinking studies in Iran," Systems Research and Behavioral Science, Wiley Blackwell, vol. 39(2), pages 241-257, March.
    3. Goodier, Sarah & Field, Carren & Goodman, Suki, 2018. "The need for theory evaluation in global citizenship programmes: The case of the GCSA programme," Evaluation and Program Planning, Elsevier, vol. 66(C), pages 7-19.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Céline Bérard & L.M., Cloutier & Luc Cassivi, 2017. "The effects of using system dynamics-based decision support models: testing policy-makers’ boundaries in a complex situation," Post-Print hal-02128255, HAL.
    2. Gerald Midgley & Erik Lindhult, 2021. "A systems perspective on systemic innovation," Systems Research and Behavioral Science, Wiley Blackwell, vol. 38(5), pages 635-670, October.
    3. Chen, Huey T. & Yip, Fuyuen & Lavonas, Eric J. & Iqbal, Shahed & Turner, Nannette & Cobb, Bobby & Garbe, Paul, 2014. "Using the exhibited generalization approach to evaluate a carbon monoxide alarm ordinance," Evaluation and Program Planning, Elsevier, vol. 47(C), pages 35-44.
    4. Céline Bérard & Martin Cloutier L. & Luc Cassivi, 2017. "The effects of using system dynamics-based decision support models: testing policy-makers’ boundaries in a complex situation," Post-Print halshs-01666605, HAL.
    5. Gonot-Schoupinsky, Freda N. & Garip, Gulcan, 2019. "A flexible framework for planning and evaluating early-stage health interventions: FRAME-IT," Evaluation and Program Planning, Elsevier, vol. 77(C).
    6. Emmanuel Njeuhmeli & Melissa Schnure & Andrea Vazzano & Elizabeth Gold & Peter Stegman & Katharine Kripke & Michel Tchuenche & Lori Bollinger & Steven Forsythe & Catherine Hankins, 2019. "Using mathematical modeling to inform health policy: A case study from voluntary medical male circumcision scale-up in eastern and southern Africa and proposed framework for success," PLOS ONE, Public Library of Science, vol. 14(3), pages 1-15, March.
    7. Finch, Caroline F & Day, Lesley & Donaldson, Alex & Segal, Leonie & Harrison, James E, 2009. "Determining policy-relevant formats for the presentation of falls research evidence," Health Policy, Elsevier, vol. 93(2-3), pages 207-213, December.
    8. Smith, Chris M. & Shaw, Duncan, 2019. "The characteristics of problem structuring methods: A literature review," European Journal of Operational Research, Elsevier, vol. 274(2), pages 403-416.
    9. Saria Hassan & Alexis Cooke & Haneefa Saleem & Dorothy Mushi & Jessie Mbwambo & Barrot H. Lambdin, 2019. "Evaluating the Integrated Methadone and Anti-Retroviral Therapy Strategy in Tanzania Using the RE-AIM Framework," IJERPH, MDPI, vol. 16(5), pages 1-15, February.
    10. Archibald, Thomas, 2015. "“They Just Know”: The epistemological politics of “evidence-based” non-formal education," Evaluation and Program Planning, Elsevier, vol. 48(C), pages 137-148.
    11. Trutnevyte, Evelina & Stauffacher, Michael & Scholz, Roland W., 2012. "Linking stakeholder visions with resource allocation scenarios and multi-criteria assessment," European Journal of Operational Research, Elsevier, vol. 219(3), pages 762-772.
    12. Urban, Jennifer Brown & Hargraves, Monica & Trochim, William M., 2014. "Evolutionary Evaluation: Implications for evaluators, researchers, practitioners, funders and the evidence-based program mandate," Evaluation and Program Planning, Elsevier, vol. 45(C), pages 127-139.
    13. Rodney J. Scott & Robert Y. Cavana & Donald Cameron, 2016. "Client Perceptions of Reported Outcomes of Group Model Building in the New Zealand Public Sector," Group Decision and Negotiation, Springer, vol. 25(1), pages 77-101, January.
    14. Nam C. Nguyen & Ockie J. H. Bosch, 2014. "The Art of Interconnected Thinking: Starting with the Young," Challenges, MDPI, vol. 5(2), pages 1-21, August.
    15. Miguel Afonso Sellitto & Guilherme Schreiber Pereira & Rafael Marques & Daniel Pacheco Lacerda, 2018. "Systemic Understanding of Coopetitive Behaviour in a Latin American Technological Park," Systemic Practice and Action Research, Springer, vol. 31(5), pages 479-494, October.
    16. Wasserman, Deborah L., 2010. "Using a systems orientation and foundational theory to enhance theory-driven human service program evaluations," Evaluation and Program Planning, Elsevier, vol. 33(2), pages 67-80, May.
    17. Estabrooks, Carole A. & Norton, Peter & Birdsell, Judy M. & Newton, Mandi S. & Adewale, Adeniyi J. & Thornley, Richard, 2008. "Knowledge translation and research careers: Mode I and Mode II activity among health researchers," Research Policy, Elsevier, vol. 37(6-7), pages 1066-1078, July.
    18. Natalie Bradford & Shirley Chambers & Adrienne Hudson & Jacqui Jauncey‐Cooke & Robyn Penny & Carol Windsor & Patsy Yates, 2019. "Evaluation frameworks in health services: An integrative review of use, attributes and elements," Journal of Clinical Nursing, John Wiley & Sons, vol. 28(13-14), pages 2486-2498, July.
    19. Chen, Huey T., 2010. "The bottom-up approach to integrative validity: A new perspective for program evaluation," Evaluation and Program Planning, Elsevier, vol. 33(3), pages 205-214, August.
    20. Rieckmann, Traci R. & Kovas, Anne E. & Cassidy, Elaine F. & McCarty, Dennis, 2011. "Employing policy and purchasing levers to increase the use of evidence-based practices in community-based substance abuse treatment settings: Reports from single state authorities," Evaluation and Program Planning, Elsevier, vol. 34(4), pages 366-374, November.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:epplan:v:59:y:2016:i:c:p:109-118. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/evalprogplan .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.