Printed from https://ideas.repec.org/a/eee/epplan/v76y2019ic2.html

Evaluating unintended program outcomes through Ripple Effects Mapping (REM): Application of REM using grounded theory

Authors

Listed:
  • Peterson, Christina
  • Skolits, Gary

Abstract

Several evaluation models exist for investigating unintended outcomes, including goal-free and systems evaluation. Yet methods for collecting and analyzing data on unintended outcomes remain under-utilized. Ripple Effects Mapping (REM) is a promising qualitative evaluation method with a wide range of program planning and evaluation applications. In situations where program results are likely to occur over time within complex settings, this method is useful for uncovering both intended and unintended outcomes. REM applies an Appreciative Inquiry facilitation technique to engage stakeholders in visually mapping sequences of program outcomes. Although it has been used to evaluate community development and health promotion initiatives, further methodological guidance for applying REM is still needed. The purpose of this paper is to contribute to the methodological development of evaluating unintended outcomes and extend the foundations of REM by describing steps for integrating it with grounded theory.

Suggested Citation

  • Peterson, Christina & Skolits, Gary, 2019. "Evaluating unintended program outcomes through Ripple Effects Mapping (REM): Application of REM using grounded theory," Evaluation and Program Planning, Elsevier, vol. 76(C), pages 1-1.
  • Handle: RePEc:eee:epplan:v:76:y:2019:i:c:2
    DOI: 10.1016/j.evalprogplan.2019.101677

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0149718919300072
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.evalprogplan.2019.101677?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription
    ---><---

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Sherrill, Sam, 1984. "Identifying and measuring unintended outcomes," Evaluation and Program Planning, Elsevier, vol. 7(1), pages 27-34, January.
    2. Rachel Welborn & Laura Downey & Patricia Hyjer Dyk & Pamela A. Monroe & Crystal Tyler-Mackey & Sheri L. Worthy, 2016. "Turning the Tide on Poverty: Documenting impacts through Ripple Effect Mapping," Community Development, Taylor & Francis Journals, vol. 47(3), pages 385-402, July.
    3. Eric P. S. Baumer & David Mimno & Shion Guha & Emily Quan & Geri K. Gay, 2017. "Comparing grounded theory and topic modeling: Extreme divergence or unlikely convergence?," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 68(6), pages 1397-1410, June.
    4. Rick Davies, 2018. "Representing theories of change: technical challenges with evaluation consequences," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 10(4), pages 438-461, October.
    5. Carol H. Weiss, 1997. "How Can Theory-Based Evaluation Make Greater Headway?," Evaluation Review, , vol. 21(4), pages 501-524, August.
    6. Jabeen, Sumera, 2016. "Do we really care about unintended outcomes? An analysis of evaluation theory and practice," Evaluation and Program Planning, Elsevier, vol. 55(C), pages 144-154.
    7. Thayer, Colette E. & Fine, Allison H., 2001. "Evaluation and outcome measurement in the non-profit sector: stakeholder participation," Evaluation and Program Planning, Elsevier, vol. 24(1), pages 103-108, February.
    8. Bamberger, Michael & Tarsilla, Michele & Hesse-Biber, Sharlene, 2016. "Why so many “rigorous” evaluations fail to identify unintended consequences of development programs: How mixed methods can contribute," Evaluation and Program Planning, Elsevier, vol. 55(C), pages 155-162.
    9. Patton, Michael Quinn & Horton, Douglas, 2008. "Utilization-focused evaluation for agricultural innovation," ILAC Briefs 52533, Institutional Learning and Change (ILAC) Initiative.
    10. Joanna Coast, 1999. "The appropriate uses of qualitative methods in health economics," Health Economics, John Wiley & Sons, Ltd., vol. 8(4), pages 345-353, June.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Jabeen, Sumera, 2018. "Unintended outcomes evaluation approach: A plausible way to evaluate unintended outcomes of social development programmes," Evaluation and Program Planning, Elsevier, vol. 68(C), pages 262-274.
    2. Jabeen, Sumera, 2016. "Do we really care about unintended outcomes? An analysis of evaluation theory and practice," Evaluation and Program Planning, Elsevier, vol. 55(C), pages 144-154.
    3. Ofek, Yuval, 2017. "Evaluating social exclusion interventions in university-community partnerships," Evaluation and Program Planning, Elsevier, vol. 60(C), pages 46-55.
    4. Koch, Dirk-Jan & Schulpen, Lau, 2018. "Introduction to the special issue ‘unintended effects of international cooperation’," Evaluation and Program Planning, Elsevier, vol. 68(C), pages 202-209.
    5. de Alteriis, Martin, 2020. "What can we learn about unintended consequences from a textual analysis of monitoring reports and evaluations for U.S. foreign assistance programs?," Evaluation and Program Planning, Elsevier, vol. 79(C).
    6. Smith, Jonathan D., 2017. "Positioning Missionaries in Development Studies, Policy, and Practice," World Development, Elsevier, vol. 90(C), pages 63-76.
    7. Davidson, Angus Alexander & Young, Michael Denis & Leake, John Espie & O’Connor, Patrick, 2022. "Aid and forgetting the enemy: A systematic review of the unintended consequences of international development in fragile and conflict-affected situations," Evaluation and Program Planning, Elsevier, vol. 92(C).
    8. Schuster, Roseanne C. & Brewis, Alexandra & Wutich, Amber & Safi, Christelle & Vanrespaille, Teresa Elegido & Bowen, Gina & SturtzSreetharan, Cindi & McDaniel, Anne & Ochandarena, Peggy, 2023. "Individual interviews versus focus groups for evaluations of international development programs: Systematic testing of method performance to elicit sensitive information in a justice study in Haiti," Evaluation and Program Planning, Elsevier, vol. 97(C).
    9. Florio, Massimo & Graeme, Brad & Astbury, Philip & Armstrong, Harvey W. & Audretsch, David B. & Dermastia, Mateja & Picciotto, Robert & Delponte, Laura & Rampton, James & Sartori, Davide & Vignetti, S, 2016. "Support to SMEs - Increasing research and innovation in SMEs and SME development. Final report. Work package 2," ZEW Expertises, ZEW - Leibniz Centre for European Economic Research, number 141310.
    10. von dem Knesebeck, Olaf & Joksimovic, Ljiljana & Badura, Bernhard & Siegrist, Johannes, 2002. "Evaluation of a community-level health policy intervention," Health Policy, Elsevier, vol. 61(1), pages 111-122, July.
    11. Abdul-Manan, Amir F.N. & Baharuddin, Azizan & Chang, Lee Wei, 2015. "Application of theory-based evaluation for the critical analysis of national biofuel policy: A case study in Malaysia," Evaluation and Program Planning, Elsevier, vol. 52(C), pages 39-49.
    12. Romero-Gutierrez, Miguel & Jimenez-Liso, M. Rut & Martinez-Chico, Maria, 2016. "SWOT analysis to evaluate the programme of a joint online/onsite master's degree in environmental education through the students’ perceptions," Evaluation and Program Planning, Elsevier, vol. 54(C), pages 41-49.
    13. Martens, Krystin S.R., 2018. "How program evaluators use and learn to use rubrics to make evaluative reasoning explicit," Evaluation and Program Planning, Elsevier, vol. 69(C), pages 25-32.
    14. Konrad Obermann & Jasper Scheppe & Bernd Glazinski, 2013. "More Than Figures? Qualitative Research In Health Economics," Health Economics, John Wiley & Sons, Ltd., vol. 22(3), pages 253-257, March.
    15. Massey, Oliver T., 2011. "A proposed model for the analysis and interpretation of focus groups in evaluation research," Evaluation and Program Planning, Elsevier, vol. 34(1), pages 21-28, February.
    16. Telch, Fabian, 2025. "Understanding how national development planning (NDP) shapes public institutions and procedures for development: the case of Colombia," World Development Perspectives, Elsevier, vol. 39(C).
    17. Harris, Kevin & Adams, Andrew, 2016. "Power and discourse in the politics of evidence in sport for development," Sport Management Review, Elsevier, vol. 19(2), pages 97-106.
    18. Gaia Vitrano & Guido J. L. Micheli & Francesca Marazzini & Valeria Panio & Angelo Castaldo & Alessia Marrocco & Stefano Signorini & Alessandro Marinaccio, 2024. "Examining the Complex Interaction Among Technological Innovation, Company Performance, and Occupational Safety and Health: A Mixed-Methods Study," IJERPH, MDPI, vol. 21(10), pages 1-17, October.
    19. Hart, Diane & Paucar-Caceres, Alberto, 2017. "A utilisation focussed and viable systems approach for evaluating technology supported learning," European Journal of Operational Research, Elsevier, vol. 259(2), pages 626-641.
    20. Bourgeois, Isabelle & Whynot, Jane, 2018. "The influence of evaluation recommendations on instrumental and conceptual uses: A preliminary analysis," Evaluation and Program Planning, Elsevier, vol. 68(C), pages 13-18.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:epplan:v:76:y:2019:i:c:2. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/evalprogplan.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.