Printed from https://ideas.repec.org/a/sae/evarev/v21y1997i4p501-524.html

How Can Theory-Based Evaluation Make Greater Headway?

Author

Listed:
  • Carol H. Weiss

    (Harvard University)

Abstract

The idea of theory-based evaluation (TBE) is plausible and cogent, and it promises to bring greater explanatory power to evaluation. However, problems beset its use, including inadequate theories about pathways to desired outcomes in many program areas, confusion between theories of implementation and theories of programmatic action, difficulties in eliciting or constructing usable theories, measurement error, complexities in analysis, and others. This article explores the problems, describes the nature of potential benefits, and suggests that the benefits are significant enough to warrant continued effort to overcome the obstacles and advance the feasibility of TBE.

Suggested Citation

  • Carol H. Weiss, 1997. "How Can Theory-Based Evaluation Make Greater Headway?," Evaluation Review, vol. 21(4), pages 501-524, August.
  • Handle: RePEc:sae:evarev:v:21:y:1997:i:4:p:501-524
    DOI: 10.1177/0193841X9702100405

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/0193841X9702100405
    Download Restriction: no

    File URL: https://libkey.io/10.1177/0193841X9702100405?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
    ---><---

    References listed on IDEAS

    1. Lipsey, Mark W. & Pollard, John A., 1989. "Driving toward theory in program evaluation: More models to choose from," Evaluation and Program Planning, Elsevier, vol. 12(4), pages 317-328, January.
    2. Patton, Michael Quinn, 1989. "A context and boundaries for a theory-driven approach to validity," Evaluation and Program Planning, Elsevier, vol. 12(4), pages 375-377, January.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Peterson, Christina & Skolits, Gary, 2019. "Evaluating unintended program outcomes through Ripple Effects Mapping (REM): Application of REM using grounded theory," Evaluation and Program Planning, Elsevier, vol. 76(C), pages 1-1.
    2. Ofek, Yuval, 2017. "Evaluating social exclusion interventions in university-community partnerships," Evaluation and Program Planning, Elsevier, vol. 60(C), pages 46-55.
    3. Goldberg, Jessica & Bumgarner, Erin & Jacobs, Francine, 2016. "Measuring program- and individual-level fidelity in a home visiting program for adolescent parents," Evaluation and Program Planning, Elsevier, vol. 55(C), pages 163-173.
    4. Harman, Elena & Azzam, Tarek, 2018. "Towards program theory validation: Crowdsourcing the qualitative analysis of participant experiences," Evaluation and Program Planning, Elsevier, vol. 66(C), pages 183-194.
    5. Fred Coalter, 2015. "Sport-for-Change: Some Thoughts from a Sceptic," Social Inclusion, Cogitatio Press, vol. 3(3), pages 19-23.
    6. Jabeen, Sumera, 2016. "Do we really care about unintended outcomes? An analysis of evaluation theory and practice," Evaluation and Program Planning, Elsevier, vol. 55(C), pages 144-154.
    7. Fred Coalter, 2017. "Sport and Social Inclusion: Evidence-Based Policy and Practice," Social Inclusion, Cogitatio Press, vol. 5(2), pages 141-149.
    8. Rob Tulder & M. May Seitanidi & Andrew Crane & Stephen Brammer, 2016. "Enhancing the Impact of Cross-Sector Partnerships," Journal of Business Ethics, Springer, vol. 135(1), pages 1-17, April.
    9. Harris, Kevin & Adams, Andrew, 2016. "Power and discourse in the politics of evidence in sport for development," Sport Management Review, Elsevier, vol. 19(2), pages 97-106.
    10. Massimo FLORIO & Aleksandra PARTEKA & Emanuela SIRTORI, 2016. "The Role of EU Policy in Supporting Technological Innovation in SMEs - a Bayesian Network Analysis of Firm-Level Data from Poland," Departmental Working Papers 2016-13, Department of Economics, Management and Quantitative Methods at Università degli Studi di Milano.
    11. Florio, Massimo & Graeme, Brad & Astbury, Philip & Armstrong, Harvey W. & Audretsch, David B. & Dermastia, Mateja & Picciotto, Robert & Delponte, Laura & Rampton, James & Sartori, Davide & Vignetti, S, 2016. "Support to SMEs - Increasing research and innovation in SMEs and SME development. Final report. Work package 2," ZEW Expertises, ZEW - Leibniz Centre for European Economic Research, number 141310.
    12. Armstrong, Natalie & Brewster, Liz & Tarrant, Carolyn & Dixon, Ruth & Willars, Janet & Power, Maxine & Dixon-Woods, Mary, 2018. "Taking the heat or taking the temperature? A qualitative study of a large-scale exercise in seeking to measure for improvement, not blame," Social Science & Medicine, Elsevier, vol. 198(C), pages 157-164.
    13. Helitzer, Deborah L. & Sussman, Andrew L. & Hoffman, Richard M. & Getrich, Christina M. & Warner, Teddy D. & Rhyne, Robert L., 2014. "Along the way to developing a theory of the program: A re-examination of the conceptual framework as an organizing strategy," Evaluation and Program Planning, Elsevier, vol. 45(C), pages 157-163.
    14. Nogueira-Jr, Cassimiro & Padoveze, Maria Clara, 2018. "Public policies on healthcare associated infections: A case study of three countries," Health Policy, Elsevier, vol. 122(9), pages 991-1000.
    15. Hart, Diane & Paucar-Caceres, Alberto, 2017. "A utilisation focussed and viable systems approach for evaluating technology supported learning," European Journal of Operational Research, Elsevier, vol. 259(2), pages 626-641.
    16. Khembo, Felix & Chapman, Sarah, 2017. "A formative evaluation of the recovery public works programme in Blantyre City, Malawi," Evaluation and Program Planning, Elsevier, vol. 61(C), pages 8-21.
    17. Jabeen, Sumera, 2018. "Unintended outcomes evaluation approach: A plausible way to evaluate unintended outcomes of social development programmes," Evaluation and Program Planning, Elsevier, vol. 68(C), pages 262-274.
    18. von dem Knesebeck, Olaf & Joksimovic, Ljiljana & Badura, Bernhard & Siegrist, Johannes, 2002. "Evaluation of a community-level health policy intervention," Health Policy, Elsevier, vol. 61(1), pages 111-122, July.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Kent, Douglas R. & Donaldson, Stewart I. & Wyrick, Phelan A. & Smith, Peggy J., 2000. "Evaluating criminal justice programs designed to reduce crime by targeting repeat gang offenders," Evaluation and Program Planning, Elsevier, vol. 23(1), pages 115-124, February.
    2. Park, Chul Hyun & Welch, Eric W. & Sriraj, P.S., 2016. "An integrative theory-driven framework for evaluating travel training programs," Evaluation and Program Planning, Elsevier, vol. 59(C), pages 7-20.
    3. Anthony Petrosino, 2000. "Mediators and Moderators in the Evaluation of Programs for Children," Evaluation Review, , vol. 24(1), pages 47-72, February.
    4. Salvador Moscoso & Francisco Tello & José López, 2006. "Using Generalizability Theory to Assess the Validity of the Evaluation Process," Quality & Quantity: International Journal of Methodology, Springer, vol. 40(3), pages 315-329, June.
    5. Asghari, Shabnam & Heeley, Thomas & Bethune, Cheri & Graham, Wendy & MacLellan, Cameron & Button, Cathryn & Porter, Nicole & Parsons, Sandra, 2021. "Evaluation plan of the 6for6 research skills program for rural and remote physicians," Evaluation and Program Planning, Elsevier, vol. 87(C).
    6. Geoffrey J. Syme & Brian S. Sadler, 1994. "Evaluation of Public Involvement in Water Resources Planning," Evaluation Review, , vol. 18(5), pages 523-542, October.
    7. Kalpazidou Schmidt, Evanthia & Graversen, Ebbe Krogh, 2020. "Developing a conceptual evaluation framework for gender equality interventions in research and innovation," Evaluation and Program Planning, Elsevier, vol. 79(C).
    8. Victoria A. Johnson & Kevin R. Ronan & David M. Johnston & Robin Peace, 2016. "Improving the Impact and Implementation of Disaster Education: Programs for Children Through Theory‐Based Evaluation," Risk Analysis, John Wiley & Sons, vol. 36(11), pages 2120-2135, November.
    9. Nesman, Teresa M. & Batsche, Catherine & Hernandez, Mario, 2007. "Theory-based evaluation of a comprehensive Latino education initiative: An interactive evaluation approach," Evaluation and Program Planning, Elsevier, vol. 30(3), pages 267-281, August.
    10. Downes, Jenni & Gullickson, Amy M., 2022. "What does it mean for an evaluation to be ‘valid’? A critical synthesis of evaluation literature," Evaluation and Program Planning, Elsevier, vol. 91(C).
    11. Goodier, Sarah & Field, Carren & Goodman, Suki, 2018. "The need for theory evaluation in global citizenship programmes: The case of the GCSA programme," Evaluation and Program Planning, Elsevier, vol. 66(C), pages 7-19.
    12. Gravel, Jason & Bouchard, Martin & Descormiers, Karine & Wong, Jennifer S. & Morselli, Carlo, 2013. "Keeping promises: A systematic review and a new classification of gang control strategies," Journal of Criminal Justice, Elsevier, vol. 41(4), pages 228-242.
    13. Jeremiah, Rohan D. & Quinn, Camille R. & Alexis, Jicinta M., 2018. "Lessons learned: Evaluating the program fidelity of UNWomen Partnership for Peace domestic violence diversion program in the Eastern Caribbean," Evaluation and Program Planning, Elsevier, vol. 69(C), pages 61-67.
    14. Brousselle, Astrid & Champagne, François, 2011. "Program theory evaluation: Logic analysis," Evaluation and Program Planning, Elsevier, vol. 34(1), pages 69-78, February.
    15. Sowl, Stephanie & Amrein-Beardsley, Audrey & Collins, Clarin, 2022. "Teaching program evaluation: How blending theory and practice enhance student-evaluator competencies in an education policy graduate program," Evaluation and Program Planning, Elsevier, vol. 94(C).
    16. Nichols, Laura, 2002. "Participatory program planning: including program participants and evaluators," Evaluation and Program Planning, Elsevier, vol. 25(1), pages 1-14, February.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:sae:evarev:v:21:y:1997:i:4:p:501-524. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: SAGE Publications (email available below).

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.