Printed from https://ideas.repec.org/a/sae/evarev/v47y2023i2p209-230.html

2020 Rossi Award Lecture: The Evolving Art of Program Evaluation

Author

Listed:
  • Randall S. Brown

Abstract

Evaluation of public programs has undergone many changes over the past four decades since Peter Rossi coined his "Iron Law" of program evaluation: "The expected value of any net impact assessment of any large-scale social program is zero." While that assessment may be somewhat overstated, the essence still holds. The failures far outnumber the successes, and the estimated favorable effects are rarely sizeable. Despite this grim assessment, much can be learned from "failed" experiments, and from ones that are successful in only some sites or subgroups. Advances in study design, statistical models, data, and how inferences are drawn from estimates have substantially improved our analyses and will continue to do so. However, most of the actual learning about "what works" (and why, when, and where) is likely to come from gathering more detailed and comprehensive data on how the intervention was implemented and attempting to link those data to estimated impacts. Researchers need detailed data on the target population served, the content of the intervention, and the process by which it is delivered to participating service providers and individuals. Two examples presented here illustrate how researchers drew useful broader lessons from impact estimates for a set of related programs. Rossi posited three reasons most interventions fail: wrong question, wrong intervention, poor implementation. Speeding the accumulation of wisdom about how social programs can best help vulnerable populations will require that researchers work closely with program funders, developers, operators, and participants to gather and interpret these detailed data about program implementation.

Suggested Citation

  • Randall S. Brown, 2023. "2020 Rossi Award Lecture: The Evolving Art of Program Evaluation," Evaluation Review, vol. 47(2), pages 209-230, April.
  • Handle: RePEc:sae:evarev:v:47:y:2023:i:2:p:209-230
    DOI: 10.1177/0193841X221121241

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/0193841X221121241
    Download Restriction: no

    File URL: https://libkey.io/10.1177/0193841X221121241?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. John P. A. Ioannidis & T. D. Stanley & Hristos Doucouliagos, 2017. "The Power of Bias in Economics Research," Economic Journal, Royal Economic Society, vol. 127(605), pages 236-265, October.
    2. Deborah Peikes & Arnold Chen & Jennifer Schore & Randall Brown, 2009. "Effects of Care Coordination on Hospitalization, Quality of Care, and Health Care Expenditures Among Medicare Beneficiaries: 15 Randomized Trials," Mathematica Policy Research Reports ce70f11be1b44e2c8590b9cf5, Mathematica Policy Research.
    4. Heckman, James, 2013. "Sample selection bias as a specification error," Applied Econometrics, Russian Presidential Academy of National Economy and Public Administration (RANEPA), vol. 31(3), pages 129-137.
    5. Angus Deaton, 2020. "Randomization in the Tropics Revisited: a Theme and Eleven Variations," NBER Working Papers 27600, National Bureau of Economic Research, Inc.
    7. Deaton, Angus & Cartwright, Nancy, 2018. "Understanding and misunderstanding randomized controlled trials," Social Science & Medicine, Elsevier, vol. 210(C), pages 2-21.
    8. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    2. Jindrich Matousek & Tomas Havranek & Zuzana Irsova, 2022. "Individual discount rates: a meta-analysis of experimental evidence," Experimental Economics, Springer;Economic Science Association, vol. 25(1), pages 318-358, February.
    3. Nick Huntington‐Klein & Andreu Arenas & Emily Beam & Marco Bertoni & Jeffrey R. Bloem & Pralhad Burli & Naibin Chen & Paul Grieco & Godwin Ekpe & Todd Pugatch & Martin Saavedra & Yaniv Stopnitzky, 2021. "The influence of hidden researcher decisions in applied microeconomics," Economic Inquiry, Western Economic Association International, vol. 59(3), pages 944-960, July.
    4. Cloos, Janis & Greiff, Matthias & Rusch, Hannes, 2020. "Geographical Concentration and Editorial Favoritism within the Field of Laboratory Experimental Economics (RM/19/029-revised-)," Research Memorandum 014, Maastricht University, Graduate School of Business and Economics (GSBE).
    5. Doucouliagos, Hristos & Paldam, Martin & Stanley, T.D., 2018. "Skating on thin evidence: Implications for public policy," European Journal of Political Economy, Elsevier, vol. 54(C), pages 16-25.
    6. Vellore Arthi & James Fenske, 2018. "Polygamy and child mortality: Historical and modern evidence from Nigeria’s Igbo," Review of Economics of the Household, Springer, vol. 16(1), pages 97-141, March.
    7. Dreber, Anna & Johannesson, Magnus, 2023. "A framework for evaluating reproducibility and replicability in economics," Ruhr Economic Papers 1055, RWI - Leibniz-Institut für Wirtschaftsforschung, Ruhr-University Bochum, TU Dortmund University, University of Duisburg-Essen.
    8. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
    9. Bravo-Ureta, Boris E. & Higgins, Daniel & Arslan, Aslihan, 2020. "Irrigation infrastructure and farm productivity in the Philippines: A stochastic Meta-Frontier analysis," World Development, Elsevier, vol. 135(C).
    10. Ogundari, Kolawole, 2021. "A systematic review of statistical methods for estimating an education production function," MPRA Paper 105283, University Library of Munich, Germany.
    11. Burlig, Fiona, 2018. "Improving transparency in observational social science research: A pre-analysis plan approach," Economics Letters, Elsevier, vol. 168(C), pages 56-60.
    12. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    13. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    14. Giovanni Dosi, 2022. "The Agenda for Evolutionary Economics: Results, Dead Ends, and Challenges Ahead," LEM Papers Series 2022/24, Laboratory of Economics and Management (LEM), Sant'Anna School of Advanced Studies, Pisa, Italy.
    15. Lota Tamini & Ibrahima Bocoum & Ghislain Auger & Kotchikpa Gabriel Lawin & Arahama Traoré, 2019. "Enhanced Microfinance Services and Agricultural Best Management Practices: What Benefits for Smallholders Farmers? An Evidence from Burkina Faso," CIRANO Working Papers 2019s-11, CIRANO.
    16. James Mahoney & Andrew Owen, 2022. "Importing set-theoretic tools into quantitative research: the case of necessary and sufficient conditions," Quality & Quantity: International Journal of Methodology, Springer, vol. 56(4), pages 2001-2022, August.
    17. Cloos, Janis & Greiff, Matthias & Rusch, Hannes, 2019. "Geographical Concentration and Editorial Favoritism within the Field of Laboratory Experimental Economics," Research Memorandum 029, Maastricht University, Graduate School of Business and Economics (GSBE).
    18. Brodeur, Abel & Cook, Nikolai & Neisser, Carina, 2022. "P-Hacking, Data Type and Data-Sharing Policy," IZA Discussion Papers 15586, Institute of Labor Economics (IZA).
    19. Asatryan, Zareh & Havlik, Annika & Heinemann, Friedrich & Nover, Justus, 2020. "Biases in fiscal multiplier estimates," European Journal of Political Economy, Elsevier, vol. 63(C).
    20. George W. Norton, 2020. "Lessons from a Career in Agricultural Development and Research Evaluation," Applied Economic Perspectives and Policy, John Wiley & Sons, vol. 42(2), pages 151-167, June.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:sae:evarev:v:47:y:2023:i:2:p:209-230. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows your profile to be linked to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: SAGE Publications (email available below).

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.