
Can Non-Randomised Studies of Interventions Provide Unbiased Effect Estimates? A Systematic Review of Internal Replication Studies

Author

Listed:
  • Hugh Sharma Waddington
  • Paul Fenton Villar
  • Jeffrey C. Valentine

Abstract

Non-randomized studies of intervention effects (NRS), also called quasi-experiments, provide useful decision support about development impacts. However, the assumptions underpinning them are usually untestable, their verification resting on empirical replication. The internal replication study aims to do this by comparing results from a causal benchmark study, usually a randomized controlled trial (RCT), with those from an NRS conducted at the same time in the sampled population. We aimed to determine the credibility and generalizability of findings in internal replication studies in development economics, through a systematic review and meta-analysis. We systematically searched for internal replication studies of RCTs conducted on socioeconomic interventions in low- and middle-income countries. We critically appraised the benchmark randomized studies, using an adapted tool. We extracted and statistically synthesized empirical measures of bias. We included 600 estimates of correspondence between NRS and benchmark RCTs. All internal replication studies were found to have at least “some concerns” about bias and some had high risk of bias. We found that study designs with selection on unobservables, in particular regression discontinuity, on average produced absolute standardized bias estimates that were approximately zero, that is, equivalent to the estimates produced by RCTs. But study conduct also mattered. For example, matching using pre-tests and nearest neighbor algorithms corresponded more closely to the benchmarks. The findings from this systematic review confirm that NRS can produce unbiased estimates. Authors of internal replication studies should publish pre-analysis protocols to enhance their credibility.
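
The correspondence measures synthesized in the review compare each NRS estimate against its RCT benchmark. This page does not reproduce the authors' exact formula, but a minimal sketch of one common formulation of absolute standardized bias appears below; the function name, the example numbers, and the choice to scale by the outcome's standard deviation are illustrative assumptions, not the paper's method.

```python
# Illustrative sketch (not the authors' code): one common way to quantify how
# closely a non-randomised study (NRS) estimate tracks its RCT benchmark.
# Scaling by the outcome standard deviation is an assumption here; the paper
# may standardize differently (e.g., by the benchmark's standard error).

def absolute_standardized_bias(nrs_estimate: float,
                               rct_estimate: float,
                               outcome_sd: float) -> float:
    """Absolute gap between NRS and RCT effect estimates, in SD units."""
    return abs(nrs_estimate - rct_estimate) / outcome_sd

# Hypothetical example: an NRS matching estimate of 0.21 vs. an RCT benchmark
# of 0.18, with outcome SD 1.0, gives a standardized bias of about 0.03 --
# close to zero, the pattern the review reports for well-conducted designs.
print(absolute_standardized_bias(0.21, 0.18, 1.0))  # ~0.03
```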

Suggested Citation

  • Hugh Sharma Waddington & Paul Fenton Villar & Jeffrey C. Valentine, 2023. "Can Non-Randomised Studies of Interventions Provide Unbiased Effect Estimates? A Systematic Review of Internal Replication Studies," Evaluation Review, vol. 47(3), pages 563-593, June.
  • Handle: RePEc:sae:evarev:v:47:y:2023:i:3:p:563-593
    DOI: 10.1177/0193841X221116721

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/0193841X221116721
    Download Restriction: no

    File URL: https://libkey.io/10.1177/0193841X221116721?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Willa Friedman & Michael Kremer & Edward Miguel & Rebecca Thornton, 2016. "Education as Liberation?," Economica, London School of Economics and Political Science, vol. 83(329), pages 1-30, January.
    2. Galiani, Sebastian & McEwan, Patrick J., 2013. "The heterogeneous impact of conditional cash transfers," Journal of Public Economics, Elsevier, vol. 103(C), pages 85-96.
    3. Smith, Jeffrey A. & Todd, Petra E., 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    4. Eva Vivalt, 2020. "How Much Can We Generalize From Impact Evaluations?," Journal of the European Economic Association, European Economic Association, vol. 18(6), pages 3045-3089.
    5. David S. Lee & Thomas Lemieux, 2010. "Regression Discontinuity Designs in Economics," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 281-355, June.
    6. Steven Glazerman & Dan M. Levy & David Myers, 2003. "Nonexperimental Versus Experimental Estimates of Earnings Impacts," The ANNALS of the American Academy of Political and Social Science, vol. 589(1), pages 63-93, September.
    7. David McKenzie & John Gibson & Steven Stillman, 2010. "How Important Is Selection? Experimental vs. Non-Experimental Measures of the Income Gains from Migration," Journal of the European Economic Association, MIT Press, vol. 8(4), pages 913-945, June.
    8. Matias Busso & John DiNardo & Justin McCrary, 2014. "New Evidence on the Finite Sample Properties of Propensity Score Reweighting and Matching Estimators," The Review of Economics and Statistics, MIT Press, vol. 96(5), pages 885-897, December.
    9. Luis Rubalcava & Graciela Teruel & Duncan Thomas, 2009. "Investments, Time Preferences, and Public Transfers Paid to Women," Economic Development and Cultural Change, University of Chicago Press, vol. 57(3), pages 507-538, April.
    10. Maluccio, John A. & Flores, Rafael, 2005. "Impact evaluation of a conditional cash transfer program: the Nicaraguan Red de Protección Social," Research reports 141, International Food Policy Research Institute (IFPRI).
    11. Shayda Mae Sabet & Annette N. Brown, 2018. "Is impact evaluation still on the rise? The new trends in 2010–2015," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 10(3), pages 291-304, July.
    12. Peter Z. Schochet, "undated". "Technical Methods Report: Statistical Power for Regression Discontinuity Designs in Education Evaluations," Mathematica Policy Research Reports 61fb6c057561451a8a6074508, Mathematica Policy Research.
    13. Buddelmeyer, Hielke & Skoufias, Emmanuel, 2003. "An Evaluation of the Performance of Regression Discontinuity Design on PROGRESA," IZA Discussion Papers 827, Institute of Labor Economics (IZA).
    14. Sudhanshu Handa & John A. Maluccio, 2010. "Matching the Gold Standard: Comparing Experimental and Nonexperimental Evaluation Techniques for a Geographically Targeted Program," Economic Development and Cultural Change, University of Chicago Press, vol. 58(3), pages 415-447, April.
    15. Felipe Barrera-Osorio & Deon Filmer, 2016. "Incentivizing Schooling for Learning: Evidence on the Impact of Alternative Targeting Approaches," Journal of Human Resources, University of Wisconsin Press, vol. 51(2), pages 461-499.
    16. Glewwe, Paul & Kremer, Michael & Moulin, Sylvie & Zitzewitz, Eric, 2004. "Retrospective vs. prospective analyses of school inputs: the case of flip charts in Kenya," Journal of Development Economics, Elsevier, vol. 74(1), pages 251-268, June.
    17. Jose Urquieta & Gustavo Angeles & Thomas Mroz & Hector Lamadrid-Figueroa & Bernardo Hernández, 2009. "Impact of Oportunidades on Skilled Attendance at Delivery in Rural Areas," Economic Development and Cultural Change, University of Chicago Press, vol. 57(3), pages 539-558, April.
    18. Angelucci, Manuela & De Giorgi, Giacomo, 2006. "Indirect Effects of an Aid Program: The Case of Progresa and Consumption," IZA Discussion Papers 1955, Institute of Labor Economics (IZA).
    19. Hector Lamadrid-Figueroa & Gustavo Angeles & Thomas Mroz & Jose Urquieta-Salomon & Bernardo Hernandez-Prado & Aurelio Cruz-Valdez & Martha Tellez-Rojo, 2010. "Heterogeneous impact of the social programme Oportunidades on use of contraceptive methods by young adult women living in rural areas," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 2(1), pages 74-86.
    20. Paul Fenton Villar & Hugh Waddington, 2019. "Within study comparisons and risk of bias in international development: Systematic review and critical appraisal," Campbell Systematic Reviews, John Wiley & Sons, vol. 15(1-2), June.
    21. Skoufias, Emmanuel & Davis, Benjamin & de la Vega, Sergio, 2001. "Targeting the Poor in Mexico: An Evaluation of the Selection of Households into PROGRESA," World Development, Elsevier, vol. 29(10), pages 1769-1784, October.
    22. Juan Jose Diaz & Sudhanshu Handa, 2006. "An Assessment of Propensity Score Matching as a Nonexperimental Impact Estimator: Evidence from Mexico's PROGRESA Program," Journal of Human Resources, University of Wisconsin Press, vol. 41(2).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Henrik Hansen & Ninja Ritter Klejnstrup & Ole Winckler Andersen, 2011. "A Comparison of Model-based and Design-based Impact Evaluations of Interventions in Developing Countries," IFRO Working Paper 2011/16, University of Copenhagen, Department of Food and Resource Economics.
    2. Vivian C. Wong & Peter M. Steiner & Kylie L. Anglin, 2018. "What Can Be Learned From Empirical Evaluations of Nonexperimental Methods?," Evaluation Review, vol. 42(2), pages 147-175, April.
    3. Jeffrey Smith & Arthur Sweetman, 2016. "Viewpoint: Estimating the causal effects of policies and programs," Canadian Journal of Economics, Canadian Economics Association, vol. 49(3), pages 871-905, August.
    4. Duflo, Esther & Glennerster, Rachel & Kremer, Michael, 2008. "Using Randomization in Development Economics Research: A Toolkit," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 61, pages 3895-3962, Elsevier.
    5. Ferraro, Paul J. & Miranda, Juan José, 2014. "The performance of non-experimental designs in the evaluation of environmental programs: A design-replication study using a large-scale randomized experiment as a benchmark," Journal of Economic Behavior & Organization, Elsevier, vol. 107(PA), pages 344-365.
    6. Justine Burns & Malcolm Keswell & Rebecca Thornton, 2009. "Evaluating the Impact of Health Programmes," SALDRU Working Papers 40, Southern Africa Labour and Development Research Unit, University of Cape Town.
    7. Solomon Asfaw & Silvio Daidone & Benjamin Davis & Josh Dewbre & Alessandro Romeo & Paul Winters & Katia Covarrubias & Habiba Djebbari, 2012. "Analytical Framework for Evaluating the Productive Impact of Cash Transfer Programmes on Household Behaviour – Methodological Guidelines for the From Protection to Production Project," Working Papers 101, International Policy Centre for Inclusive Growth.
    8. Katherine Baicker & Theodore Svoronos, 2019. "Testing the Validity of the Single Interrupted Time Series Design," NBER Working Papers 26080, National Bureau of Economic Research, Inc.
    9. Anders Stenberg & Xavier Luna & Olle Westerlund, 2014. "Does Formal Education for Older Workers Increase Earnings? — Evidence Based on Rich Data and Long-term Follow-up," LABOUR, CEIS, vol. 28(2), pages 163-189, June.
    10. Michael Clemens & Erwin Tiongson, 2012. "Split Decisions: Family finance when a policy discontinuity allocates overseas work," RF Berlin - CReAM Discussion Paper Series 1234, Rockwool Foundation Berlin (RF Berlin) - Centre for Research and Analysis of Migration (CReAM).
    11. Robin Jacob & Marie-Andree Somers & Pei Zhu & Howard Bloom, 2016. "The Validity of the Comparative Interrupted Time Series Design for Evaluating the Effect of School-Level Interventions," Evaluation Review, vol. 40(3), pages 167-198, June.
    12. Daniel Zaga, 2014. "The Impact of Three Mexican Nutritional Programs: The Case of Dif-Puebla," CFD Working Papers 09-2015, Centre for Finance and Development, The Graduate Institute.
    13. Anders Stenberg & Olle Westerlund, 2015. "The long-term earnings consequences of general vs. specific training of the unemployed," IZA Journal of European Labor Studies, Springer;Forschungsinstitut zur Zukunft der Arbeit GmbH (IZA), vol. 4(1), pages 1-26, December.
    14. Melba V. Tutor, 2014. "The Impact of Philippines’ Conditional Cash Transfer Program on Consumption," UP School of Economics Discussion Papers 201405, University of the Philippines School of Economics.
    15. Peter M. Steiner, 2011. "Propensity Score Methods for Causal Inference: On the Relative Importance of Covariate Selection, Reliable Measurement, and Choice of Propensity Score Technique," Working Papers 09, AlmaLaurea Inter-University Consortium.
    16. Black, Dan A. & Joo, Joonhwi & LaLonde, Robert & Smith, Jeffrey A. & Taylor, Evan J., 2022. "Simple Tests for Selection: Learning More from Instrumental Variables," Labour Economics, Elsevier, vol. 79(C).
    17. Jere Behrman & Susan Parker, 2013. "Is Health of the Aging Improved by Conditional Cash Transfer Programs? Evidence From Mexico," Demography, Springer;Population Association of America (PAA), vol. 50(4), pages 1363-1386, August.
    18. Chakravarty, Shubha & Lundberg, Mattias & Nikolov, Plamen & Zenker, Juliane, 2019. "Vocational training programs and youth labor market outcomes: Evidence from Nepal," Journal of Development Economics, Elsevier, vol. 136(C), pages 71-110.
    19. Melba V. Tutor, 2014. "The impact of the Philippines' conditional cash transfer program on consumption," Philippine Review of Economics, University of the Philippines School of Economics and Philippine Economic Society, vol. 51(1), pages 117-161, June.
    20. Halldén, Karin & Stenberg, Anders, 2013. "The Relationship between Hours of Domestic Services and Female Earnings: Panel Register Data Evidence from a Reform," Working Paper Series 4/2013, Stockholm University, Swedish Institute for Social Research.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:sae:evarev:v:47:y:2023:i:3:p:563-593. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact SAGE Publications.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.