
The performance of non-experimental designs in the evaluation of environmental programs: A design-replication study using a large-scale randomized experiment as a benchmark

Author

Listed:
  • Ferraro, Paul J.
  • Miranda, Juan José

Abstract

In the field of environmental policy, randomized evaluation designs are rare. Thus, researchers typically rely on observational designs to evaluate program impacts. To assess the ability of observational designs to replicate the results of experimental designs, researchers use design-replication studies. In our design-replication study, we use data from a large-scale, randomized field experiment that tested the effectiveness of norm-based messages designed to induce voluntary reductions in water use. We attempt to replicate the experimental results using a nonrandomized comparison group and statistical techniques to eliminate or mitigate observable and unobservable sources of bias. In a companion study, Ferraro and Miranda (2013a) replicate the experimental estimates by following best practices to select a non-experimental control group, by using a rich data set on observable characteristics that includes repeated pre- and post-treatment outcome measures, and by combining panel data methods and matching designs. We assess whether non-experimental designs continue to replicate the experimental benchmark when the data are far less rich, as is often the case in environmental policy evaluation. Trimming, inverse probability weighting, and simple difference-in-differences designs perform poorly. Pre-processing the data by matching and then estimating the treatment effect with ordinary least squares (OLS) regression performs best, but a bootstrapping exercise suggests that performance can be sensitive to the sample (though far less sensitive than OLS without pre-processing).
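The best-performing design described in the abstract, pre-processing by matching and then estimating the treatment effect with OLS, can be illustrated with a minimal sketch. The code below is not the authors' implementation: it runs on simulated data, and the variable names (pre_use, income, post_use, treated) and the specific matching choices (1-to-1 nearest-neighbor matching on an estimated propensity score, with replacement) are assumptions made purely for illustration.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Simulated household records standing in for the water-use data (illustrative only).
n = 2000
df = pd.DataFrame({
    "pre_use": rng.normal(50, 10, n),   # pre-treatment water use
    "income": rng.normal(60, 15, n),    # observable covariate
})

# Non-random selection into treatment based on observables (the source of bias).
logit = 0.03 * (df["pre_use"] - 50) + 0.02 * (df["income"] - 60)
df["treated"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Post-treatment outcome with an assumed true effect of -2.5 units for treated households.
df["post_use"] = df["pre_use"] + rng.normal(0, 5, n) - 2.5 * df["treated"]

# Step 1: estimate propensity scores from the observables.
X = df[["pre_use", "income"]].values
df["pscore"] = LogisticRegression().fit(X, df["treated"]).predict_proba(X)[:, 1]

# Step 2: 1-to-1 nearest-neighbor matching on the propensity score, with replacement.
treated = df[df["treated"] == 1]
control = df[df["treated"] == 0]
matched_idx = [(control["pscore"] - p).abs().idxmin() for p in treated["pscore"]]
matched = pd.concat([treated, control.loc[matched_idx]])

# Step 3: estimate the treatment effect by OLS on the matched sample,
# adjusting again for the covariates used in the matching step.
ols = smf.ols("post_use ~ treated + pre_use + income", data=matched).fit(cov_type="HC1")
print("estimated effect:", round(ols.params["treated"], 2),
      "s.e.:", round(ols.bse["treated"], 2))

Repeating the matching and OLS steps within a bootstrap over re-drawn samples is one way to probe the sample sensitivity that the abstract mentions.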

Suggested Citation

  • Ferraro, Paul J. & Miranda, Juan José, 2014. "The performance of non-experimental designs in the evaluation of environmental programs: A design-replication study using a large-scale randomized experiment as a benchmark," Journal of Economic Behavior & Organization, Elsevier, vol. 107(PA), pages 344-365.
  • Handle: RePEc:eee:jeborg:v:107:y:2014:i:pa:p:344-365
    DOI: 10.1016/j.jebo.2014.03.008

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S016726811400078X
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.jebo.2014.03.008?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can access this item through your library subscription

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Richard K. Crump & V. Joseph Hotz & Guido W. Imbens & Oscar A. Mitnik, 2009. "Dealing with limited overlap in estimation of average treatment effects," Biometrika, Biometrika Trust, vol. 96(1), pages 187-199.
    2. Card, David & Krueger, Alan B, 1994. "Minimum Wages and Employment: A Case Study of the Fast-Food Industry in New Jersey and Pennsylvania," American Economic Review, American Economic Association, vol. 84(4), pages 772-793, September.
    3. Huber, Martin & Lechner, Michael & Wunsch, Conny, 2010. "How to Control for Many Covariates? Reliable Estimators Based on the Propensity Score," IZA Discussion Papers 5268, Institute of Labor Economics (IZA).
    4. Richard Blundell & Monica Costa Dias, 2009. "Alternative Approaches to Evaluation in Empirical Microeconomics," Journal of Human Resources, University of Wisconsin Press, vol. 44(3).
    5. Richard Emsley & Mark Lunt & Andrew Pickles & Graham Dunn, 2008. "Implementing double-robust estimators of causal effects," Stata Journal, StataCorp LP, vol. 8(3), pages 334-353, September.
    6. Shadish, William R. & Clark, M. H. & Steiner, Peter M., 2008. "Can Nonrandomized Experiments Yield Accurate Answers? A Randomized Experiment Comparing Random and Nonrandom Assignments," Journal of the American Statistical Association, American Statistical Association, vol. 103(484), pages 1334-1344.
    7. Ian McCarthy & Daniel Millimet & Rusty Tchernis, 2014. "The bmte command: Methods for the estimation of treatment effects when exclusion restrictions are unavailable," Stata Journal, StataCorp LP, vol. 14(3), pages 670-683, September.
    8. Keisuke Hirano & Guido W. Imbens & Geert Ridder, 2003. "Efficient Estimation of Average Treatment Effects Using the Estimated Propensity Score," Econometrica, Econometric Society, vol. 71(4), pages 1161-1189, July.
    9. Jinyong Hahn, 1998. "On the Role of the Propensity Score in Efficient Semiparametric Estimation of Average Treatment Effects," Econometrica, Econometric Society, vol. 66(2), pages 315-332, March.
    10. David McKenzie & John Gibson & Steven Stillman, 2010. "How Important Is Selection? Experimental vs. Non-Experimental Measures of the Income Gains from Migration," Journal of the European Economic Association, MIT Press, vol. 8(4), pages 913-945, June.
    11. Sebastian Galiani & Paul Gertler & Ernesto Schargrodsky, 2005. "Water for Life: The Impact of the Privatization of Water Services on Child Mortality," Journal of Political Economy, University of Chicago Press, vol. 113(1), pages 83-120, February.
    12. A. Smith, Jeffrey & E. Todd, Petra, 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    13. Timothy Besley & Robin Burgess, 2004. "Can Labor Regulation Hinder Economic Performance? Evidence from India," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 119(1), pages 91-134.
    14. Ravallion, Martin, 2008. "Evaluating Anti-Poverty Programs," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 59, pages 3787-3846, Elsevier.
    15. James Heckman & Hidehiko Ichimura & Jeffrey Smith & Petra Todd, 1998. "Characterizing Selection Bias Using Experimental Data," Econometrica, Econometric Society, vol. 66(5), pages 1017-1098, September.
    16. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    17. Elizabeth Ty Wilde & Robinson Hollister, 2007. "How close is close enough? Evaluating propensity score matching using data from a class size reduction experiment," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 26(3), pages 455-477.
    18. Roberto Agodini & Mark Dynarski, "undated". "Are Experiments the Only Option? A Look at Dropout Prevention Programs," Mathematica Policy Research Reports 51241adbf9fa4a26add6d54c5, Mathematica Policy Research.
    19. David H. Greenberg & Charles Michalopoulos & Philip K. Robins, 2006. "Do experimental and nonexperimental evaluations give different answers about the effectiveness of government-funded training programs?," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 25(3), pages 523-552.
    20. Ho, Daniel E. & Imai, Kosuke & King, Gary & Stuart, Elizabeth A., 2007. "Matching as Nonparametric Preprocessing for Reducing Model Dependence in Parametric Causal Inference," Political Analysis, Cambridge University Press, vol. 15(3), pages 199-236, July.
    21. Juan Jose Diaz & Sudhanshu Handa, 2006. "An Assessment of Propensity Score Matching as a Nonexperimental Impact Estimator: Evidence from Mexico’s PROGRESA Program," Journal of Human Resources, University of Wisconsin Press, vol. 41(2).
    22. Thomas Fraker & Rebecca Maynard, 1987. "The Adequacy of Comparison Group Designs for Evaluations of Employment-Related Programs," Journal of Human Resources, University of Wisconsin Press, vol. 22(2), pages 194-227.
    23. LaLonde, Robert J, 1986. "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," American Economic Review, American Economic Association, vol. 76(4), pages 604-620, September.
    24. Ferraro, Paul J. & Miranda, Juan José, 2013. "Heterogeneous treatment effects and mechanisms in information-based environmental policies: Evidence from a large-scale field experiment," Resource and Energy Economics, Elsevier, vol. 35(3), pages 356-379.
    25. James J. Heckman & Hidehiko Ichimura & Petra E. Todd, 1997. "Matching As An Econometric Evaluation Estimator: Evidence from Evaluating a Job Training Programme," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 64(4), pages 605-654.
    26. Angrist, Joshua D. & Krueger, Alan B., 1999. "Empirical strategies in labor economics," Handbook of Labor Economics, in: O. Ashenfelter & D. Card (ed.), Handbook of Labor Economics, edition 1, volume 3, chapter 23, pages 1277-1366, Elsevier.
    27. Rajeev H. Dehejia & Sadek Wahba, 2002. "Propensity Score-Matching Methods For Nonexperimental Causal Studies," The Review of Economics and Statistics, MIT Press, vol. 84(1), pages 151-161, February.
    28. Sudhanshu Handa & John A. Maluccio, 2010. "Matching the Gold Standard: Comparing Experimental and Nonexperimental Evaluation Techniques for a Geographically Targeted Program," Economic Development and Cultural Change, University of Chicago Press, vol. 58(3), pages 415-447, April.
    29. Roberto Agodini & Mark Dynarski, 2004. "Are Experiments the Only Option? A Look at Dropout Prevention Programs," The Review of Economics and Statistics, MIT Press, vol. 86(1), pages 180-194, February.
    30. Paul J. Ferraro & Michael K. Price, 2013. "Using Nonpecuniary Strategies to Influence Behavior: Evidence from a Large-Scale Field Experiment," The Review of Economics and Statistics, MIT Press, vol. 95(1), pages 64-73, March.
    31. Arceneaux, Kevin & Gerber, Alan S. & Green, Donald P., 2006. "Comparing Experimental and Matching Methods Using a Large-Scale Voter Mobilization Experiment," Political Analysis, Cambridge University Press, vol. 14(1), pages 37-62, January.
    32. Sekhon, Jasjeet S., 2011. "Multivariate and Propensity Score Matching Software with Automated Balance Optimization: The Matching package for R," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 42(i07).
    33. Greenstone, Michael & Gayer, Ted, 2009. "Quasi-experimental and experimental approaches to environmental economics," Journal of Environmental Economics and Management, Elsevier, vol. 57(1), pages 21-44, January.
    34. James J. Heckman & Hidehiko Ichimura & Petra Todd, 1998. "Matching As An Econometric Evaluation Estimator," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 65(2), pages 261-294.
    35. Paul J. Ferraro & Juan Jose Miranda & Michael K. Price, 2011. "The Persistence of Treatment Effects with Norm-Based Policy Instruments: Evidence from a Randomized Environmental Policy Experiment," American Economic Review, American Economic Association, vol. 101(3), pages 318-322, May.
    36. Alberto Abadie & Guido W. Imbens, 2006. "Large Sample Properties of Matching Estimators for Average Treatment Effects," Econometrica, Econometric Society, vol. 74(1), pages 235-267, January.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Brelsford, Christa & Abbott, Joshua K., 2021. "How smart are ‘Water Smart Landscapes’?," Journal of Environmental Economics and Management, Elsevier, vol. 106(C).
    2. Meyer, Maximilian & Klingelhoeffer, Ekkehard & Naidoo, Robin & Wingate, Vladimir & Börner, Jan, 2021. "Tourism opportunities drive woodland and wildlife conservation outcomes of community-based conservation in Namibia's Zambezi region," Ecological Economics, Elsevier, vol. 180(C).
    3. Meyer, Maximilian & Hulke, Carolin & Kamwi, Jonathan & Kolem, Hannah & Börner, Jan, 2022. "Spatially heterogeneous effects of collective action on environmental dependence in Namibia’s Zambezi region," World Development, Elsevier, vol. 159(C).
    4. Michael D. Smith & Dennis Wesselbaum, 2023. "Financial inclusion and international migration in low- and middle-income countries," Empirical Economics, Springer, vol. 65(1), pages 341-370, July.
    5. Dede Long & David Lewis & Christian Langpap, 2021. "Negative Traffic Externalities and Infant Health: The Role of Income Heterogeneity and Residential Sorting," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 80(3), pages 637-674, November.
    6. Ito, Junichi & Feuer, Hart N. & Kitano, Shinichi & Asahi, Haruka, 2019. "Assessing the effectiveness of Japan's community-based direct payment scheme for hilly and mountainous areas," Ecological Economics, Elsevier, vol. 160(C), pages 62-75.
    7. Baggio, Michele & Towe, Charles, 2016. "Evaluating the Effects of Stream Restorations," 2016 Annual Meeting, July 31-August 2, Boston, Massachusetts 235679, Agricultural and Applied Economics Association.
    8. Brett R. Gordon & Florian Zettelmeyer & Neha Bhargava & Dan Chapsky, 2019. "A Comparison of Approaches to Advertising Measurement: Evidence from Big Field Experiments at Facebook," Marketing Science, INFORMS, vol. 38(2), pages 193-225, March.
    9. Meyer, Maximilian & Hulke, Carolin & Kamwi, Jonathan & Kolem, Hannah & Börner, Jan, 2021. "Spatially heterogeneous effects of collective action on environmental dependence in the Kavango-Zambezi Transfrontier Conservation Area," 2021 Conference, August 17-31, 2021, Virtual 315018, International Association of Agricultural Economists.
    10. Alan de Brauw & Valerie Mueller & Tassew Woldehanna, 2018. "Does Internal Migration Improve Overall Well-Being in Ethiopia?," Journal of African Economies, Centre for the Study of African Economies (CSAE), vol. 27(3), pages 367-367.
    11. Brittany Tarufelli & Ben Gilbert, 2019. "Leakage in Regional Climate Policy? Implications of Electricity Market Design," Working Papers 2019-07, Colorado School of Mines, Division of Economics and Business, revised Dec 2021.
    12. Carlianne Patrick & Amanda Ross & Heather Stephens, 2016. "Designing Policies to Spur Economic Growth: How Regional Scientists Can Contribute to Future Policy Development and Evaluation," Working Papers 16-04, Department of Economics, West Virginia University.
    13. Baggio, Michele & Towe, Charles, 2015. "Evaluating the Effects of River and Stream Restorations," 2015 AAEA & WAEA Joint Annual Meeting, July 26-28, San Francisco, California 205561, Agricultural and Applied Economics Association.
    14. Daniel P. Bigelow & Todd Kuethe, 2020. "A Tale of Two Borders: Use‐Value Assessment, Land Development, and Irrigation Investment," American Journal of Agricultural Economics, John Wiley & Sons, vol. 102(5), pages 1404-1424, October.
    15. Bhattacharjee, Arnab & Aravena, Claudia & Castillo, Natalia & Ehrlich, Marco & Taou, Nadia & Wagner, Thomas, 2022. "Agroforestry Programs in the Colombian Amazon: Selection, Treatment and Exposure Effects on Deforestation," National Institute of Economic and Social Research (NIESR) Discussion Papers 537, National Institute of Economic and Social Research.
    16. Christa Brelsford & Joshua K. Abbott, 2018. "How Smart Are `Water Smart Landscapes'?," Papers 1803.04593, arXiv.org.
    17. Vivian C. Wong & Peter M. Steiner, 2018. "Designs of Empirical Evaluations of Nonexperimental Methods in Field Settings," Evaluation Review, , vol. 42(2), pages 176-213, April.
    18. Smith, Michael D. & Floro, Maria S., 2020. "Food insecurity, gender, and international migration in low- and middle-income countries," Food Policy, Elsevier, vol. 91(C).
    19. Vivian C. Wong & Peter M. Steiner & Kylie L. Anglin, 2018. "What Can Be Learned From Empirical Evaluations of Nonexperimental Methods?," Evaluation Review, , vol. 42(2), pages 147-175, April.
    20. Blackman, Allen, 2015. "Strict versus mixed-use protected areas: Guatemala's Maya Biosphere Reserve," Ecological Economics, Elsevier, vol. 112(C), pages 14-24.
    21. Melstrom, Richard T. & Lee, Kangil & Byl, Jacob P., 2018. "Do Regulations to Protect Endangered Species on Private Lands Affect Local Employment? Evidence from the Listing of the Lesser Prairie Chicken," Journal of Agricultural and Resource Economics, Western Agricultural Economics Association, vol. 43(3), September.
    22. Carlianne Patrick, 2016. "Identifying The Local Economic Development Effects Of Million Dollar Facilities," Economic Inquiry, Western Economic Association International, vol. 54(4), pages 1737-1762, October.
    23. Patrick, Carlianne & Mothorpe, Christopher, 2017. "Demand for new cities: Property value capitalization of municipal incorporation," Regional Science and Urban Economics, Elsevier, vol. 67(C), pages 78-89.
    24. Steven M. Smith, 2019. "The Relative Economic Merits of Alternative Water Rights," Working Papers 2019-08, Colorado School of Mines, Division of Economics and Business.
    25. Wichman, Casey J. & Ferraro, Paul J., 2017. "A cautionary tale on using panel data estimators to measure program impacts," Economics Letters, Elsevier, vol. 151(C), pages 82-90.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    2. Flores, Carlos A. & Mitnik, Oscar A., 2009. "Evaluating Nonexperimental Estimators for Multiple Treatments: Evidence from Experimental Data," IZA Discussion Papers 4451, Institute of Labor Economics (IZA).
    3. Lechner, Michael & Wunsch, Conny, 2013. "Sensitivity of matching-based program evaluations to the availability of control variables," Labour Economics, Elsevier, vol. 21(C), pages 111-121.
    4. Jones A.M & Rice N, 2009. "Econometric Evaluation of Health Policies," Health, Econometrics and Data Group (HEDG) Working Papers 09/09, HEDG, c/o Department of Economics, University of York.
    5. Huber, Martin & Lechner, Michael & Wunsch, Conny, 2013. "The performance of estimators based on the propensity score," Journal of Econometrics, Elsevier, vol. 175(1), pages 1-21.
    6. Justine Burns & Malcolm Keswell & Rebecca Thornton, 2009. "Evaluating the Impact of Health Programmes," SALDRU Working Papers 40, Southern Africa Labour and Development Research Unit, University of Cape Town.
    7. Vivian C. Wong & Peter M. Steiner & Kylie L. Anglin, 2018. "What Can Be Learned From Empirical Evaluations of Nonexperimental Methods?," Evaluation Review, , vol. 42(2), pages 147-175, April.
    8. Huber, Martin & Lechner, Michael & Wunsch, Conny, 2010. "How to Control for Many Covariates? Reliable Estimators Based on the Propensity Score," IZA Discussion Papers 5268, Institute of Labor Economics (IZA).
    9. Dettmann, E. & Becker, C. & Schmeißer, C., 2011. "Distance functions for matching in small samples," Computational Statistics & Data Analysis, Elsevier, vol. 55(5), pages 1942-1960, May.
    10. Dehejia Rajeev, 2015. "Experimental and Non-Experimental Methods in Development Economics: A Porous Dialectic," Journal of Globalization and Development, De Gruyter, vol. 6(1), pages 47-69, June.
    11. Advani, Arun & Sloczynski, Tymon, 2013. "Mostly Harmless Simulations? On the Internal Validity of Empirical Monte Carlo Studies," IZA Discussion Papers 7874, Institute of Labor Economics (IZA).
    12. Steven Lehrer & Gregory Kordas, 2013. "Matching using semiparametric propensity scores," Empirical Economics, Springer, vol. 44(1), pages 13-45, February.
    13. Henrik Hansen & Ninja Ritter Klejnstrup & Ole Winckler Andersen, 2011. "A Comparison of Model-based and Design-based Impact Evaluations of Interventions in Developing Countries," IFRO Working Paper 2011/16, University of Copenhagen, Department of Food and Resource Economics.
    14. Sant’Anna, Pedro H.C. & Song, Xiaojun, 2019. "Specification tests for the propensity score," Journal of Econometrics, Elsevier, vol. 210(2), pages 379-404.
    15. Rajeev Dehejia, 2013. "The Porous Dialectic: Experimental and Non-Experimental Methods in Development Economics," WIDER Working Paper Series wp-2013-011, World Institute for Development Economic Research (UNU-WIDER).
    16. Richard K. Crump & V. Joseph Hotz & Guido W. Imbens & Oscar A. Mitnik, 2006. "Moving the Goalposts: Addressing Limited Overlap in the Estimation of Average Treatment Effects by Changing the Estimand," NBER Technical Working Papers 0330, National Bureau of Economic Research, Inc.
    17. Richard K. Crump & V. Joseph Hotz & Guido W. Imbens & Oscar A. Mitnik, 2009. "Dealing with limited overlap in estimation of average treatment effects," Biometrika, Biometrika Trust, vol. 96(1), pages 187-199.
    18. Dettmann, Eva & Becker, Claudia & Schmeißer, Christian, 2010. "Is there a Superior Distance Function for Matching in Small Samples?," IWH Discussion Papers 3/2010, Halle Institute for Economic Research (IWH).
    19. Marco Caliendo & Sabine Kopeinig, 2008. "Some Practical Guidance For The Implementation Of Propensity Score Matching," Journal of Economic Surveys, Wiley Blackwell, vol. 22(1), pages 31-72, February.
    20. Peter R. Mueser & Kenneth R. Troske & Alexey Gorislavsky, 2007. "Using State Administrative Data to Measure Program Performance," The Review of Economics and Statistics, MIT Press, vol. 89(4), pages 761-783, November.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:jeborg:v:107:y:2014:i:pa:p:344-365. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/jebo.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.