
Effects of Peer Counseling to Support Breastfeeding: Assessing the External Validity of a Randomized Field Experiment

Authors

  • Onur Altindag
  • Theodore J. Joyce
  • Julie A. Reeder

Abstract

In an effort to improve breastfeeding, the Oregon WIC Program tested whether a relatively low-cost telephone peer counseling initiative could increase the initiation and duration of exclusive breastfeeding among its participants. The program conducted a large randomized field experiment (RFE) with over 1,900 women from four WIC agencies in the state. In this study we use data from the RFE, along with administrative data from the rest of the state, to assess whether the results of the RFE can be extended to other agencies. We find small or non-existent effects of peer counseling in the non-experimental settings, which suggests that the experimental estimates may reflect Hawthorne effects. We also present evidence of selection into the RFE: exclusive breastfeeding among the controls is significantly greater than among women who were offered but declined to participate in the RFE, as well as among women in the rest of the state who had no access to peer counseling. We conclude that, despite the strong internal validity of the RFE, extending the program to other agencies in the state would have at best a limited impact on exclusive breastfeeding.

Suggested Citation

  • Onur Altindag & Theodore J. Joyce & Julie A. Reeder, 2015. "Effects of Peer Counseling to Support Breastfeeding: Assessing the External Validity of a Randomized Field Experiment," NBER Working Papers 21013, National Bureau of Economic Research, Inc.
  • Handle: RePEc:nbr:nberwo:21013

    Download full text from publisher

    File URL: http://www.nber.org/papers/w21013.pdf
    Download Restriction: no

    References listed on IDEAS

    1. Angus Deaton, 2010. "Instruments, Randomization, and Learning about Development," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 424-455, June.
    2. Richard K. Crump & V. Joseph Hotz & Guido W. Imbens & Oscar A. Mitnik, 2008. "Nonparametric Tests for Treatment Effect Heterogeneity," The Review of Economics and Statistics, MIT Press, vol. 90(3), pages 389-405, August.
    3. James J. Heckman & Jeffrey A. Smith, 1995. "Assessing the Case for Social Experiments," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 85-110, Spring.
    4. Michael Woolcock, 2013. "Using Case Studies to Explore the External Validity of ‘Complex’ Development Interventions," CID Working Papers 270, Center for International Development at Harvard University.
    5. Alberto Abadie & Matthew M. Chingos & Martin R. West, 2018. "Endogenous Stratification in Randomized Experiments," The Review of Economics and Statistics, MIT Press, vol. 100(4), pages 567-580, October.
    6. Rodrik, Dani, 2008. "The New Development Economics: We Shall Experiment, but How Shall We Learn?," Working Paper Series rwp08-055, Harvard University, John F. Kennedy School of Government.
    7. Woolcock, Michael, 2013. "Using Case Studies to Explore the External Validity of 'Complex' Development Interventions," Working Paper Series rwp13-048, Harvard University, John F. Kennedy School of Government.
    8. Jeffrey A. Smith & Petra E. Todd, 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    9. Guido W. Imbens, 2010. "Better LATE Than Nothing: Some Comments on Deaton (2009) and Heckman and Urzua (2009)," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 399-423, June.
    10. Abhijit V. Banerjee & Esther Duflo, 2009. "The Experimental Approach to Development Economics," Annual Review of Economics, Annual Reviews, vol. 1(1), pages 151-178, May.
    11. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    12. LaLonde, Robert J, 1986. "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," American Economic Review, American Economic Association, vol. 76(4), pages 604-620, September.
    13. Oliveira, Victor & Frazao, Elizabeth & Smallwood, David M., 2010. "Rising Infant Formula Costs to the WIC Program: Recent Trends in Rebates and Wholesale Prices," Economic Research Report 59384, United States Department of Agriculture, Economic Research Service.
    14. Martin Ravallion, 2012. "Fighting Poverty One Experiment at a Time: Poor Economics: A Radical Rethinking of the Way to Fight Global Poverty : Review Essay," Journal of Economic Literature, American Economic Association, vol. 50(1), pages 103-114, March.
    15. Lant Pritchett & Justin Sandefur, 2014. "Context Matters for Size: Why External Validity Claims and Development Practice do not Mix," Journal of Globalization and Development, De Gruyter, vol. 4(2), pages 161-197, March.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Tarek Azzam & Michael Bates & David Fairris, 2019. "Do Learning Communities Increase First Year College Retention? Testing Sample Selection and External Validity of Randomized Control Trials," Working Papers 202002, University of California at Riverside, Department of Economics.
    2. Azzam, Tarek & Bates, Michael D. & Fairris, David, 2022. "Do learning communities increase first year college retention? Evidence from a randomized control trial," Economics of Education Review, Elsevier, vol. 89(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Jeffrey Smith & Arthur Sweetman, 2016. "Viewpoint: Estimating the causal effects of policies and programs," Canadian Journal of Economics, Canadian Economics Association, vol. 49(3), pages 871-905, August.
    2. Guido W. Imbens, 2010. "Better LATE Than Nothing: Some Comments on Deaton (2009) and Heckman and Urzua (2009)," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 399-423, June.
    3. Sarah Tahamont & Zubin Jelveh & Aaron Chalfin & Shi Yan & Benjamin Hansen, 2019. "Administrative Data Linking and Statistical Power Problems in Randomized Experiments," NBER Working Papers 25657, National Bureau of Economic Research, Inc.
    4. Yonatan Eyal, 2020. "Self-Assessment Variables as a Source of Information in the Evaluation of Intervention Programs: A Theoretical and Methodological Framework," SAGE Open, vol. 10(1), January.
    5. Ashis Das & Jed Friedman & Eeshani Kandpal, 2018. "Does involvement of local NGOs enhance public service delivery? Cautionary evidence from a malaria‐prevention program in India," Health Economics, John Wiley & Sons, Ltd., vol. 27(1), pages 172-188, January.
    6. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," The World Bank Research Observer, World Bank, vol. 33(1), pages 34-64.
    7. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    8. Denis Fougère & Nicolas Jacquemet, 2020. "Policy Evaluation Using Causal Inference Methods," SciencePo Working papers Main hal-03455978, HAL.
    9. Gani Aldashev & Georg Kirchsteiger & Alexander Sebald, 2017. "Assignment Procedure Biases in Randomised Policy Experiments," Economic Journal, Royal Economic Society, vol. 127(602), pages 873-895, June.
    10. Judith Favereau & Nicolas Brisset, 2016. "Randomization of What? Moving from Libertarian to "Democratic Paternalism"," GREDEG Working Papers 2016-34, Groupe de REcherche en Droit, Economie, Gestion (GREDEG CNRS), Université Côte d'Azur, France.
    11. Susan Athey & Guido W. Imbens, 2017. "The State of Applied Econometrics: Causality and Policy Evaluation," Journal of Economic Perspectives, American Economic Association, vol. 31(2), pages 3-32, Spring.
    12. Lechner, Michael & Wunsch, Conny, 2013. "Sensitivity of matching-based program evaluations to the availability of control variables," Labour Economics, Elsevier, vol. 21(C), pages 111-121.
    13. W. Bentley MacLeod, 2017. "Viewpoint: The human capital approach to inference," Canadian Journal of Economics/Revue canadienne d'économique, John Wiley & Sons, vol. 50(1), pages 5-39, February.
    14. Jones A.M & Rice N, 2009. "Econometric Evaluation of Health Policies," Health, Econometrics and Data Group (HEDG) Working Papers 09/09, HEDG, c/o Department of Economics, University of York.
    15. Sokbae Lee & Yoon-Jae Whang, 2009. "Nonparametric Tests of Conditional Treatment Effects," Cowles Foundation Discussion Papers 1740, Cowles Foundation for Research in Economics, Yale University.
    16. Franziska Kugler & Guido Schwerdt & Ludger Wößmann, 2014. "Ökonometrische Methoden zur Evaluierung kausaler Effekte der Wirtschaftspolitik," Perspektiven der Wirtschaftspolitik, De Gruyter, vol. 15(2), pages 105-132, June.
    17. Florent Bédécarrats & Isabelle Guérin & François Roubaud, 2019. "All that Glitters is not Gold. The Political Economy of Randomized Evaluations in Development," Development and Change, International Institute of Social Studies, vol. 50(3), pages 735-762, May.
    18. Mattoo, Aaditya & Cadot, Olivier & Gourdon, Julien & Fernandes, Ana Margarida, 2011. "Impact Evaluation of Trade Interventions: Paving the Way," CEPR Discussion Papers 8638, C.E.P.R. Discussion Papers.
    19. Judith Favereau & Nicolas Brisset, 2016. "Randomization of What? Moving from Libertarian to "Democratic Paternalism"," Working Papers hal-02092638, HAL.
    20. Ralitza Dimova, 2019. "A Debate that Fatigues…: To Randomise or Not to Randomise; What’s the Real Question?," The European Journal of Development Research, Palgrave Macmillan;European Association of Development Research and Training Institutes (EADI), vol. 31(2), pages 163-168, April.

    More about this item

    JEL classification:

    • I12 - Health, Education, and Welfare - - Health - - - Health Behavior
    • I18 - Health, Education, and Welfare - - Health - - - Government Policy; Regulation; Public Health




    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.