
A “politically robust” experimental design for public policy evaluation, with application to the Mexican Universal Health Insurance program

Author

Listed:
  • Gary King

    (Harvard University)

  • Emmanuela Gakidou

    (Harvard University)

  • Nirmala Ravishankar

    (Harvard University)

  • Ryan T. Moore

    (Harvard University)

  • Jason Lakin

    (Harvard University)

  • Manett Vargas

    (National Commission for Social Protection in Health, Ministry of Health, Mexico)

  • Martha María Téllez-Rojo

    (Instituto Nacional de Salud Pública (National Institute of Public Health), Mexico)

  • Juan Eugenio Hernández Ávila

    (Instituto Nacional de Salud Pública (National Institute of Public Health), Mexico)

  • Mauricio Hernández Ávila

    (Undersecretary for Prevention and Health Promotion, Secretaría de Salud (Ministry of Health), Mexico)

  • Héctor Hernández Llamas

    (Conestadistica)

Abstract

We develop an approach to conducting large-scale randomized public policy experiments intended to be more robust to the political interventions that have ruined some or all parts of many similar previous efforts. Our proposed design is insulated from selection bias in some circumstances even if we lose observations; our inferences can still be unbiased even if politics disrupts any two of the three steps in our analytical procedures; and other empirical checks are available to validate the overall design. We illustrate with a design and empirical validation of an evaluation of the Mexican Seguro Popular de Salud (Universal Health Insurance) program we are conducting. Seguro Popular, which is intended to grow to provide medical care, drugs, preventative services, and financial health protection to the 50 million Mexicans without health insurance, is one of the largest health reforms of any country in the last two decades. The evaluation is also large scale, constituting one of the largest policy experiments to date and what may be the largest randomized health policy experiment ever. © 2007 by the Association for Public Policy Analysis and Management
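The design the abstract describes relies on pairing similar health clusters and randomizing treatment within each pair, so that losing any pair does not bias the remaining comparisons. The following is a minimal sketch of within-pair randomization, not the authors' actual procedure: the function name, the single pairing covariate, and the coin-flip assignment are all illustrative assumptions.

```python
import random

def matched_pair_assignment(clusters, covariate, seed=0):
    """Pair clusters on a background covariate, then randomize one
    cluster of each pair to treatment (hypothetical illustration).

    clusters  -- list of cluster identifiers
    covariate -- dict mapping cluster id to a pairing covariate
    With an odd number of clusters, the last one is left unassigned.
    """
    rng = random.Random(seed)
    # Sort by the covariate so adjacent clusters are most similar;
    # in practice pairing would use many background characteristics.
    ordered = sorted(clusters, key=lambda c: covariate[c])
    assignment = {}
    for i in range(0, len(ordered) - 1, 2):
        a, b = ordered[i], ordered[i + 1]
        # Flip a fair coin within the pair.
        if rng.random() < 0.5:
            assignment[a], assignment[b] = "treatment", "control"
        else:
            assignment[a], assignment[b] = "control", "treatment"
    return assignment
```

Because each pair is a self-contained mini-experiment, a pair that is lost to political interference can simply be dropped without biasing estimates from the pairs that remain, which is the sense in which such a design is robust to losing observations.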

Suggested Citation

  • Gary King & Emmanuela Gakidou & Nirmala Ravishankar & Ryan T. Moore & Jason Lakin & Manett Vargas & Martha María Téllez-Rojo & Juan Eugenio Hernández Ávila & Mauricio Hernández Ávila & Héctor Hernández Llamas, 2007. "A “politically robust” experimental design for public policy evaluation, with application to the Mexican Universal Health Insurance program," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 26(3), pages 479-506.
  • Handle: RePEc:wly:jpamgt:v:26:y:2007:i:3:p:479-506
    DOI: 10.1002/pam.20279

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1002/pam.20279
    File Function: Link to full text; subscription required
    Download Restriction: no


    References listed on IDEAS

    1. Nickerson, David W., 2005. "Scalable Protocols Offer Efficient Design for Field Experiments," Political Analysis, Cambridge University Press, vol. 13(3), pages 233-252, July.
    2. James J. Heckman & Jeffrey A. Smith, 1995. "Assessing the Case for Social Experiments," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 85-110, Spring.
    3. Glenn W. Harrison & John A. List, 2004. "Field Experiments," Journal of Economic Literature, American Economic Association, vol. 42(4), pages 1009-1055, December.
    4. Steven Glazerman & Daniel Mayer & Paul Decker, 2006. "Alternative routes to teaching: The impacts of Teach for America on student achievement and other outcomes," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 25(1), pages 75-96.
    5. Michael J. Camasso & Radha Jagannathan & Carol Harvey & Mark Killingsworth, 2003. "The use of client surveys to gauge the threat of contamination in welfare reform experiments," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 22(2), pages 207-223.
    6. Thomas S. Dee & Benjamin J. Keys, 2004. "Does merit pay reward good teachers? Evidence from a randomized experiment," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 23(3), pages 471-488.
    7. Harry J. Holzer & John M. Quigley & Steven Raphael, 2003. "Public transit and the spatial distribution of minority employment: Evidence from a natural experiment," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 22(3), pages 415-441.
    8. Elizabeth Ty Wilde & Robinson Hollister, 2007. "How close is close enough? Evaluating propensity score matching using data from a class size reduction experiment," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 26(3), pages 455-477.
    9. David H. Greenberg & Charles Michalopoulos & Philip K. Robins, 2006. "Do experimental and nonexperimental evaluations give different answers about the effectiveness of government-funded training programs?," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 25(3), pages 523-552.
    10. Ho, Daniel & Imai, Kosuke & King, Gary & Stuart, Elizabeth A., 2011. "MatchIt: Nonparametric Preprocessing for Parametric Causal Inference," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 42(i08).
    12. King, Gary & Honaker, James & Joseph, Anne & Scheve, Kenneth, 2001. "Analyzing Incomplete Political Science Data: An Alternative Algorithm for Multiple Imputation," American Political Science Review, Cambridge University Press, vol. 95(1), pages 49-69, March.
    13. Barnard J. & Frangakis C.E. & Hill J.L. & Rubin D.B., 2003. "Principal Stratification Approach to Broken Randomized Experiments: A Case Study of School Choice Vouchers in New York City," Journal of the American Statistical Association, American Statistical Association, vol. 98, pages 299-323, January.
    14. William G. Howell, 2004. "Dynamic selection effects in means-tested, urban school voucher programs," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 23(2), pages 225-250.

    Citations



    Cited by:

    1. Clément de Chaisemartin & Jaime Ramirez-Cuellar, 2024. "At What Level Should One Cluster Standard Errors in Paired and Small-Strata Experiments?," American Economic Journal: Applied Economics, American Economic Association, vol. 16(1), pages 193-212, January.
    2. Timothy Powell‐Jackson & Winnie Chi‐Man Yip & Wei Han, 2015. "Realigning Demand and Supply Side Incentives to Improve Primary Health Care Seeking in Rural China," Health Economics, John Wiley & Sons, Ltd., vol. 24(6), pages 755-772, June.
    3. Azuara, Oliver, 2011. "Effect of universal health coverage on marriage, cohabitation and labor force participation," MPRA Paper 35074, University Library of Munich, Germany.
    4. Sun, Xiaojie & Liu, Xiaoyun & Sun, Qiang & Yip, Winnie & Wagstaff, Adam & Meng, Qingyue, 2014. "The impact of a pay-for-performance scheme on prescription quality in rural China : an impact evaluation," Policy Research Working Paper Series 6892, The World Bank.
    5. Rodrigo Barros, 2008. "Wealthier But Not Much Healthier: Effects of a Health Insurance Program for the Poor in Mexico," Discussion Papers 09-002, Stanford Institute for Economic Policy Research.
    6. Sean P. Corcoran & Jennifer L. Jennings & Sarah R. Cohodes & Carolyn Sattin-Bajaj, 2018. "Leveling the Playing Field for High School Choice: Results from a Field Experiment of Informational Interventions," NBER Working Papers 24471, National Bureau of Economic Research, Inc.
    7. Beath, Andrew & Christia, Fotini & Enikolopov, Ruben, 2017. "Direct democracy and resource allocation: Experimental evidence from Afghanistan," Journal of Development Economics, Elsevier, vol. 124(C), pages 199-213.
    8. Yuehao Bai & Meng Hsuan Hsieh & Jizhou Liu & Max Tabord-Meehan, 2022. "Revisiting the Analysis of Matched-Pair and Stratified Experiments in the Presence of Attrition," Papers 2209.11840, arXiv.org, revised Oct 2023.
    9. Pfutze, Tobias, 2014. "The Effects of Mexico’s Seguro Popular Health Insurance on Infant Mortality: An Estimation with Selection on the Outcome Variable," World Development, Elsevier, vol. 59(C), pages 475-486.
    10. Groh, Matthew & McKenzie, David, 2016. "Macroinsurance for microenterprises: A randomized experiment in post-revolution Egypt," Journal of Development Economics, Elsevier, vol. 118(C), pages 13-25.
    11. Azuara, Oliver & Marinescu, Ioana, 2011. "Informality and the expansion of social protection programs," MPRA Paper 35073, University Library of Munich, Germany.
    12. Zhang, Xuanchuan & Chen, Li-Wu & Mueller, Keith & Yu, Qiao & Liu, Jiapeng & Lin, Ge, 2011. "Tracking the effectiveness of health care reform in China: A case study of community health centers in a district of Beijing," Health Policy, Elsevier, vol. 100(2), pages 181-188.
    13. Faraz Usmani & Marc Jeuland & Subhrendu K. Pattanayak, 2018. "NGOs and the effectiveness of interventions," WIDER Working Paper Series wp-2018-59, World Institute for Development Economic Research (UNU-WIDER).
    14. Rebecca L. Thornton & Laurel E. Hatt & Erica M. Field & Mursaleena Islam & Freddy Solís Diaz & Martha Azucena González, 2010. "Social security health insurance for the informal sector in Nicaragua: a randomized evaluation," Health Economics, John Wiley & Sons, Ltd., vol. 19(S1), pages 181-206, September.
    15. Benjamin F. Arnold & Francois Rerolle & Christine Tedijanto & Sammy M. Njenga & Mahbubur Rahman & Ayse Ercumen & Andrew Mertens & Amy J. Pickering & Audrie Lin & Charles D. Arnold & Kishor Das & Chris, 2024. "Geographic pair matching in large-scale cluster randomized trials," Nature Communications, Nature, vol. 15(1), pages 1-15, December.
    16. Wagstaff, Adam & Nguyen, Ha Thi Hong & Dao, Huyen & Bales, Sarah, 2014. "Encouraging health insurance for the informal sector : a cluster randomized trial," Policy Research Working Paper Series 6910, The World Bank.
    17. Sarah Cohodes & Sean P. Corcoran & Jennifer Jennings & Carolyn Sattin-Bajaj, 2022. "When Do Informational Interventions Work? Experimental Evidence from New York City High School Choice," Opportunity and Inclusive Growth Institute Working Papers 057, Federal Reserve Bank of Minneapolis.
    18. Spenkuch, Jörg L., 2011. "Adverse selection and moral hazard among the poor: evidence from a randomized experiment," MPRA Paper 31443, University Library of Munich, Germany.
    19. Omar Galárraga & Sandra Sosa-Rubí & Aarón Salinas-Rodríguez & Sergio Sesma-Vázquez, 2010. "Health insurance for the poor: impact on catastrophic and out-of-pocket health expenditures in Mexico," The European Journal of Health Economics, Springer;Deutsche Gesellschaft für Gesundheitsökonomie (DGGÖ), vol. 11(5), pages 437-447, October.
    20. Faraz Usmani & Marc Jeuland & Subhrendu Pattanayak, 2018. "NGOs and the effectiveness of interventions," WIDER Working Paper Series 59, World Institute for Development Economic Research (UNU-WIDER).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Omar Al-Ubaydli & John A. List, 2019. "How natural field experiments have enhanced our understanding of unemployment," Nature Human Behaviour, Nature, vol. 3(1), pages 33-39, January.
    2. Martin Huber, 2010. "Identification of average treatment effects in social experiments under different forms of attrition," University of St. Gallen Department of Economics working paper series 2010 2010-22, Department of Economics, University of St. Gallen.
    3. Glenn W. Harrison & Morten I. Lau & Hong Il Yoo, 2020. "Risk Attitudes, Sample Selection, and Attrition in a Longitudinal Field Experiment," The Review of Economics and Statistics, MIT Press, vol. 102(3), pages 552-568, July.
    4. Ferraro, Paul J. & Miranda, Juan José, 2014. "The performance of non-experimental designs in the evaluation of environmental programs: A design-replication study using a large-scale randomized experiment as a benchmark," Journal of Economic Behavior & Organization, Elsevier, vol. 107(PA), pages 344-365.
    5. Andersson, Fredrik W. & Holzer, Harry J. & Lane, Julia & Rosenblum, David & Smith, Jeffrey A., 2013. "Does Federally-Funded Job Training Work? Nonexperimental Estimates of WIA Training Impacts Using Longitudinal Data on Workers and Firms," IZA Discussion Papers 7621, Institute of Labor Economics (IZA).
    6. Kate Baldwin & Rikhil R. Bhavnani, 2013. "Ancillary Experiments: Opportunities and Challenges," WIDER Working Paper Series wp-2013-024, World Institute for Development Economic Research (UNU-WIDER).
    7. Harrison, Glenn W. & Lau, Morten I. & Elisabet Rutström, E., 2009. "Risk attitudes, randomization to treatment, and self-selection into experiments," Journal of Economic Behavior & Organization, Elsevier, vol. 70(3), pages 498-507, June.
    8. Andrius Kažukauskas & Thomas Broberg & Jūratė Jaraitė, 2021. "Social Comparisons in Real Time: A Field Experiment of Residential Electricity and Water Use," Scandinavian Journal of Economics, Wiley Blackwell, vol. 123(2), pages 558-592, April.
    9. Robert Moffitt, 2002. "The role of randomized field trials in social science research: a perspective from evaluations of reforms of social welfare programs," CeMMAP working papers 23/02, Institute for Fiscal Studies.
    10. Katz, Lawrence & Duncan, Greg J. & Kling, Jeffrey R. & Kessler, Ronald C. & Ludwig, Jens & Sanbonmatsu, Lisa & Liebman, Jeffrey B., 2008. "What Can We Learn about Neighborhood Effects from the Moving to Opportunity Experiment?," Scholarly Articles 2766959, Harvard University Department of Economics.
    11. Iacus, Stefano & King, Gary & Porro, Giuseppe, 2009. "cem: Software for Coarsened Exact Matching," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 30(i09).
    12. Levitt, Steven D. & List, John A., 2009. "Field experiments in economics: The past, the present, and the future," European Economic Review, Elsevier, vol. 53(1), pages 1-18, January.
    13. Thomas S. Dee & James Wyckoff, 2015. "Incentives, Selection, and Teacher Performance: Evidence from IMPACT," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 34(2), pages 267-297, March.
    14. Tat Y. Chan & Barton H. Hamilton, 2006. "Learning, Private Information, and the Economic Evaluation of Randomized Experiments," Journal of Political Economy, University of Chicago Press, vol. 114(6), pages 997-1040, December.
    15. Simon Quinn & Tom Gole, 2014. "Committees and Status Quo Bias: Structural Evidence from a Randomized Field Experiment," Economics Series Working Papers 733, University of Oxford, Department of Economics.
    16. Rothstein, Jesse & von Wachter, Till, 2016. "Social Experiments in the Labor Market," Institute for Research on Labor and Employment, Working Paper Series qt6605k20b, Institute of Industrial Relations, UC Berkeley.
    17. Katherine Baicker & Theodore Svoronos, 2019. "Testing the Validity of the Single Interrupted Time Series Design," CID Working Papers 364, Center for International Development at Harvard University.
    18. Glenn W. Harrison & John A. List, 2004. "Field Experiments," Journal of Economic Literature, American Economic Association, vol. 42(4), pages 1009-1055, December.
    19. Carolyn J. Heinrich, 2008. "Advancing public sector performance analysis," Applied Stochastic Models in Business and Industry, John Wiley & Sons, vol. 24(5), pages 373-389, September.
    20. Martin Huber, 2012. "Identification of Average Treatment Effects in Social Experiments Under Alternative Forms of Attrition," Journal of Educational and Behavioral Statistics, , vol. 37(3), pages 443-474, June.


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.