Printed from https://ideas.repec.org/a/sae/evarev/v36y2012i6p449-474.html

Bounding the Effects of Social Experiments

Author

  • Jeffrey Grogger

Abstract

Background: Social experiments frequently exploit data from administrative records. However, most administrative data systems are designed to track earnings or benefit payments among residents within a single state. When an experimental participant moves across state lines, his entries in the data system of his state of origin consist entirely of zeros. Such attrition may bias the estimated effect of the experiment.

Objective: To estimate the attrition arising from interstate mobility and provide bounds on the effect of the experiment.

Method: Attrition is estimated from runs of zeros at the end of the sample period. Bounds are constructed from these estimates. These estimates can be refined by imposing a stationarity assumption.

Results: The width of the estimated bounds depends importantly on the nature of the data being analyzed. Negatively correlated outcomes provide tighter bounds than positively correlated outcomes.

Conclusion: Attrition can introduce considerable ambiguity into the estimated effects of experimental programs. To reduce ambiguity, one should collect as much data as possible. Even data on outcomes of no direct interest to the objectives of the experiment may be valuable for reducing the ambiguity that arises due to attrition.
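The idea in the abstract can be illustrated with a minimal sketch. This is not the paper's actual estimator (which refines the bounds with a stationarity assumption); it only shows the two mechanical steps: flagging likely attriters from trailing runs of zeros in a state earnings panel, and forming Manski-style worst-case bounds on the mean impact by imputing attriters' unobserved out-of-state outcomes at the extremes of the outcome range. All function names, the `min_run` cutoff, and the outcome range are hypothetical choices for illustration.

```python
def trailing_zero_run(earnings):
    """Length of the run of zeros at the end of one participant's record."""
    run = 0
    for y in reversed(earnings):
        if y != 0:
            break
        run += 1
    return run


def flag_likely_attriters(panel, min_run):
    """Treat a record as a likely attriter (out-of-state mover) if its
    trailing zero run lasts at least min_run periods."""
    return [trailing_zero_run(record) >= min_run for record in panel]


def worst_case_bounds(y_treat, y_ctrl, attrit_treat, attrit_ctrl, y_min, y_max):
    """Worst-case (Manski-style) bounds on the mean treatment effect:
    attriters' unobserved outcomes are imputed at y_min or y_max,
    whichever moves the estimated effect in the adverse direction."""
    def imputed_mean(outcomes, flags, fill):
        vals = [fill if a else y for y, a in zip(outcomes, flags)]
        return sum(vals) / len(vals)

    lower = (imputed_mean(y_treat, attrit_treat, y_min)
             - imputed_mean(y_ctrl, attrit_ctrl, y_max))
    upper = (imputed_mean(y_treat, attrit_treat, y_max)
             - imputed_mean(y_ctrl, attrit_ctrl, y_min))
    return lower, upper


# Illustrative use: two treated and two control records of quarterly earnings.
panel_treat = [[5, 6, 7, 8, 10], [4, 3, 0, 0, 0]]   # second record goes dark
panel_ctrl = [[4, 5, 6, 7, 8], [3, 2, 0, 0, 0]]
attrit_t = flag_likely_attriters(panel_treat, min_run=3)
attrit_c = flag_likely_attriters(panel_ctrl, min_run=3)
y_t = [rec[-1] for rec in panel_treat]               # final-period outcomes
y_c = [rec[-1] for rec in panel_ctrl]
lo, hi = worst_case_bounds(y_t, y_c, attrit_t, attrit_c, y_min=0, y_max=20)
```

The width of the interval `[lo, hi]` grows with the attrition rate and with the assumed outcome range, which is the sense in which attrition introduces ambiguity rather than a point estimate.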

Suggested Citation

  • Jeffrey Grogger, 2012. "Bounding the Effects of Social Experiments," Evaluation Review, vol. 36(6), pages 449-474, December.
  • Handle: RePEc:sae:evarev:v:36:y:2012:i:6:p:449-474
    DOI: 10.1177/0193841X13482125

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/0193841X13482125
    Download Restriction: no

    File URL: https://libkey.io/10.1177/0193841X13482125?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Marianne P. Bitler & Jonah B. Gelbach & Hilary W. Hoynes, 2006. "What Mean Impacts Miss: Distributional Effects of Welfare Reform Experiments," American Economic Review, American Economic Association, vol. 96(4), pages 988-1012, September.
    2. Howard S. Bloom & Larry L. Orr & Stephen H. Bell & George Cave & Fred Doolittle & Winston Lin & Johannes M. Bos, 1997. "The Benefits and Costs of JTPA Title II-A Programs: Key Findings from the National Job Training Partnership Act Study," Journal of Human Resources, University of Wisconsin Press, vol. 32(3), pages 549-576.
    3. Greenberg, David & Moffitt, Robert & Friedmann, John, 1981. "Underreporting and Experimental Effects on Work Effort: Evidence from the Gary Income Maintenance Experiment," The Review of Economics and Statistics, MIT Press, vol. 63(4), pages 581-589, November.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Robert Collinson & John Eric Humphries & Nicholas Mader & Davin Reed & Daniel Tannenbaum & Winnie van Dijk, 2024. "Eviction and Poverty in American Cities," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 139(1), pages 57-120.
    2. Veronica Minaya & Judith Scott-Clayton, 2018. "Labor Market Outcomes and Postsecondary Accountability: Are Imperfect Metrics Better Than None?," NBER Chapters, in: Productivity in Higher Education, pages 67-104, National Bureau of Economic Research, Inc.
    3. Veronica Minaya & Judith Scott-Clayton, 2016. "Labor Market Outcomes and Postsecondary Accountability: Are Imperfect Metrics Better than None?," NBER Working Papers 22880, National Bureau of Economic Research, Inc.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Rothstein, Jesse & Von Wachter, Till, 2016. "Social Experiments in the Labor Market," Department of Economics, Working Paper Series qt7957p9g6, Department of Economics, Institute for Business and Economic Research, UC Berkeley.
    2. Toru Kitagawa & Aleksey Tetenov, 2017. "Equality-minded treatment choice," CeMMAP working papers 10/17, Institute for Fiscal Studies.
    3. Toru Kitagawa & Aleksey Tetenov, 2021. "Equality-Minded Treatment Choice," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 39(2), pages 561-574, March.
    4. David Card & Pablo Ibarrarán & Ferdinando Regalia & David Rosas-Shady & Yuri Soares, 2011. "The Labor Market Impacts of Youth Training in the Dominican Republic," Journal of Labor Economics, University of Chicago Press, vol. 29(2), pages 267-300.
    5. Jeffrey Smith, 2000. "A Critical Survey of Empirical Methods for Evaluating Active Labor Market Policies," Swiss Journal of Economics and Statistics (SJES), Swiss Society of Economics and Statistics (SSES), vol. 136(III), pages 247-268, September.
    6. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    7. Alan B. Krueger, 2002. "Inequality, Too Much of a Good Thing," Working Papers 845, Princeton University, Department of Economics, Industrial Relations Section.
    8. Caroline Danielson & Deborah Reed & Qian Li & Jay Liao, "undated". "Sanctions and Time Limits in California's Welfare Program," Mathematica Policy Research Reports 09550879b2754a38b32e03488, Mathematica Policy Research.
    9. Pedro H. C. Sant'Anna & Xiaojun Song & Qi Xu, 2022. "Covariate distribution balance via propensity scores," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 37(6), pages 1093-1120, September.
    10. Javier Alejo & Antonio F. Galvao & Julian Martinez-Iriarte & Gabriel Montes-Rojas, 2024. "Endogenous Heteroskedasticity in Linear Models," Papers 2412.02767, arXiv.org, revised Jan 2025.
    11. Elizabeth O. Ananat & Guy Michaels, 2008. "The Effect of Marital Breakup on the Income Distribution of Women with Children," Journal of Human Resources, University of Wisconsin Press, vol. 43(3), pages 611-629.
    12. Okeke, Edward N. & Adepiti, Clement A. & Ajenifuja, Kayode O., 2013. "What is the price of prevention? New evidence from a field experiment," Journal of Health Economics, Elsevier, vol. 32(1), pages 207-218.
    13. Steven F. Lehrer & R. Vincent Pohl & Kyungchul Song, 2016. "Targeting Policies: Multiple Testing and Distributional Treatment Effects," NBER Working Papers 22950, National Bureau of Economic Research, Inc.
    14. Eric Mbakop & Max Tabord‐Meehan, 2021. "Model Selection for Treatment Choice: Penalized Welfare Maximization," Econometrica, Econometric Society, vol. 89(2), pages 825-848, March.
    15. Tymon Słoczyński, 2022. "Interpreting OLS Estimands When Treatment Effects Are Heterogeneous: Smaller Groups Get Larger Weights," The Review of Economics and Statistics, MIT Press, vol. 104(3), pages 501-509, May.
    16. Richard K. Crump & V. Joseph Hotz & Guido W. Imbens & Oscar A. Mitnik, 2008. "Nonparametric Tests for Treatment Effect Heterogeneity," The Review of Economics and Statistics, MIT Press, vol. 90(3), pages 389-405, August.
    17. Sakos, Grayson & Cerulli, Giovanni & Garbero, Alessandra, 2021. "Beyond the ATE: Idiosyncratic Effect Estimation to Uncover Distributional Impacts Results from 17 Impact Evaluations," 2021 Annual Meeting, August 1-3, Austin, Texas 314017, Agricultural and Applied Economics Association.
    18. Alexander Gelber & Adam Isen & Judd B. Kessler, 2014. "The Effects of Youth Employment: Evidence from New York City Summer Youth Employment Program Lotteries," NBER Working Papers 20810, National Bureau of Economic Research, Inc.
    19. Undral Byambadalai & Tatsushi Oka & Shota Yasui, 2024. "Estimating Distributional Treatment Effects in Randomized Experiments: Machine Learning for Variance Reduction," Papers 2407.16037, arXiv.org.
    20. Ronald D'Amico & Peter Z. Schochet, "undated". "The Evaluation of the Trade Adjustment Assistance Program: A Synthesis of Major Findings," Mathematica Policy Research Reports c6b34445ad854f5d8178f580f, Mathematica Policy Research.

    More about this item

    Keywords

    income support; methodological development


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:sae:evarev:v:36:y:2012:i:6:p:449-474. See general information about how to correct material in RePEc.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.