
Bounding the Effects of Social Experiments

Author

Listed:
  • Jeffrey Grogger

Abstract

Background: Social experiments frequently exploit data from administrative records. However, most administrative data systems are designed to track earnings or benefit payments among residents within a single state. When an experimental participant moves across state lines, his entries in the data system of his state of origin consist entirely of zeros. Such attrition may bias the estimated effect of the experiment. Objective: To estimate the attrition arising from interstate mobility and provide bounds on the effect of the experiment. Method: Attrition is estimated from runs of zeros at the end of the sample period. Bounds are constructed from these estimates. These estimates can be refined by imposing a stationarity assumption. Results: The width of the estimated bounds depends importantly on the nature of the data being analyzed. Negatively correlated outcomes provide tighter bounds than positively correlated outcomes. Conclusion: Attrition can introduce considerable ambiguity into the estimated effects of experimental programs. To reduce ambiguity, one should collect as much data as possible. Even data on outcomes of no direct interest to the objectives of the experiment may be valuable for reducing the ambiguity that arises due to attrition.
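
The Method sentences above compress a fair amount of machinery. As a rough illustration of the general idea, the Python sketch below treats a sufficiently long run of zeros at the end of a participant's administrative record as a signal of attrition (for example, an out-of-state move) and then forms worst-case bounds on the mean impact by imputing the assumed extremes of the outcome support for the likely attriters. This is only a minimal sketch, not the paper's estimator: the run-length cutoff K, the support [y_lo, y_hi], the function names, and the simulated data are all illustrative assumptions, and the stationarity-based refinement mentioned in the abstract is not implemented.

# Minimal sketch (not the paper's estimator): flag likely attriters from a
# trailing run of zeros, then form worst-case bounds on the mean impact.
# K, y_lo, y_hi, and the simulated data are illustrative assumptions.
import numpy as np

def trailing_zero_run(y):
    """Length of the run of zeros at the end of one participant's record."""
    run = 0
    for v in y[::-1]:
        if v != 0:
            break
        run += 1
    return run

def bound_mean_effect(outcomes, treat, K=4, y_lo=0.0, y_hi=None):
    """Bound the treatment-control difference in final-period means when
    records ending in at least K zeros are treated as possible attriters.

    outcomes : (n, T) array of administrative outcomes (e.g., quarterly earnings)
    treat    : (n,) 0/1 treatment-assignment indicator
    K        : trailing-zero run length taken to signal attrition
    y_lo/y_hi: assumed support of the outcome, used for worst-case imputation
    """
    outcomes = np.asarray(outcomes, dtype=float)
    treat = np.asarray(treat).astype(bool)
    if y_hi is None:
        y_hi = outcomes.max()              # crude support bound taken from the data
    last = outcomes[:, -1]                 # final-period outcome
    attrited = np.array([trailing_zero_run(y) >= K for y in outcomes])

    def group_bounds(mask):
        obs = last[mask & ~attrited]       # outcomes we trust for this arm
        p = attrited[mask].mean()          # estimated attrition share in this arm
        lo = (1 - p) * obs.mean() + p * y_lo
        hi = (1 - p) * obs.mean() + p * y_hi
        return lo, hi

    t_lo, t_hi = group_bounds(treat)
    c_lo, c_hi = group_bounds(~treat)
    # Worst case for the impact: treated lower bound minus control upper bound, and vice versa.
    return t_lo - c_hi, t_hi - c_lo

# Illustration on simulated data: 200 participants, 8 quarters of earnings,
# random assignment, and roughly 10 percent of participants "moving" mid-panel.
rng = np.random.default_rng(0)
y = rng.gamma(2.0, 1000.0, size=(200, 8))
movers = rng.random(200) < 0.1
y[movers, -4:] = 0.0                       # movers' records end in zeros
d = rng.integers(0, 2, size=200)
print(bound_mean_effect(y, d, K=4))

In a construction of this kind the bounds widen with the estimated attrition share and with the assumed outcome support, which is one way to read the abstract's conclusion that collecting more data, even on outcomes of no direct interest, can reduce the ambiguity that attrition introduces.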

Suggested Citation

  • Jeffrey Grogger, 2012. "Bounding the Effects of Social Experiments," Evaluation Review, vol. 36(6), pages 449-474, December.
  • Handle: RePEc:sae:evarev:v:36:y:2012:i:6:p:449-474
    DOI: 10.1177/0193841X13482125

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/0193841X13482125
    Download Restriction: no

    File URL: https://libkey.io/10.1177/0193841X13482125?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Marianne P. Bitler & Jonah B. Gelbach & Hilary W. Hoynes, 2006. "What Mean Impacts Miss: Distributional Effects of Welfare Reform Experiments," American Economic Review, American Economic Association, vol. 96(4), pages 988-1012, September.
    2. Greenberg, David & Moffitt, Robert & Friedmann, John, 1981. "Underreporting and Experimental Effects on Work Effort: Evidence from the Gary Income Maintenance Experiment," The Review of Economics and Statistics, MIT Press, vol. 63(4), pages 581-589, November.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Robert Collinson & John Eric Humphries & Nicholas S. Mader & Davin K. Reed & Daniel I. Tannenbaum & Winnie van Dijk, 2022. "Eviction and Poverty in American Cities," NBER Working Papers 30382, National Bureau of Economic Research, Inc.
    2. Veronica Minaya & Judith Scott-Clayton, 2018. "Labor Market Outcomes and Postsecondary Accountability: Are Imperfect Metrics Better Than None?," NBER Chapters, in: Productivity in Higher Education, pages 67-104, National Bureau of Economic Research, Inc.
    3. Veronica Minaya & Judith Scott-Clayton, 2016. "Labor Market Outcomes and Postsecondary Accountability: Are Imperfect Metrics Better than None?," NBER Working Papers 22880, National Bureau of Economic Research, Inc.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    2. Caroline Danielson & Deborah Reed & Qian Li & Jay Liao, "undated". "Sanctions and Time Limits in California's Welfare Program," Mathematica Policy Research Reports 09550879b2754a38b32e03488, Mathematica Policy Research.
    3. Pedro H. C. Sant'Anna & Xiaojun Song & Qi Xu, 2022. "Covariate distribution balance via propensity scores," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 37(6), pages 1093-1120, September.
    4. Elizabeth O. Ananat & Guy Michaels, 2008. "The Effect of Marital Breakup on the Income Distribution of Women with Children," Journal of Human Resources, University of Wisconsin Press, vol. 43(3), pages 611-629.
    5. Okeke, Edward N. & Adepiti, Clement A. & Ajenifuja, Kayode O., 2013. "What is the price of prevention? New evidence from a field experiment," Journal of Health Economics, Elsevier, vol. 32(1), pages 207-218.
    6. Steven F. Lehrer & R. Vincent Pohl & Kyungchul Song, 2016. "Targeting Policies: Multiple Testing and Distributional Treatment Effects," NBER Working Papers 22950, National Bureau of Economic Research, Inc.
    7. Sloczynski, Tymon, 2020. "Interpreting OLS Estimands When Treatment Effects Are Heterogeneous: Smaller Groups Get Larger Weights," IZA Discussion Papers 13283, Institute of Labor Economics (IZA).
    8. Richard K. Crump & V. Joseph Hotz & Guido W. Imbens & Oscar A. Mitnik, 2008. "Nonparametric Tests for Treatment Effect Heterogeneity," The Review of Economics and Statistics, MIT Press, vol. 90(3), pages 389-405, August.
    9. Sakos, Grayson & Cerulli, Giovanni & Garbero, Alessandra, 2021. "Beyond the ATE: Idiosyncratic Effect Estimation to Uncover Distributional Impacts Results from 17 Impact Evaluations," 2021 Annual Meeting, August 1-3, Austin, Texas 314017, Agricultural and Applied Economics Association.
    10. Sungwon Lee & Joon H. Ro, 2020. "Nonparametric Tests for Conditional Quantile Independence with Duration Outcomes," Working Papers 2013, Nam Duck-Woo Economic Research Institute, Sogang University (Former Research Institute for Market Economy).
    11. Maier, Michael, 2011. "Tests for distributional treatment effects under unconfoundedness," Economics Letters, Elsevier, vol. 110(1), pages 49-51, January.
    12. Michael J. Kottelenberg & Steven F. Lehrer, 2017. "Targeted or Universal Coverage? Assessing Heterogeneity in the Effects of Universal Child Care," Journal of Labor Economics, University of Chicago Press, vol. 35(3), pages 609-653.
    13. Firpo, Sergio & Galvao, Antonio F. & Kobus, Martyna & Parker, Thomas & Rosa-Dias, Pedro, 2020. "Loss Aversion and the Welfare Ranking of Policy Interventions," IZA Discussion Papers 13176, Institute of Labor Economics (IZA).
    14. Sloczynski, Tymon, 2018. "A General Weighted Average Representation of the Ordinary and Two-Stage Least Squares Estimands," IZA Discussion Papers 11866, Institute of Labor Economics (IZA).
    15. David Neumark & Brian Asquith & Brittany Bass, 2020. "Longer‐Run Effects Of Anti‐Poverty Policies On Disadvantaged Neighborhoods," Contemporary Economic Policy, Western Economic Association International, vol. 38(3), pages 409-434, July.
    16. Havnes, Tarjei & Mogstad, Magne, 2015. "Is universal child care leveling the playing field?," Journal of Public Economics, Elsevier, vol. 127(C), pages 100-114.
    17. Abdullah Kumas & Daniel L. Millimet, 2018. "Reassessing the effects of bilateral tax treaties on US FDI activity," Journal of Economics and Finance, Springer; Academy of Economics and Finance, vol. 42(3), pages 451-470, July.
    18. Xavier D’Haultfoeuille & Pauline Givord, 2014. "La régression quantile en pratique," Économie et Statistique, Programme National Persée, vol. 471(1), pages 85-111.
    19. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    20. Nicholas W Papageorge & Kevin Thom, 2020. "Genes, Education, and Labor Market Outcomes: Evidence from the Health and Retirement Study," Journal of the European Economic Association, European Economic Association, vol. 18(3), pages 1351-1399.

    More about this item

    Keywords

    income support; methodological development;

