
Learning, Private Information, and the Economic Evaluation of Randomized Experiments

Author

Listed:
  • Tat Y. Chan
  • Barton H. Hamilton

Abstract

Many randomized experiments are plagued by attrition, even among subjects receiving more effective treatments. We estimate the subject's utility associated with the receipt of treatment, as revealed by dropout behavior, to evaluate treatment effects. Utility is a function of both "publicly observed" outcomes and side effects privately observed by the subject. We analyze an influential AIDS clinical trial, ACTG 175, and show that for many subjects, AZT yields the highest level of utility, despite having the smallest impact on the publicly observed outcome, because of its mild side effects. Moreover, although subjects enter the experiment uncertain of treatment effectiveness (and often of the treatment received), the learning process implies that early dropout in ACTG 175 is primarily driven by side effects, whereas later attrition reflects declining treatment effectiveness.
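The dropout mechanism described in the abstract can be illustrated with a minimal simulation: a subject starts with a prior belief about treatment effectiveness, updates that belief from noisy publicly observed outcomes, and leaves the trial when expected utility (believed effectiveness net of a privately observed side-effect cost) falls below an outside option. The Python sketch below assumes normal-normal Bayesian learning and additive utility; simulate_subject and every parameter value are illustrative assumptions, not the paper's estimated specification.

    import random

    def simulate_subject(true_effect, side_effect_cost, periods=24,
                         prior_mean=0.0, prior_var=4.0, noise_var=1.0,
                         outside_option=0.0, seed=None):
        """Simulate one subject's stay/drop decisions over a trial.

        Illustrative sketch only: normal-normal learning about treatment
        effectiveness, with a privately observed side-effect cost subtracted
        from the believed treatment effect each period.
        """
        rng = random.Random(seed)
        mean, var = prior_mean, prior_var  # belief about treatment effectiveness
        for t in range(1, periods + 1):
            # Publicly observed outcome: a noisy signal of the true effect.
            signal = true_effect + rng.gauss(0.0, noise_var ** 0.5)
            # Normal-normal Bayesian update of the effectiveness belief.
            precision = 1.0 / var + 1.0 / noise_var
            mean = (mean / var + signal / noise_var) / precision
            var = 1.0 / precision
            # Expected per-period utility: believed effectiveness minus the
            # privately observed side-effect cost (an assumed functional form).
            if mean - side_effect_cost < outside_option:
                return t  # dropout period
        return None  # completed the trial

    # Severe side effects: exit tends to come early, before beliefs converge.
    print(simulate_subject(true_effect=1.0, side_effect_cost=2.0, seed=1))
    # Mild side effects but a weak treatment: exit tends to come later,
    # once learning has revealed low effectiveness.
    print(simulate_subject(true_effect=-0.5, side_effect_cost=0.1, seed=1))

Under these assumed parameters, a subject with severe side effects tends to drop out in the first periods while beliefs are still diffuse, whereas a subject with mild side effects and an ineffective treatment drops out only after the belief has converged downward, mirroring the abstract's distinction between early and late attrition.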

Suggested Citation

  • Tat Y. Chan & Barton H. Hamilton, 2006. "Learning, Private Information, and the Economic Evaluation of Randomized Experiments," Journal of Political Economy, University of Chicago Press, vol. 114(6), pages 997-1040, December.
  • Handle: RePEc:ucp:jpolec:v:114:y:2006:i:6:p:997-1040
    DOI: 10.1086/508239

    Download full text from publisher

    File URL: http://dx.doi.org/10.1086/508239
    File Function: main text
    Download Restriction: Access to the online full text or PDF requires a subscription.

    File URL: https://libkey.io/10.1086/508239?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version you can access through your library subscription.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. James J. Heckman & Jeffrey A. Smith, 1995. "Assessing the Case for Social Experiments," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 85-110, Spring.
    2. James Heckman & Neil Hohmann & Jeffrey Smith & Michael Khoo, 2000. "Substitution and Dropout Bias in Social Experiments: A Study of an Influential Social Experiment," The Quarterly Journal of Economics, Oxford University Press, vol. 115(2), pages 651-694.
    3. Gregory S. Crawford & Matthew Shum, 2005. "Uncertainty and Learning in Pharmaceutical Demand," Econometrica, Econometric Society, vol. 73(4), pages 1137-1173, July.
    4. Anup Malani, 2006. "Identifying Placebo Effects with Data from Clinical Trials," Journal of Political Economy, University of Chicago Press, vol. 114(2), pages 236-256, April.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Jose M. Fernandez, 2013. "An Empirical Model of Learning under Ambiguity: The Case of Clinical Trials," International Economic Review, Department of Economics, University of Pennsylvania and Osaka University Institute of Social and Economic Research Association, vol. 54(2), pages 549-573, May.
    2. Sylvain Chassang & Erik Snowberg & Ben Seymour & Cayley Bowles, 2015. "Accounting for Behavior in Treatment Effects: New Applications for Blind Trials," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-13, June.
    3. Riddell, Chris & Riddell, W. Craig, 2016. "When Can Experimental Evidence Mislead? A Re-Assessment of Canada's Self Sufficiency Project," IZA Discussion Papers 9939, Institute of Labor Economics (IZA).
    4. Smith, Jeffrey A. & Todd, Petra E., 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    5. Peter R. Mueser & Kenneth R. Troske & Alexey Gorislavsky, 2007. "Using State Administrative Data to Measure Program Performance," The Review of Economics and Statistics, MIT Press, vol. 89(4), pages 761-783, November.
    6. Smith, Jeffrey, 2000. "Evaluation aktiver Arbeitsmarktpolitik: Erfahrungen aus Nordamerika (Evaluating Active Labor Market Policies: Lessons from North America)," Mitteilungen aus der Arbeitsmarkt- und Berufsforschung, Institut für Arbeitsmarkt- und Berufsforschung (IAB), Nürnberg [Institute for Employment Research, Nuremberg, Germany], vol. 33(3), pages 345-356.
    7. Alan B. Krueger, 2002. "Inequality, Too Much of a Good Thing," Working Papers 845, Princeton University, Department of Economics, Industrial Relations Section.
    8. Jesse Rothstein & Till von Wachter, 2016. "Social Experiments in the Labor Market," NBER Working Papers 22585, National Bureau of Economic Research, Inc.
    9. Sianesi, Barbara, 2017. "Evidence of randomisation bias in a large-scale social experiment: The case of ERA," Journal of Econometrics, Elsevier, vol. 198(1), pages 41-64.
    10. Gani Aldashev & Georg Kirchsteiger & Alexander Sebald, 2017. "Assignment Procedure Biases in Randomised Policy Experiments," Economic Journal, Royal Economic Society, vol. 127(602), pages 873-895, June.
    11. Barbara Sianesi, 2013. "Dealing with randomisation bias in a social experiment exploiting the randomisation itself: the case of ERA," IFS Working Papers W13/15, Institute for Fiscal Studies.
    12. Raaum, Oddbjorn & Torp, Hege, 2002. "Labour market training in Norway--effect on earnings," Labour Economics, Elsevier, vol. 9(2), pages 207-247, April.
    13. Pablo Ibarrán & David Rosas & Yuri Suarez Dillon Soares, 2006. "Impact Evaluation of a Youth Job Training Program in the Dominican Republic: Ex-Post Evaluation Report of the Labor Training and Modernization Project (DR0134)," OVE Working Papers 0306, Inter-American Development Bank, Office of Evaluation and Oversight (OVE).
    14. Dionissi Aliprantis, 2013. "Covariates and causal effects: the problem of context," Working Papers (Old Series) 1310, Federal Reserve Bank of Cleveland.
    15. Fortin, Bernard, 1997. "Dépendance à l'égard de l'aide sociale et réforme de la sécurité du revenu (Welfare dependence and income security reform)," L'Actualité Economique, Société Canadienne de Science Economique, vol. 73(4), pages 557-573, December.
    16. James J. Heckman, 1991. "Randomization and Social Policy Evaluation Revisited," NBER Technical Working Papers 0107, National Bureau of Economic Research, Inc.

    More about this item

    Lists

    This item is featured on the following reading lists, Wikipedia, or ReplicationWiki pages:
    1. Learning, Private Information, and the Economic Evaluation of Randomized Experiments (JPE 2006) in ReplicationWiki

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:ucp:jpolec:v:114:y:2006:i:6:p:997-1040. See general information about how to correct material in RePEc.


    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Journals Division (email available below). General contact details of provider: https://www.journals.uchicago.edu/JPE .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis . RePEc uses bibliographic data supplied by the respective publishers.