
Learning, Private Information, and the Economic Evaluation of Randomized Experiments

Author

Listed:
  • Tat Y. Chan
  • Barton H. Hamilton

Abstract

Many randomized experiments are plagued by attrition, even among subjects receiving more effective treatments. We estimate the subject's utility associated with the receipt of treatment, as revealed by dropout behavior, to evaluate treatment effects. Utility is a function of both "publicly observed" outcomes and side effects privately observed by the subject. We analyze an influential AIDS clinical trial, ACTG 175, and show that because of its mild side effects, AZT yields the highest level of utility for many subjects despite having the smallest impact on the publicly observed outcome. Moreover, although subjects enter the experiment uncertain of treatment effectiveness (and often of which treatment they received), the learning process implies that early dropout in ACTG 175 is primarily driven by side effects, whereas later attrition reflects declining treatment effectiveness.
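
To fix ideas, here is a minimal Python sketch of the kind of learning-and-attrition mechanism the abstract describes: each period the subject updates a normal prior on treatment effectiveness from a noisy outcome signal and stays in the trial only while believed effectiveness, net of a privately observed side-effect disutility, beats an outside option. The linear utility form, the normal-normal updating, and all parameter values are illustrative assumptions, not the paper's estimated specification.

    import random

    def simulate_dropout(true_effect, side_effect_cost, prior_mean=0.0,
                         prior_var=1.0, noise_var=0.5, outside_option=-0.2,
                         periods=12, seed=0):
        """Illustrative normal-normal learning model of trial attrition.
        All functional forms and parameters are hypothetical."""
        rng = random.Random(seed)
        mean, var = prior_mean, prior_var
        for t in range(1, periods + 1):
            # Noisy signal of treatment effectiveness (e.g., a change
            # in a publicly observed health marker).
            signal = true_effect + rng.gauss(0.0, noise_var ** 0.5)
            # Normal-normal Bayesian update of the effectiveness belief.
            precision = 1.0 / var + 1.0 / noise_var
            mean = (mean / var + signal / noise_var) / precision
            var = 1.0 / precision
            # Utility: believed effectiveness net of the privately
            # observed side-effect disutility (linear form is assumed).
            if mean - side_effect_cost < outside_option:
                return t  # period of dropout
        return None  # subject completes the trial

    # Two illustrative subjects: one facing harsh side effects, one
    # facing mild side effects but a weak treatment. Exact dropout
    # timing depends on the draws, but side effects bite from period
    # one, while low effectiveness must first be learned.
    print(simulate_dropout(true_effect=0.4, side_effect_cost=0.8))
    print(simulate_dropout(true_effect=-0.3, side_effect_cost=0.1))

This mirrors the abstract's claim qualitatively: early attrition is driven by side effects, which the subject feels immediately, while later attrition reflects accumulating evidence of low effectiveness.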

Suggested Citation

  • Tat Y. Chan & Barton H. Hamilton, 2006. "Learning, Private Information, and the Economic Evaluation of Randomized Experiments," Journal of Political Economy, University of Chicago Press, vol. 114(6), pages 997-1040, December.
  • Handle: RePEc:ucp:jpolec:v:114:y:2006:i:6:p:997-1040
    DOI: 10.1086/508239

    Download full text from publisher

    File URL: http://dx.doi.org/10.1086/508239
    File Function: main text
    Download Restriction: Access to the online full text or PDF requires a subscription.

    File URL: https://libkey.io/10.1086/508239?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. James J. Heckman & Jeffrey A. Smith, 1995. "Assessing the Case for Social Experiments," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 85-110, Spring.
    2. Gary Burtless, 1995. "The Case for Randomized Field Trials in Economic and Policy Research," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 63-84, Spring.
    3. Gregory S. Crawford & Matthew Shum, 2005. "Uncertainty and Learning in Pharmaceutical Demand," Econometrica, Econometric Society, vol. 73(4), pages 1137-1173, July.
    4. James Heckman & Neil Hohmann & Jeffrey Smith & Michael Khoo, 2000. "Substitution and Dropout Bias in Social Experiments: A Study of an Influential Social Experiment," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 115(2), pages 651-694.
    5. Charles F. Manski, 1996. "Learning about Treatment Effects from Experiments with Random Assignment of Treatments," Journal of Human Resources, University of Wisconsin Press, vol. 31(4), pages 709-733.
    6. Ariel S. Pakes, 1986. "Patents as Options: Some Estimates of the Value of Holding European Patent Stocks," Econometrica, Econometric Society, vol. 54(4), pages 755-784, July.
    7. John Barnard & Constantine E. Frangakis & Jennifer L. Hill & Donald B. Rubin, 2003. "Principal Stratification Approach to Broken Randomized Experiments: A Case Study of School Choice Vouchers in New York City," Journal of the American Statistical Association, American Statistical Association, vol. 98, pages 299-323, January.
    8. Tülin Erdem & Michael P. Keane, 1996. "Decision-Making Under Uncertainty: Capturing Dynamic Brand Choice Processes in Turbulent Consumer Goods Markets," Marketing Science, INFORMS, vol. 15(1), pages 1-20.
    9. Anup Malani, 2006. "Identifying Placebo Effects with Data from Clinical Trials," Journal of Political Economy, University of Chicago Press, vol. 114(2), pages 236-256, April.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one; a simple sketch of this kind of relatedness scoring follows the list.
    1. Jose M. Fernandez, 2013. "An Empirical Model Of Learning Under Ambiguity: The Case Of Clinical Trials," International Economic Review, Department of Economics, University of Pennsylvania and Osaka University Institute of Social and Economic Research Association, vol. 54(2), pages 549-573, May.
    2. Barbara Sianesi, 2014. "Dealing with randomisation bias in a social experiment: the case of ERA," IFS Working Papers W14/10, Institute for Fiscal Studies.
    3. Barbara Sianesi, 2017. "Evidence of randomisation bias in a large-scale social experiment: The case of ERA," Journal of Econometrics, Elsevier, vol. 198(1), pages 41-64.
    4. Barbara Sianesi, 2013. "Dealing with randomisation bias in a social experiment exploiting the randomisation itself: the case of ERA," IFS Working Papers W13/15, Institute for Fiscal Studies.
    5. Jeffrey A. Smith & Petra E. Todd, 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    6. Peter R. Mueser & Kenneth R. Troske & Alexey Gorislavsky, 2007. "Using State Administrative Data to Measure Program Performance," The Review of Economics and Statistics, MIT Press, vol. 89(4), pages 761-783, November.
    7. Emanuele Tarantino & Timothy S. Simcoe & Bernhard Ganglmair, 2018. "Learning When to Quit: An Empirical Model of Experimentation," CEPR Discussion Papers 12733, C.E.P.R. Discussion Papers.
    8. Jeffrey Smith, 2000. "Evaluation aktiver Arbeitsmarktpolitik: Erfahrungen aus Nordamerika (Evaluating Active Labor Market Policies: Lessons from North America)," Mitteilungen aus der Arbeitsmarkt- und Berufsforschung, Institut für Arbeitsmarkt- und Berufsforschung (IAB), Nürnberg [Institute for Employment Research, Nuremberg, Germany], vol. 33(3), pages 345-356.
    9. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    10. James J. Heckman, 2005. "Micro Data, Heterogeneity and the Evaluation of Public Policy Part 2," The American Economist, Sage Publications, vol. 49(1), pages 16-44, March.
    11. Jeffrey A. Smith, 2018. "The usefulness of experiments," IZA World of Labor, Institute of Labor Economics (IZA), pages 436-436, May.
    12. Yingyao Hu, 2017. "The Econometrics of Unobservables -- Latent Variable and Measurement Error Models and Their Applications in Empirical Industrial Organization and Labor Economics," Economics Working Paper Archive 64578, The Johns Hopkins University, Department of Economics, revised 2021.
    13. Gani Aldashev & Georg Kirchsteiger & Alexander Sebald, 2017. "Assignment Procedure Biases in Randomised Policy Experiments," Economic Journal, Royal Economic Society, vol. 127(602), pages 873-895, June.
    14. Robert A. Pollak, 1998. "Notes on How Economists Think . . .," JCPR Working Papers 35, Northwestern University/University of Chicago Joint Center for Poverty Research.
    15. Sylvain Chassang & Erik Snowberg & Ben Seymour & Cayley Bowles, 2015. "Accounting for Behavior in Treatment Effects: New Applications for Blind Trials," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-13, June.
    16. James J. Heckman, 1991. "Randomization and Social Policy Evaluation Revisited," NBER Technical Working Papers 0107, National Bureau of Economic Research, Inc.
    17. Chris Riddell & W. Craig Riddell, 2016. "When Can Experimental Evidence Mislead? A Re-Assessment of Canada's Self Sufficiency Project," IZA Discussion Papers 9939, Institute of Labor Economics (IZA).
    18. Thomas Büttner, 2008. "Ankündigungseffekt oder Maßnahmewirkung? Eine Evaluation von Trainingsmaßnahmen zur Überprüfung der Verfügbarkeit (Notification or participation: which treatment actually activates job-seekers? An ev," Zeitschrift für ArbeitsmarktForschung - Journal for Labour Market Research, Institut für Arbeitsmarkt- und Berufsforschung (IAB), Nürnberg [Institute for Employment Research, Nuremberg, Germany], vol. 41(1), pages 25-40.
    19. Alan B. Krueger, 2002. "Inequality, Too Much of a Good Thing," Working Papers 845, Princeton University, Department of Economics, Industrial Relations Section.
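
    As a rough illustration of the relatedness notion described above the list, the sketch below scores a candidate item by how many references it shares with this one (bibliographic coupling) and how many citing works the two have in common (co-citation). The equal-weight sum and the toy item handles are assumptions made for illustration; this is not RePEc's actual ranking algorithm.

        def relatedness(refs_a, refs_b, citers_a, citers_b):
            # Count shared references plus shared citing works; the
            # equal weighting is a hypothetical choice, not RePEc's.
            return len(set(refs_a) & set(refs_b)) + len(set(citers_a) & set(citers_b))

        # Toy example with made-up item handles:
        refs_this = {"heckman_smith_1995", "manski_1996", "pakes_1986"}
        refs_other = {"heckman_smith_1995", "manski_1996", "lalonde_1986"}
        print(relatedness(refs_this, refs_other, {"a", "b"}, {"b", "c"}))  # -> 3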

    More about this item

    Lists

    This item is featured on the following reading lists, Wikipedia, or ReplicationWiki pages:
    1. Learning, Private Information, and the Economic Evaluation of Randomized Experiments (JPE 2006) in ReplicationWiki

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:ucp:jpolec:v:114:y:2006:i:6:p:997-1040. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Journals Division. General contact details of provider: https://www.journals.uchicago.edu/JPE.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.