Printed from https://ideas.repec.org/a/sae/sagope/v9y2019i2p2158244019851675.html

Bootstrap Thompson Sampling and Sequential Decision Problems in the Behavioral Sciences

Author

Listed:
  • Dean Eckles
  • Maurits Kaptein

Abstract

Behavioral scientists are increasingly able to conduct randomized experiments in settings that allow rapid updating of the probabilities of assignment to treatments (i.e., arms). Many behavioral science experiments can therefore be usefully formulated as sequential decision problems. This article reviews versions of the multiarmed bandit problem with an emphasis on behavioral science applications. One popular method for such problems is Thompson sampling, which is appealing because it randomizes assignment and is asymptotically consistent in selecting the best arm. Here, we show the utility of bootstrap Thompson sampling (BTS), which replaces the posterior distribution with the bootstrap distribution and often has computational and practical advantages. We illustrate its robustness to model misspecification, a common concern in behavioral science applications. We also show how BTS can be readily adapted to be robust to dependent data, such as repeated observations of the same units, which are likewise common in behavioral science. We use simulations to illustrate parametric Thompson sampling and BTS for Bernoulli bandits, factorial Gaussian bandits, and bandits with repeated observations of the same units.
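The two methods the abstract contrasts can be sketched for a Bernoulli bandit. The following is a minimal illustration, not the authors' code: parametric Thompson sampling draws each arm's mean from a Beta posterior and plays the argmax, while the BTS variant shown here maintains J online "double-or-nothing" bootstrap replicates of the empirical arm means and samples one replicate per round. Function names, the horizon, the number of replicates, and the prior pseudo-counts are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def thompson_bernoulli(true_means, horizon=2000):
    """Parametric Thompson sampling with independent Beta(1, 1) priors."""
    k = len(true_means)
    successes = np.ones(k)   # Beta alpha parameters
    failures = np.ones(k)    # Beta beta parameters
    pulls = np.zeros(k, dtype=int)
    for _ in range(horizon):
        # Draw one sample from each arm's posterior; play the argmax.
        theta = rng.beta(successes, failures)
        arm = int(np.argmax(theta))
        reward = rng.random() < true_means[arm]
        successes[arm] += reward
        failures[arm] += 1 - reward
        pulls[arm] += 1
    return pulls

def bts_bernoulli(true_means, horizon=2000, j=100):
    """Bootstrap Thompson sampling: J online 'double-or-nothing'
    bootstrap replicates stand in for the posterior distribution."""
    k = len(true_means)
    # Each replicate tracks reweighted success and trial counts per arm
    # (initialized so every replicate starts at an empirical mean of 0.5).
    s = np.ones((j, k))
    n = np.full((j, k), 2.0)
    pulls = np.zeros(k, dtype=int)
    for _ in range(horizon):
        # Sample one replicate uniformly; play its best empirical arm.
        rep = rng.integers(j)
        arm = int(np.argmax(s[rep] / n[rep]))
        reward = float(rng.random() < true_means[arm])
        # Each replicate includes the new observation with weight 0 or 2
        # ("double-or-nothing"), giving an online bootstrap distribution.
        w = 2.0 * rng.integers(0, 2, size=j)
        s[:, arm] += w * reward
        n[:, arm] += w
        pulls[arm] += 1
    return pulls
```

The replicate updates are a few array additions per round, which is one way to see the computational appeal of BTS when an exact posterior is unavailable or expensive.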

Suggested Citation

  • Dean Eckles & Maurits Kaptein, 2019. "Bootstrap Thompson Sampling and Sequential Decision Problems in the Behavioral Sciences," SAGE Open, vol. 9(2), June.
  • Handle: RePEc:sae:sagope:v:9:y:2019:i:2:p:2158244019851675
    DOI: 10.1177/2158244019851675

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/2158244019851675
    Download Restriction: no

    File URL: https://libkey.io/10.1177/2158244019851675?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Grimmer, Justin & Messing, Solomon & Westwood, Sean J., 2017. "Estimating Heterogeneous Treatment Effects and the Effects of Heterogeneous Treatments with Ensemble Methods," Political Analysis, Cambridge University Press, vol. 25(4), pages 413-434, October.
    2. Steven L. Scott, 2010. "A modern Bayesian look at the multi‐armed bandit," Applied Stochastic Models in Business and Industry, John Wiley & Sons, vol. 26(6), pages 639-658, November.
    3. Stefan Wager & Susan Athey, 2018. "Estimation and Inference of Heterogeneous Treatment Effects using Random Forests," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 113(523), pages 1228-1242, July.
    4. A. Colin Cameron & Douglas L. Miller, 2015. "A Practitioner’s Guide to Cluster-Robust Inference," Journal of Human Resources, University of Wisconsin Press, vol. 50(2), pages 317-372.
    5. Kaptein, Maurits & Eckles, Dean, 2012. "Heterogeneity in the Effects of Online Persuasion," Journal of Interactive Marketing, Elsevier, vol. 26(3), pages 176-188.
    6. Ariel Kleiner & Ameet Talwalkar & Purnamrita Sarkar & Michael I. Jordan, 2014. "A scalable bootstrap for massive data," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 76(4), pages 795-816, September.
    7. Eric B. Laber & Nick J. Meyer & Brian J. Reich & Krishna Pacifici & Jaime A. Collazo & John M. Drake, 2018. "Optimal treatment allocations in space and time for on‐line control of an emerging infectious disease," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 67(4), pages 743-789, August.
    8. Charles F. Manski, 2004. "Statistical Treatment Rules for Heterogeneous Populations," Econometrica, Econometric Society, vol. 72(4), pages 1221-1246, July.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Kyle Colangelo & Ying-Ying Lee, 2019. "Double debiased machine learning nonparametric inference with continuous treatments," CeMMAP working papers CWP72/19, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    2. Piasenti, Stefano & Valente, Marica & Van Veldhuizen, Roel & Pfeifer, Gregor, 2023. "Does Unfairness Hurt Women? The Effects of Losing Unfair Competitions," Working Papers 2023:7, Lund University, Department of Economics.
    3. Michael C Knaus & Michael Lechner & Anthony Strittmatter, 2021. "Machine learning estimation of heterogeneous causal effects: Empirical Monte Carlo evidence," The Econometrics Journal, Royal Economic Society, vol. 24(1), pages 134-161.
    4. Carlos Fernández-Loría & Foster Provost & Jesse Anderton & Benjamin Carterette & Praveen Chandar, 2020. "A Comparison of Methods for Treatment Assignment with an Application to Playlist Generation," Papers 2004.11532, arXiv.org, revised Apr 2022.
    5. Michael Lechner, 2023. "Causal Machine Learning and its use for public policy," Swiss Journal of Economics and Statistics, Springer;Swiss Society of Economics and Statistics, vol. 159(1), pages 1-15, December.
    6. Stefano Caria & Grant Gordon & Maximilian Kasy & Simon Quinn & Soha Shami & Alexander Teytelboym, 2020. "An Adaptive Targeted Field Experiment: Job Search Assistance for Refugees in Jordan," CESifo Working Paper Series 8535, CESifo.
    7. Michael C Knaus, 2022. "Double machine learning-based programme evaluation under unconfoundedness [Econometric methods for program evaluation]," The Econometrics Journal, Royal Economic Society, vol. 25(3), pages 602-627.
    8. Guido W. Imbens, 2020. "Potential Outcome and Directed Acyclic Graph Approaches to Causality: Relevance for Empirical Practice in Economics," Journal of Economic Literature, American Economic Association, vol. 58(4), pages 1129-1179, December.
    9. Jushan Bai & Sung Hoon Choi & Yuan Liao, 2021. "Feasible generalized least squares for panel data with cross-sectional and serial correlations," Empirical Economics, Springer, vol. 60(1), pages 309-326, January.
    10. Jeffrey Smith & Arthur Sweetman, 2016. "Viewpoint: Estimating the causal effects of policies and programs," Canadian Journal of Economics, Canadian Economics Association, vol. 49(3), pages 871-905, August.
    11. Kyle Colangelo & Ying-Ying Lee, 2020. "Double Debiased Machine Learning Nonparametric Inference with Continuous Treatments," Papers 2004.03036, arXiv.org, revised Sep 2023.
    12. Engel, Christoph, 2020. "Estimating heterogeneous reactions to experimental treatments," Journal of Economic Behavior & Organization, Elsevier, vol. 178(C), pages 124-147.
    13. Henrika Langen & Martin Huber, 2022. "How causal machine learning can leverage marketing strategies: Assessing and improving the performance of a coupon campaign," Papers 2204.10820, arXiv.org, revised Jun 2022.
    14. Raymond Duch & Paulina Granados & Denise Laroze & Mauricio Lopez & Marian Ormeño & Ximena Quintanilla, 2021. "La Arquitectura De Elección Mejora La Selección De Pensiones," Working Papers 66, Superintendencia de Pensiones, revised Jan 2021.
    15. Augustine Denteh & Helge Liebert, 2022. "Who Increases Emergency Department Use? New Insights from the Oregon Health Insurance Experiment," Working Papers 2201, Tulane University, Department of Economics.
    16. Johannes Haushofer & Paul Niehaus & Carlos Paramo & Edward Miguel & Michael W. Walker, 2022. "Targeting Impact versus Deprivation," NBER Working Papers 30138, National Bureau of Economic Research, Inc.
    17. Steven F. Lehrer & Tian Xie, 2022. "The Bigger Picture: Combining Econometrics with Analytics Improves Forecasts of Movie Success," Management Science, INFORMS, vol. 68(1), pages 189-210, January.
    18. Patrick Rehill & Nicholas Biddle, 2023. "Transparency challenges in policy evaluation with causal machine learning -- improving usability and accountability," Papers 2310.13240, arXiv.org, revised Mar 2024.
    19. Takanori Ida & Takunori Ishihara & Koichiro Ito & Daido Kido & Toru Kitagawa & Shosei Sakaguchi & Shusaku Sasaki, 2022. "Choosing Who Chooses: Selection-Driven Targeting in Energy Rebate Programs," NBER Working Papers 30469, National Bureau of Economic Research, Inc.
    20. Takanori Ida & Takunori Ishihara & Koichiro Ito & Daido Kido & Toru Kitagawa & Shosei Sakaguchi & Shusaku Sasaki, 2021. "Paternalism, Autonomy, or Both? Experimental Evidence from Energy Saving Programs," Papers 2112.09850, arXiv.org.


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.