
Adapting to Misspecification in Contextual Bandits with Offline Regression Oracles

Author

Listed:
  • Krishnamurthy, Sanath Kumar (Stanford University)
  • Hadad, Vitor (Stanford University)
  • Athey, Susan (Stanford University)

Abstract

Computationally efficient contextual bandits are often based on estimating a predictive model of rewards given contexts and arms using past data. However, when the reward model is not well-specified, the bandit algorithm may incur unexpected regret, so recent work has focused on algorithms that are robust to misspecification. We propose a simple family of contextual bandit algorithms that adapt to misspecification error by reverting to a good safe policy when there is evidence that misspecification is causing a regret increase. Our algorithm requires only an offline regression oracle to ensure regret guarantees that gracefully degrade in terms of a measure of the average misspecification level. Compared to prior work, we attain similar regret guarantees, but we do not rely on a master algorithm, and do not require more robust oracles like online or constrained regression oracles [e.g., Foster et al. (2020a); Krishnamurthy et al. (2020)]. This allows us to design algorithms for more general function approximation classes.
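
To make the abstract's high-level recipe concrete, the following is a minimal, hypothetical Python sketch of the fallback idea: fit a reward model per arm with an offline regression oracle, act greedily on the fitted models, and revert to a known safe policy when realized rewards fall systematically short of the models' predictions. The oracle is stood in for here by scikit-learn's Ridge, and the misspecification check is a naive predicted-versus-realized reward gap with a fixed tolerance; the names FallbackGreedyBandit, safe_policy, and tolerance are invented for this illustration and do not correspond to the paper's actual algorithm or its regret-based test.

```python
# Illustrative sketch only (not the paper's algorithm): a contextual bandit that
# fits per-arm reward models with an offline regression oracle, acts greedily on
# the fitted models, and reverts to a known safe policy once realized rewards
# fall systematically short of the models' predictions.
import numpy as np
from sklearn.linear_model import Ridge  # stand-in for the offline regression oracle


class FallbackGreedyBandit:
    def __init__(self, n_arms, safe_policy, tolerance=0.1, refit_every=50):
        self.n_arms = n_arms
        self.safe_policy = safe_policy      # context -> arm; assumed known and reasonably good
        self.tolerance = tolerance          # allowed average gap between predicted and realized reward
        self.refit_every = refit_every
        self.models = [Ridge(alpha=1.0) for _ in range(n_arms)]
        self.data = [([], []) for _ in range(n_arms)]   # (contexts, rewards) per arm
        self.pred_log, self.reward_log = [], []
        self.use_fallback = False

    def select_arm(self, context):
        if self.use_fallback:
            return self.safe_policy(context)
        predictions = []
        for arm in range(self.n_arms):
            _, rewards = self.data[arm]
            if len(rewards) < 2:            # force a little exploration before trusting the model
                return arm
            predictions.append(self.models[arm].predict([context])[0])
        return int(np.argmax(predictions))

    def update(self, context, arm, reward):
        contexts, rewards = self.data[arm]
        if len(rewards) >= 2:
            # log predicted vs. realized reward (before updating the model);
            # a persistent positive gap is a crude proxy for harmful misspecification
            self.pred_log.append(float(self.models[arm].predict([context])[0]))
            self.reward_log.append(reward)
        contexts.append(context)
        rewards.append(reward)
        if len(rewards) == 2 or len(rewards) % self.refit_every == 0:
            self.models[arm].fit(contexts, rewards)     # offline regression oracle call
        if not self.use_fallback and len(self.reward_log) >= 100:
            gap = float(np.mean(np.array(self.pred_log) - np.array(self.reward_log)))
            if gap > self.tolerance:        # model is systematically over-optimistic
                self.use_fallback = True
```

A call pattern would be arm = bandit.select_arm(context), then bandit.update(context, arm, reward) after observing the reward, with safe_policy being, for example, a policy learned from logged data or simply lambda context: 0 for a trusted default arm.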

Suggested Citation

  • Krishnamurthy, Sanath Kumar & Hadad, Vitor & Athey, Susan, 2021. "Adapting to Misspecification in Contextual Bandits with Offline Regression Oracles," Research Papers 3951, Stanford University, Graduate School of Business.
  • Handle: RePEc:ecl:stabus:3951

    Download full text from publisher

    File URL: https://www.gsb.stanford.edu/faculty-research/working-papers/adapting-misspecification-contextual-bandits-offline-regression
    Download Restriction: no

    Citations

    Citations are extracted by the CitEc Project.

    Cited by:

    1. Aldo Gael Carranza & Susan Athey, 2023. "Federated Offline Policy Learning with Heterogeneous Observational Data," Papers 2305.12407, arXiv.org.
    2. Susan Athey & Undral Byambadalai & Vitor Hadad & Sanath Kumar Krishnamurthy & Weiwen Leung & Joseph Jay Williams, 2022. "Contextual Bandits in a Survey Experiment on Charitable Giving: Within-Experiment Outcomes versus Policy Learning," Papers 2211.12004, arXiv.org.



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.