IDEAS home Printed from https://ideas.repec.org/a/gam/jmathe/v10y2022i18p3352-d915793.html

Elastic Information Bottleneck

Authors

Listed:
  • Yuyan Ni

    (Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190, China)

  • Yanyan Lan

    (Institute for AI Industry Research, Tsinghua University, Beijing 100084, China)

  • Ao Liu

    (School of Computer Science and Technology, University of Chinese Academy of Sciences, Beijing 100049, China)

  • Zhiming Ma

    (Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190, China)

Abstract

The information bottleneck is an information-theoretic principle of representation learning that aims to learn a maximally compressed representation preserving as much information about the labels as possible. Under this principle, two different methods have been proposed, the information bottleneck (IB) and the deterministic information bottleneck (DIB), and both have made significant progress in explaining the representation mechanisms of deep learning algorithms. However, these theoretical and empirical successes hold only under the assumption that training and test data are drawn from the same distribution, which is clearly violated in many real-world applications. In this paper, we study their generalization abilities in a transfer learning scenario, where the target error can be decomposed into three components: the source empirical error, the source generalization gap (SG), and the representation discrepancy (RD). Comparing IB and DIB on these terms, we prove that DIB's SG bound is tighter than IB's, while DIB's RD is larger than IB's, so neither method dominates the other. To balance the trade-off between SG and RD, we propose an elastic information bottleneck (EIB) that interpolates between the IB and DIB regularizers, which guarantees a Pareto frontier within the IB framework. Additionally, simulations and real-data experiments show that EIB achieves better domain adaptation results than IB and DIB, validating our theory.
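For context, the two objectives being compared are standard in the literature: IB penalizes the mutual information I(X;T) between input and representation, while DIB (Strouse and Schwab) penalizes the representation entropy H(T). The elastic interpolation written below is a plausible sketch inferred from the abstract; the exact EIB regularizer is given in the paper itself.

```latex
% Classical IB objective (Tishby et al.): compress X into T
% while preserving information about the label Y
\min_{p(t \mid x)} \; I(X;T) \;-\; \beta\, I(T;Y)

% Deterministic IB (Strouse & Schwab): the compression term
% I(X;T) is replaced by the representation entropy H(T)
\min_{p(t \mid x)} \; H(T) \;-\; \beta\, I(T;Y)

% A plausible elastic interpolation with mixing weight
% \alpha \in [0,1]: \alpha = 0 recovers IB, \alpha = 1 recovers DIB
\min_{p(t \mid x)} \; (1-\alpha)\, I(X;T) \;+\; \alpha\, H(T) \;-\; \beta\, I(T;Y)
```

Varying the assumed weight \alpha would trace out the SG/RD trade-off described in the abstract, since the two regularizers are the endpoints of the interpolation.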

Suggested Citation

  • Yuyan Ni & Yanyan Lan & Ao Liu & Zhiming Ma, 2022. "Elastic Information Bottleneck," Mathematics, MDPI, vol. 10(18), pages 1-26, September.
  • Handle: RePEc:gam:jmathe:v:10:y:2022:i:18:p:3352-:d:915793

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/10/18/3352/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/10/18/3352/
    Download Restriction: no

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jmathe:v:10:y:2022:i:18:p:3352-:d:915793. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.