Printed from https://ideas.repec.org/a/gam/jmathe/v10y2022i22p4285-d974201.html

Effective Online Knowledge Distillation via Attention-Based Model Ensembling

Author

Listed:
  • Diana-Laura Borza

    (Computer Science Department, Babes Bolyai University, 400084 Cluj-Napoca, Romania
    These authors contributed equally to this work.)

  • Adrian Sergiu Darabant

    (Computer Science Department, Babes Bolyai University, 400084 Cluj-Napoca, Romania
    These authors contributed equally to this work.)

  • Tudor Alexandru Ileni

    (Computer Science Department, Babes Bolyai University, 400084 Cluj-Napoca, Romania
    These authors contributed equally to this work.)

  • Alexandru-Ion Marinescu

    (Computer Science Department, Babes Bolyai University, 400084 Cluj-Napoca, Romania
    These authors contributed equally to this work.)

Abstract

Large-scale deep learning models have achieved impressive results on a variety of tasks; however, their deployment on edge or mobile devices remains a challenge due to limited memory and computational capability. Knowledge distillation is an effective model compression technique that can boost the performance of a lightweight student network by transferring knowledge from a more complex model or an ensemble of models. Due to its reduced size, this lightweight model is better suited for deployment on edge devices. In this paper, we introduce an online knowledge distillation framework that relies on an original attention mechanism to effectively combine the predictions of a cohort of lightweight (student) networks into a powerful ensemble, and uses this ensemble as a distillation signal. The proposed aggregation strategy uses the predictions of the individual students as well as ground-truth data to determine a set of weights for ensembling these predictions. This mechanism is used only during training; at test or inference time, a single, lightweight student is extracted and used. The extensive experiments we performed on several image classification benchmarks, both by training models from scratch (on the CIFAR-10, CIFAR-100, and Tiny ImageNet datasets) and by using transfer learning (on the Oxford Pets and Oxford Flowers datasets), showed that the proposed framework consistently improves the accuracy of the knowledge-distilled students and demonstrates the effectiveness of the proposed solution. Moreover, in the case of the ResNet architecture, we observed that the knowledge-distilled model achieves higher accuracy than a deeper, individually trained ResNet model.
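For readers interested in how the attention-based aggregation described above could be realized, the following is a minimal PyTorch-style sketch. The module structure, the concatenation of each student's softmax output with a one-hot encoding of the ground truth, and the KL-divergence distillation objective are illustrative assumptions rather than the authors' exact formulation.

```python
# Minimal sketch (PyTorch) of attention-weighted ensembling of student logits
# for online knowledge distillation. Module names, the use of a one-hot
# ground-truth encoding in the attention scores, and the KL-based loss are
# illustrative assumptions, not the paper's exact method.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionEnsemble(nn.Module):
    """Scores each student from its predictions and the ground truth, then
    aggregates the student logits into a single ensemble prediction."""

    def __init__(self, num_classes: int, hidden: int = 64):
        super().__init__()
        # Scores each student from its softmax output concatenated with the
        # one-hot ground truth (an assumed input to the attention mechanism).
        self.scorer = nn.Sequential(
            nn.Linear(2 * num_classes, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, student_logits: list[torch.Tensor], labels: torch.Tensor):
        # student_logits: list of (batch, num_classes) tensors, one per student.
        probs = torch.stack([F.softmax(l, dim=1) for l in student_logits], dim=1)
        logits = torch.stack(student_logits, dim=1)            # (batch, S, C)
        one_hot = F.one_hot(labels, probs.size(-1)).float()    # (batch, C)
        one_hot = one_hot.unsqueeze(1).expand_as(probs)        # (batch, S, C)
        scores = self.scorer(torch.cat([probs, one_hot], dim=-1))  # (batch, S, 1)
        weights = F.softmax(scores, dim=1)                     # attention over students
        return (weights * logits).sum(dim=1)                   # ensemble logits


def distillation_loss(student_logits, ensemble_logits, labels, T=4.0, alpha=0.5):
    """Cross-entropy on the ground truth plus KL divergence towards the
    ensemble's softened predictions (a common online-distillation objective,
    assumed here for illustration)."""
    ce = sum(F.cross_entropy(l, labels) for l in student_logits)
    soft_targets = F.softmax(ensemble_logits.detach() / T, dim=1)
    kd = sum(
        F.kl_div(F.log_softmax(l / T, dim=1), soft_targets, reduction="batchmean")
        * (T * T)
        for l in student_logits
    )
    return alpha * ce + (1.0 - alpha) * kd
```

In such a setup, the attention module would be trained jointly with the student cohort and discarded after training, so that only a single lightweight student is deployed, matching the inference-time behaviour described in the abstract.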

Suggested Citation

  • Diana-Laura Borza & Adrian Sergiu Darabant & Tudor Alexandru Ileni & Alexandru-Ion Marinescu, 2022. "Effective Online Knowledge Distillation via Attention-Based Model Ensembling," Mathematics, MDPI, vol. 10(22), pages 1-15, November.
  • Handle: RePEc:gam:jmathe:v:10:y:2022:i:22:p:4285-:d:974201

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/10/22/4285/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/10/22/4285/
    Download Restriction: no
