  • Online Subclass Knowledge Distillation
    Tzelepi, Maria; Passalis, Nikolaos; Tefas, Anastasios

    Expert Systems with Applications, 11/2021, Volume 181
    Journal Article

    • A novel distillation method aiming to reveal subclass similarities is proposed.
    • The OSKD method derives the soft labels from the model itself, in an online manner.
    • The OSKD method is model-agnostic.
    • The experiments validate the effectiveness of the OSKD method.

    Knowledge distillation has been established as a highly promising approach for training compact and faster models by transferring knowledge from heavier, more powerful models, so as to satisfy the computation and storage requirements of deploying state-of-the-art deep neural models on embedded systems. However, conventional knowledge distillation requires multiple stages of training, rendering it a computationally and memory-demanding procedure. In this paper, a novel single-stage self-knowledge distillation method, namely Online Subclass Knowledge Distillation (OSKD), is proposed that aims at revealing the similarities inside classes, improving the performance of any deep neural model in an online manner. Hence, as opposed to existing online distillation methods, we are able to acquire further knowledge from the model itself, without building multiple identical models or using multiple models to teach each other, rendering the OSKD approach more effective. The experimental evaluation on five datasets indicates that the proposed method enhances classification performance, while comparison against existing online distillation methods validates the superiority of the proposed method.
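
    To make the distinction concrete, the sketch below illustrates the general idea of online self-distillation in PyTorch: soft labels are derived from the model's own in-batch predictions, weighted by feature similarity among same-class samples, and distilled back into the logits alongside the usual cross-entropy loss in a single training stage. This is a minimal illustration of the concept described in the abstract, not the authors' OSKD implementation; the function name, the cosine-similarity weighting, and the temperature/alpha parameters are assumptions made for the example.

    import torch
    import torch.nn.functional as F

    def online_self_distillation_loss(features, logits, labels,
                                      temperature=4.0, alpha=0.1):
        """Cross-entropy plus a distillation term whose soft targets come
        from the model itself, with no separate teacher network."""
        # Standard supervised objective.
        ce = F.cross_entropy(logits, labels)

        # Derive soft targets from the model's own predictions: for each sample,
        # average the softened predictions of other same-class samples in the
        # batch, weighted by cosine feature similarity (a stand-in for the
        # intra-class "subclass" structure the abstract refers to).
        with torch.no_grad():
            feats = F.normalize(features, dim=1)
            sim = feats @ feats.t()                          # (B, B) cosine similarities
            same_class = labels.unsqueeze(0) == labels.unsqueeze(1)
            sim = sim.masked_fill(~same_class, float('-inf'))
            sim.fill_diagonal_(float('-inf'))                # exclude the sample itself
            weights = torch.nan_to_num(torch.softmax(sim, dim=1))
            probs = torch.softmax(logits / temperature, dim=1)
            soft_targets = weights @ probs                   # (B, C) self-derived soft labels
            # Fall back to the sample's own prediction if it has no same-class
            # neighbour in the batch (its weight row is all zeros).
            empty = weights.sum(dim=1, keepdim=True) == 0
            soft_targets = torch.where(empty, probs, soft_targets)

        # Distill the self-derived soft labels back into the current predictions.
        log_probs = F.log_softmax(logits / temperature, dim=1)
        kd = F.kl_div(log_probs, soft_targets, reduction='batchmean') * temperature ** 2
        return ce + alpha * kd

    Because the soft labels are produced on the fly from the same forward pass, a single model and a single training stage suffice, in contrast to conventional distillation, which first trains a heavyweight teacher and then transfers its knowledge to a compact student.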