Browsing by Author Baosheng, Yu

Showing results 1 - 1 of 1
  • Authors: Jianping, Gou; Xiangshuo, Xiong; Baosheng, Yu (2023)

    Knowledge distillation is a simple yet effective technique for deep model compression, which aims to transfer the knowledge learned by a large teacher model to a small student model. To mimic how the teacher teaches the student, existing knowledge distillation methods mainly adopt a unidirectional knowledge transfer, where the knowledge extracted from different intermediate layers of the teacher model is used to guide the student model. However, in real-world education scenarios, students learn more effectively through multi-stage learning with self-reflection, which is nevertheless ignored by current knowledge distillation methods. Inspired by this, we...
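
    For readers unfamiliar with the conventional unidirectional transfer the abstract refers to, the following is a minimal sketch of a standard distillation loss in the Hinton et al. style, assuming PyTorch; the function name, temperature, and weighting are illustrative and are not taken from this paper.

    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
        """Conventional unidirectional KD loss: a weighted sum of the
        hard-label cross-entropy and the KL divergence between
        temperature-softened teacher and student output distributions."""
        # Hard-label supervision from the ground-truth labels
        ce = F.cross_entropy(student_logits, labels)
        # Soft-label supervision: the student mimics the teacher's softened outputs
        kd = F.kl_div(
            F.log_softmax(student_logits / T, dim=1),
            F.softmax(teacher_logits / T, dim=1),
            reduction="batchmean",
        ) * (T * T)  # rescale so gradient magnitudes match the unsoftened case
        return alpha * ce + (1.0 - alpha) * kd

    In this one-way scheme the teacher's outputs (or intermediate features) only flow to the student; the multi-stage, self-reflective learning the abstract motivates would go beyond this single pass.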