NORM: Knowledge Distillation via N-to-One Representation Matching | Knowledge Distillation | 2024.02.03
Gradient-Guided Knowledge Distillation for Object Detectors | Knowledge Distillation | 2024.02.01
Revisiting Knowledge Distillation via Label Smoothing Regularization | Knowledge Distillation | 2023.12.10
Knowledge Distillation via Softmax Regression Representation Learning | Knowledge Distillation | 2023.09.21
Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation | Knowledge Distillation | 2023.07.28