rictr is a knowledge distillation library for PyTorch that makes it easy to compress models and transfer knowledge between them.
Refer to rictr.in for more details.
- Logit Distillation (Soft Targets): based on Hinton et al. (2015), "Distilling the Knowledge in a Neural Network".
- Attention Transfer: based on Zagoruyko & Komodakis (2016), "Paying More Attention to Attention: Improving the Performance of Convolutional Neural Networks via Attention Transfer".
- Hidden State Distillation: feature-based distillation that matches intermediate layer activations between teacher and student.
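For orientation, the logit distillation (soft targets) objective above can be sketched in plain PyTorch. This is an illustrative implementation of the standard Hinton et al. (2015) loss, not rictr's actual API; the function name `soft_target_loss` is ours:

```python
import torch
import torch.nn.functional as F

def soft_target_loss(student_logits, teacher_logits, T=2.0):
    # Soften both distributions with temperature T, then measure KL divergence.
    # The T**2 factor keeps gradient magnitudes comparable across temperatures,
    # as recommended by Hinton et al. (2015).
    log_p_student = F.log_softmax(student_logits / T, dim=-1)
    p_teacher = F.softmax(teacher_logits / T, dim=-1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)

# Toy usage: identical logits give a (numerically) zero loss.
logits = torch.randn(4, 10)
loss = soft_target_loss(logits, logits.clone())
```

In a real training loop this term is usually mixed with the ordinary cross-entropy loss on the hard labels.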
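Attention transfer, likewise, reduces to matching normalised spatial attention maps derived from conv feature maps. A minimal sketch under the Zagoruyko & Komodakis (2016) formulation (again, illustrative names rather than rictr's API):

```python
import torch
import torch.nn.functional as F

def attention_map(feat):
    # Collapse the channel axis of a (B, C, H, W) feature map into a spatial
    # attention map by summing squared activations, then L2-normalise per sample.
    amap = feat.pow(2).sum(dim=1).flatten(1)  # (B, H*W)
    return F.normalize(amap, p=2, dim=1)

def attention_transfer_loss(student_feat, teacher_feat):
    # Mean squared distance between the two normalised attention maps.
    return (attention_map(student_feat) - attention_map(teacher_feat)).pow(2).mean()
```

Because only the normalised map is compared, student and teacher may differ in channel count as long as their spatial resolutions match.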
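Hidden state distillation is typically an MSE match on intermediate activations, with a learned projection when student and teacher widths differ. A hypothetical helper sketching the idea (the class `HiddenStateDistiller` is our own, not part of rictr):

```python
import torch
import torch.nn as nn

class HiddenStateDistiller(nn.Module):
    # Projects student hidden states up to the teacher's width before an MSE
    # match, since a compressed student usually has a smaller hidden size.
    def __init__(self, student_dim, teacher_dim):
        super().__init__()
        self.proj = nn.Linear(student_dim, teacher_dim)

    def forward(self, student_hidden, teacher_hidden):
        return nn.functional.mse_loss(self.proj(student_hidden), teacher_hidden)
```

The projection is trained jointly with the student and discarded after distillation.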
Install with `pip install rictr`.