knowledge-distillation

pascal-voc
semantic-segmentation
cifar10
cifar100
colab-notebook
google-colab
glue
nlp
coco
transformer
image-classification
object-detection
pytorch
natural-language-processing
imagenet

yoshitomo-matsubara/torchdistill
355ę—„å‰1.2k

A coding-free framework built on PyTorch for reproducible deep learning studies. šŸ† 22 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. šŸŽ Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
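To illustrate the kind of method the framework covers, here is a minimal, framework-agnostic sketch of classic knowledge distillation (soft-target KL loss plus hard-label cross-entropy, after Hinton et al., 2015). This is not torchdistill's API — the function name, the temperature `T`, and the weighting `alpha` are illustrative choices; torchdistill configures such losses declaratively via YAML instead of hand-written code.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft-target term: KL divergence between temperature-softened
    # teacher and student distributions, scaled by T^2 so gradient
    # magnitudes stay comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: ordinary cross-entropy on ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Toy usage: a batch of 8 samples with 10 classes (e.g. CIFAR-10 logits).
student = torch.randn(8, 10)
teacher = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student, teacher, labels)
```

In a real training loop the teacher's logits would come from a frozen pretrained model run under `torch.no_grad()`, and `loss.backward()` would update only the student.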