torchdistill

by yoshitomo-matsubara

A coding-free framework built on PyTorch for reproducible deep learning studies. Part of the PyTorch Ecosystem. 🏆 26 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
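The knowledge distillation at the core of this project can be illustrated with a minimal, generic sketch: a student network is trained against both the ground-truth labels and the temperature-softened outputs of a teacher network (the classic soft-target formulation). This is not torchdistill's own API, just a plain PyTorch illustration of the technique; the function name and the `T`/`alpha` hyperparameter defaults are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Generic soft-target distillation loss (illustrative, not torchdistill's API)."""
    # Soft-target term: KL divergence between temperature-scaled
    # student and teacher distributions, scaled by T^2 so gradient
    # magnitudes stay comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: ordinary cross-entropy against ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    # Weighted combination of the two objectives.
    return alpha * soft + (1.0 - alpha) * hard
```

In torchdistill itself, losses and training pipelines like this are declared in YAML configuration files rather than written by hand, which is what "coding-free" refers to.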

1.5k Stars
135 Forks
1.5k Watchers
Language: Python
License: MIT
Cost to Build
$670.4K
Market Value
$2.66M

Growth over time: 11 data points from 2021-08-01 to 2025-08-01, tracking stars, forks, and watchers.

How to clone torchdistill

Clone via HTTPS

git clone https://github.com/yoshitomo-matsubara/torchdistill.git

Clone via SSH

git clone git@github.com:yoshitomo-matsubara/torchdistill.git

Download ZIP

Download master.zip

Found an issue?

Report bugs or request features on the torchdistill issue tracker:

Open GitHub Issues