torchdistill

A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 20 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.

How to download and setup torchdistill

Open a terminal and run:
git clone https://github.com/yoshitomo-matsubara/torchdistill.git
git clone creates a local copy (clone) of the torchdistill repository. You pass git clone a repository URL; it supports several network protocols and corresponding URL formats.

Alternatively, you can download torchdistill as a zip file: https://github.com/yoshitomo-matsubara/torchdistill/archive/master.zip

Or simply clone torchdistill with SSH:
git clone [email protected]:yoshitomo-matsubara/torchdistill.git

If you have problems with torchdistill

You can open an issue on the torchdistill issue tracker: https://github.com/yoshitomo-matsubara/torchdistill/issues