A coding-free framework built on PyTorch for reproducible deep learning studies, and part of the PyTorch Ecosystem. 26 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
What is the yoshitomo-matsubara/torchdistill GitHub project? Description: "A coding-free framework built on PyTorch for reproducible deep learning studies. Part of the PyTorch Ecosystem. 26 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking." Written in Python. Explain what it does, its main use cases, key features, and who would benefit from using it.
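To make the topic concrete, here is a minimal sketch of the core idea behind knowledge distillation, the technique the repository implements. This is not torchdistill's API; it is a dependency-free illustration of Hinton-style distillation loss (temperature-softened softmax plus KL divergence), with all function names chosen here for the example.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; a higher temperature softens the distribution.
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, temperature=4.0):
    """KL divergence between the softened teacher and student distributions,
    scaled by T^2 as in Hinton et al.'s knowledge distillation."""
    p = softmax(teacher_logits, temperature)  # teacher acts as the soft target
    q = softmax(student_logits, temperature)  # student tries to match it
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return (temperature ** 2) * kl

# Identical logits give zero loss; diverging logits give a positive loss.
print(kd_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # → 0.0
print(kd_loss([3.0, 2.0, 1.0], [1.0, 2.0, 3.0]) > 0)  # → True
```

In frameworks like torchdistill, this kind of loss is combined with the ordinary cross-entropy on hard labels and wired together through configuration files rather than hand-written training loops, which is what "coding-free" refers to.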