torchdistill
A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 20 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. have been implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and to serve as benchmarks.
How to download and setup torchdistill
Open a terminal and run the command:
git clone https://github.com/yoshitomo-matsubara/torchdistill.git
git clone creates a local copy (clone) of the torchdistill repository.
You pass git clone a repository URL; it supports a few different network protocols and corresponding URL formats.
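After cloning, install torchdistill so its examples can import it. A minimal sketch, assuming the repository root contains a standard Python package setup (torchdistill is also published on PyPI):
cd torchdistill
pip install -e .
# or install the released package directly from PyPI:
pip install torchdistill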
Alternatively, you can download torchdistill as a zip file: https://github.com/yoshitomo-matsubara/torchdistill/archive/master.zip
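If you prefer the archive, here is a sketch of fetching and unpacking it from the command line (assuming curl and unzip are available; GitHub archives extract to a repo-branch directory):
curl -L -o torchdistill-master.zip https://github.com/yoshitomo-matsubara/torchdistill/archive/master.zip
unzip torchdistill-master.zip
cd torchdistill-master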
Or simply clone torchdistill with SSH:
git clone git@github.com:yoshitomo-matsubara/torchdistill.git
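Once set up, training runs are driven by YAML configuration files rather than custom training code, which is what makes the framework coding-free. Below is a minimal sketch of launching a knowledge distillation experiment; the script name, config path, and log path are assumptions for illustration, so check the examples/ and configs/ directories in the repository for the actual files:
# hypothetical paths -- see examples/ and configs/ in the cloned repository
python examples/image_classification.py \
    --config configs/sample/kd/resnet18_from_resnet34.yaml \
    --log log/kd/resnet18_from_resnet34.log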
If you have problems with torchdistill
You may open an issue on the torchdistill support forum (issue tracker) here: https://github.com/yoshitomo-matsubara/torchdistill/issues
Repositories similar to torchdistill
Here you can see torchdistill alternatives and analogs
natural-language-processing lectures spaCy HanLP gensim tensorflow_cookbook MatchZoo tensorflow-nlp Awesome-pytorch-list spacy-models TagUI Repo-2017 stanford-tensorflow-tutorials awesome-nlp franc nlp_tasks nltk pattern TextBlob CoreNLP allennlp mycroft-core practical-pytorch textract languagetool MITIE machine_learning_examples prose arXivTimes ltp