hardware-aware-transformers
[ACL'20] HAT: Hardware-Aware Transformers for Efficient Natural Language Processing
How to download and set up hardware-aware-transformers
Open a terminal and run:
git clone https://github.com/mit-han-lab/hardware-aware-transformers.git
git clone creates a local copy of the hardware-aware-transformers repository.
You pass git clone a repository URL; it supports several network protocols and corresponding URL formats.
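What git clone does can be demonstrated with a local repository, so no network is needed; the directory names below are arbitrary examples, not part of the project:

```shell
set -e
# Create a small local repository to act as the "remote"
git init --quiet source-repo
git -C source-repo -c user.email=you@example.com -c user.name=You \
    commit --allow-empty -m "initial commit" --quiet
# Clone it: git clone copies the repository, including its full history
git clone --quiet source-repo cloned-repo
git -C cloned-repo log --oneline
```

The same command works identically with a GitHub HTTPS or SSH URL in place of the local path.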
Alternatively, you may download hardware-aware-transformers as a zip file: https://github.com/mit-han-lab/hardware-aware-transformers/archive/master.zip
Or simply clone hardware-aware-transformers over SSH:
git clone [email protected]:mit-han-lab/hardware-aware-transformers.git
If you have problems with hardware-aware-transformers, you may open an issue on the project's GitHub issue tracker: https://github.com/mit-han-lab/hardware-aware-transformers/issues
Similar to hardware-aware-transformers
Here you may see hardware-aware-transformers alternatives and analogs:
natural-language-processing, lectures, spaCy, HanLP, gensim, MatchZoo, tensorflow-nlp, Awesome-pytorch-list, spacy-models, Repo-2017, stanford-tensorflow-tutorials, awesome-nlp, nlp_tasks, nltk, pattern, TextBlob, CoreNLP, allennlp, mycroft-core, practical-pytorch, textract, languagetool, MITIE, machine_learning_examples, prose, arXivTimes, ltp, libpostal, sling, DeepNLP-models-Pytorch