roberta-wwm-base-distill
This is a distilled RoBERTa-wwm base model, obtained by distilling knowledge from the larger RoBERTa-wwm-large teacher model into a base-size student.
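For background, knowledge distillation trains a smaller student model to imitate the soft output distribution of a larger teacher model. The sketch below shows only the general objective; the PyTorch framing, the temperature of 4.0, and the 0.7 weighting are illustrative assumptions and not this repository's actual training code.

    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels,
                          temperature=4.0, alpha=0.7):
        # Soften both distributions with the temperature and match them via KL divergence.
        soft_student = F.log_softmax(student_logits / temperature, dim=-1)
        soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
        kd = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * temperature ** 2
        # Keep a standard cross-entropy term against the hard labels.
        ce = F.cross_entropy(student_logits, labels)
        return alpha * kd + (1.0 - alpha) * ce

Here alpha balances how much the student follows the teacher's soft targets versus the ground-truth labels.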
How to download and set up roberta-wwm-base-distill
Open a terminal and run the following command:
git clone https://github.com/xiongma/roberta-wwm-base-distill.git
git clone creates a local copy of the roberta-wwm-base-distill repository.
You pass git clone a repository URL. It supports a few different network protocols and corresponding URL formats.
Alternatively, you can download roberta-wwm-base-distill as a ZIP file: https://github.com/xiongma/roberta-wwm-base-distill/archive/master.zip
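If you prefer a programmatic download instead of git, the ZIP archive can also be fetched and unpacked with the Python standard library; the extraction target below is an arbitrary choice.

    import io
    import urllib.request
    import zipfile

    ZIP_URL = "https://github.com/xiongma/roberta-wwm-base-distill/archive/master.zip"

    # Download the master branch archive and extract it into the current directory;
    # this creates a roberta-wwm-base-distill-master/ folder.
    with urllib.request.urlopen(ZIP_URL) as response:
        zipfile.ZipFile(io.BytesIO(response.read())).extractall(".")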
Or simply clone roberta-wwm-base-distill with SSH
git@github.com:xiongma/roberta-wwm-base-distill.git
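After cloning or downloading, you will typically want to load the distilled checkpoint. Assuming the released weights are in a Hugging Face Transformers-compatible directory (an assumption, not confirmed by this repository; the actual release may be a TensorFlow checkpoint with its own loading scripts), a minimal usage sketch could look like this:

    from transformers import BertTokenizer, BertModel

    # Hypothetical local path to the distilled checkpoint; the real layout may differ.
    MODEL_DIR = "./roberta-wwm-base-distill"

    tokenizer = BertTokenizer.from_pretrained(MODEL_DIR)
    model = BertModel.from_pretrained(MODEL_DIR)

    inputs = tokenizer("这是一个测试句子。", return_tensors="pt")
    outputs = model(**inputs)
    print(outputs.last_hidden_state.shape)  # [1, sequence_length, hidden_size]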
If you have problems with roberta-wwm-base-distill,
you may open an issue on the roberta-wwm-base-distill issue tracker here: https://github.com/xiongma/roberta-wwm-base-distill/issues
Similar to roberta-wwm-base-distill repositories
Here you can see roberta-wwm-base-distill alternatives and analogs:
natural-language-processing, lectures, spaCy, HanLP, gensim, MatchZoo, tensorflow-nlp, Awesome-pytorch-list, spacy-models, Repo-2017, stanford-tensorflow-tutorials, awesome-nlp, nlp_tasks, nltk, pattern, TextBlob, CoreNLP, allennlp, mycroft-core, practical-pytorch, textract, languagetool, MITIE, machine_learning_examples, prose, arXivTimes, ltp, libpostal, sling, DeepNLP-models-Pytorch