Distill-BERT-Textgen
Research code for ACL 2020 paper: "Distilling Knowledge Learned in BERT for Text Generation".
How to download and set up Distill-BERT-Textgen
Open a terminal and run the following command:
git clone https://github.com/ChenRocks/Distill-BERT-Textgen.git
git clone creates a local copy (a clone) of the Distill-BERT-Textgen repository.
You pass git clone a repository URL; it supports several network protocols and corresponding URL formats, such as HTTPS and SSH.
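Once the clone finishes, a minimal sketch of verifying it from the shell (git clone names the directory after the repository by default):

cd Distill-BERT-Textgen    # enter the cloned working copy
git log -1                 # show the latest commit to confirm the clone succeeded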
Alternatively, you can download a ZIP archive of Distill-BERT-Textgen: https://github.com/ChenRocks/Distill-BERT-Textgen/archive/master.zip
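As a sketch, the archive can also be fetched and unpacked from the command line, assuming curl and unzip are available (GitHub names the extracted directory after the repository and branch):

curl -L -o master.zip https://github.com/ChenRocks/Distill-BERT-Textgen/archive/master.zip    # -L follows GitHub's redirect
unzip master.zip           # extracts into Distill-BERT-Textgen-master/
cd Distill-BERT-Textgen-master

Note that the ZIP archive is a plain snapshot without git history, so prefer git clone if you plan to pull updates later.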
Or clone Distill-BERT-Textgen over SSH:
git clone [email protected]:ChenRocks/Distill-BERT-Textgen.git
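If the SSH clone is rejected, you can first check that GitHub accepts your key (this assumes you have already added an SSH key to your GitHub account):

ssh -T [email protected]       # GitHub replies with a short greeting if authentication succeeds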
If you run into problems with Distill-BERT-Textgen, you can open an issue on the repository's issue tracker: https://github.com/ChenRocks/Distill-BERT-Textgen/issues
Repositories similar to Distill-BERT-Textgen
Alternatives and analogs to Distill-BERT-Textgen include:
natural-language-processing, lectures, spaCy, HanLP, gensim, MatchZoo, tensorflow-nlp, Awesome-pytorch-list, spacy-models, Repo-2017, stanford-tensorflow-tutorials, awesome-nlp, nlp_tasks, nltk, pattern, TextBlob, CoreNLP, allennlp, mycroft-core, practical-pytorch, textract, languagetool, MITIE, machine_learning_examples, prose, arXivTimes, ltp, libpostal, sling, DeepNLP-models-Pytorch