Research code for the ACL 2020 paper "Distilling Knowledge Learned in BERT for Text Generation".
The ChenRocks/Distill-BERT-Textgen project is written in Python.
The repository can be cloned via HTTPS or SSH, or downloaded as a ZIP archive.
Report bugs or request features on the Distill-BERT-Textgen GitHub issue tracker.