An implementation of Fastformer: Additive Attention Can Be All You Need, a Transformer Variant in TensorFlow
What is the Rishit-dagli/Fast-Transformer GitHub project? Its description reads: "An implementation of Fastformer: Additive Attention Can Be All You Need, a Transformer Variant in TensorFlow", and the project is written primarily in Jupyter Notebook. Explain what it does, its main use cases, key features, and who would benefit from using it.
Report bugs or request features on the Fast-Transformer issue tracker:
Open GitHub Issues