Transformer based on a variant of attention with linear complexity with respect to sequence length
lucidrains/linear-attention-transformer is a GitHub project, written in Python, described as a "transformer based on a variant of attention that is linear complexity with respect to sequence length" — that is, self-attention whose cost grows linearly with the number of tokens rather than quadratically.
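The repository's own API is not reproduced here. As a minimal, self-contained sketch of the underlying idea only — using one common linear-attention formulation (a kernel feature map phi(x) = elu(x) + 1, in the style of Katharopoulos et al.), not necessarily the exact variant this project implements — the quadratic softmax product (QKᵀ)V can be reordered into Q(KᵀV), which is O(n) in sequence length:

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def quadratic_attention(q, k, v):
    # Standard softmax attention: materializes an (n, n) score
    # matrix, so cost is O(n^2) in sequence length n.
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores) @ v

def linear_attention(q, k, v, eps=1e-6):
    # Kernelized attention with phi(x) = elu(x) + 1 (an assumption for
    # this sketch; the repo may use a different feature map).
    # Reordering (phi(Q) phi(K)^T) V as phi(Q) (phi(K)^T V) avoids the
    # (n, n) matrix entirely: cost is O(n * d^2), linear in n.
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))  # elu(x) + 1, always > 0
    q, k = phi(q), phi(k)
    kv = k.T @ v                  # (d, d_v) summary, size independent of n
    z = q @ k.sum(axis=0)         # per-query normalizer, shape (n,)
    return (q @ kv) / (z[:, None] + eps)

n, d = 16, 8
rng = np.random.default_rng(0)
q, k, v = rng.normal(size=(3, n, d))
out = linear_attention(q, k, v)
print(out.shape)  # → (16, 8)
```

Note that this approximation is not numerically identical to softmax attention; the trade-off is the linear scaling that makes long sequences tractable.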
Clone via HTTPS
Clone via SSH
Download ZIP
Download master.zip
Report bugs or request features on the linear-attention-transformer issue tracker:
Open GitHub Issues