linear-attention-transformer

lucidrains

Transformer based on a variant of attention that is linear in complexity with respect to sequence length
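The description refers to linear attention: replacing the softmax in standard attention with a kernel feature map so the product can be re-associated, dropping the cost from O(n²·d) to O(n·d²). A minimal NumPy sketch of the idea, assuming the φ(x) = elu(x) + 1 feature map from Katharopoulos et al. (the repo's actual variant may differ):

```python
import numpy as np

def linear_attention(q, k, v):
    """Kernelized attention: q, k are (n, d); v is (n, d_v)."""
    # Feature map phi(x) = elu(x) + 1 keeps all entries positive,
    # so the normalizer below is well-defined. (An illustrative
    # choice, not necessarily the one this repo uses.)
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))
    q, k = phi(q), phi(k)
    # Associativity is the whole trick:
    #   (phi(Q) phi(K)^T) V  ==  phi(Q) (phi(K)^T V)
    # The right-hand grouping never materializes the n x n matrix.
    kv = k.T @ v                 # (d, d_v), cost O(n * d * d_v)
    z = q @ k.sum(axis=0)        # (n,) per-row normalizer
    return (q @ kv) / z[:, None] # (n, d_v)
```

Because every weight φ(qᵢ)·φ(kⱼ)/zᵢ is positive and each row's weights sum to 1, the output matches an explicitly normalized kernel attention computed the quadratic way.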

742 Stars
70 Forks
742 Watchers
Language: Python
License: MIT
Cost to Build: $2.46M
Market Value: $7.28M

Growth over time

[Chart: stars, forks, and watchers over 7 data points, 2021-08-01 → 2025-03-01]
How to clone linear-attention-transformer

Clone via HTTPS

git clone https://github.com/lucidrains/linear-attention-transformer.git

Clone via SSH

git clone git@github.com:lucidrains/linear-attention-transformer.git

Download ZIP

Download master.zip

Found an issue?

Report bugs or request features on the linear-attention-transformer issue tracker:

Open GitHub Issues