flash-cosine-sim-attention

lucidrains

Implementation of fused cosine similarity attention in the same style as Flash Attention
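The core idea is easy to sketch outside CUDA: l2-normalize queries and keys so their dot products become cosine similarities (bounded in [-1, 1]), apply a fixed scale, then softmax and aggregate values. Below is a minimal NumPy sketch of plain (non-fused) cosine-similarity attention; the scale value, shapes, and function names are illustrative assumptions, not the actual kernel's API or defaults:

```python
import numpy as np

def l2norm(x, eps=1e-12):
    # Normalize rows to unit length so q @ k.T becomes a cosine similarity.
    return x / (np.linalg.norm(x, axis=-1, keepdims=True) + eps)

def cosine_sim_attention(q, k, v, scale=10.0):
    # q: (n, d); k, v: (m, d). `scale` is a hypothetical fixed temperature:
    # because cosine similarities are bounded, a constant scale can replace
    # the usual 1/sqrt(d) scaling of standard attention.
    sim = scale * (l2norm(q) @ l2norm(k).T)   # logits bounded in [-scale, scale]
    sim -= sim.max(axis=-1, keepdims=True)    # numerically stable softmax
    attn = np.exp(sim)
    attn /= attn.sum(axis=-1, keepdims=True)
    return attn @ v                           # (n, d)

rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((4, 8)) for _ in range(3))
out = cosine_sim_attention(q, k, v)
print(out.shape)  # (4, 8)
```

The bounded logits are what make the "fused" single-kernel formulation attractive: with a known maximum, the softmax can be computed without the running-max bookkeeping that Flash Attention needs.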

210 Stars
12 Forks
210 Watchers
Language: CUDA
License: MIT

Growth over time: 3 data points, 2022-11-01 → 2025-03-01 (stars, forks, watchers)


How to clone flash-cosine-sim-attention

Clone via HTTPS

git clone https://github.com/lucidrains/flash-cosine-sim-attention.git

Clone via SSH

git clone git@github.com:lucidrains/flash-cosine-sim-attention.git

Download ZIP

Download a snapshot of the master branch: https://github.com/lucidrains/flash-cosine-sim-attention/archive/refs/heads/master.zip

Found an issue?

Report bugs or request features on the project's issue tracker:

https://github.com/lucidrains/flash-cosine-sim-attention/issues