attention_sinks

tomaarsen

Extend existing LLMs way beyond the original training length with constant memory usage, without retraining
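The core idea behind attention sinks (from the StreamingLLM work this project builds on) is that the KV cache keeps the first few "sink" tokens forever plus a sliding window of recent tokens, so memory stays constant no matter how long generation runs. A minimal, library-free sketch of that eviction policy is below; the names `make_sink_cache`, `sink_size`, and `window_size` are illustrative and not the project's actual API, which wraps Hugging Face model classes.

```python
from collections import deque

def make_sink_cache(sink_size=4, window_size=8):
    """Toy simulation of the attention-sink cache policy:
    keep the first `sink_size` tokens ("attention sinks") forever,
    plus a sliding window of the most recent `window_size` tokens."""
    sinks = []                           # first tokens, never evicted
    window = deque(maxlen=window_size)   # recent tokens, oldest evicted

    def push(token):
        if len(sinks) < sink_size:
            sinks.append(token)
        else:
            window.append(token)
        # Tokens the model attends to at this step; length is bounded
        # by sink_size + window_size regardless of sequence length.
        return sinks + list(window)

    return push

push = make_sink_cache(sink_size=2, window_size=3)
for t in range(8):
    cache = push(t)
# Sinks [0, 1] survive; the window holds only the last 3 tokens.
print(cache)  # → [0, 1, 5, 6, 7]
```

The cache size here never exceeds `sink_size + window_size`, which is what allows generation far beyond the model's training length at constant memory.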

736 Stars
44 Forks
736 Watchers
Language: Python
License: Apache-2.0
100 SrcLog Score
Cost to Build: $393.9K
Market Value: $911.2K

Growth over time

[Chart: stars, forks, and watchers over 2 data points, 2026-04-09 → 2026-04-17]

How to clone attention_sinks

Clone via HTTPS

git clone https://github.com/tomaarsen/attention_sinks.git

Clone via SSH

git clone git@github.com:tomaarsen/attention_sinks.git

Download ZIP

Download master.zip

Found an issue?

Report bugs or request features on the attention_sinks issue tracker:

Open GitHub Issues