vllm-project/vllm
A high-throughput and memory-efficient inference and serving engine for LLMs

41.6k stars · 6.3k forks · 41.6k watchers
Language: Python · License: Apache-2.0
Estimated cost to build: $2.22M · Estimated market value: $11.07M

Growth over time: 1 data point (2025-03-17), tracking stars, forks, and watchers.


How to clone vllm

Clone via HTTPS

git clone https://github.com/vllm-project/vllm.git

Clone via SSH

git clone git@github.com:vllm-project/vllm.git

Download ZIP

Download an archive of the default branch (main):

https://github.com/vllm-project/vllm/archive/refs/heads/main.zip
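The HTTPS and SSH remotes above point at the same repository and differ only in their URL form. As a quick sketch (no git required), the SSH remote can be derived from the HTTPS URL with plain shell parameter expansion:

```shell
# HTTPS clone URL from the section above
https_url="https://github.com/vllm-project/vllm.git"

# Strip the "https://github.com/" prefix to get "owner/repo.git",
# then rebuild it in GitHub's SSH remote form.
path="${https_url#https://github.com/}"
ssh_url="git@github.com:${path}"

echo "${ssh_url}"   # git@github.com:vllm-project/vllm.git
```

The same transformation works for any GitHub repository, which is handy when switching an existing checkout from HTTPS to SSH with `git remote set-url`.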

Found an issue?

Report bugs or request feature enhancements on the vllm issue tracker:

https://github.com/vllm-project/vllm/issues