picollm

Picovoice

On-device LLM Inference Powered by X-Bit Quantization

Stars: 262
Forks: 14
Watchers: 262
Language: Python
License: Apache-2.0
Cost to build (estimate): $4.85M
Market value (estimate): $13.84M

Growth over time

1 data point (2025-08-12); tracked metrics: Stars, Forks, Watchers.

How to clone picollm

Clone via HTTPS

git clone https://github.com/Picovoice/picollm.git

Clone via SSH

git clone git@github.com:Picovoice/picollm.git

Download ZIP

Download master.zip
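Once cloned, inference runs through the picollm Python package (`pip install picollm`). The sketch below is illustrative rather than an official quickstart: the `picollm.create` / `generate` / `release` calls are assumed from the SDK's published examples, and the environment-variable names and skip logic are this example's own. A Picovoice AccessKey and a downloaded model file are required to actually run inference.

```python
# Illustrative sketch of on-device LLM inference with picollm.
# Assumptions: the `picollm` package is installed (pip install picollm)
# and exposes create()/generate()/release() as in the SDK's examples.
# PICOVOICE_ACCESS_KEY and PICOLLM_MODEL_PATH are this example's own names.
import os

def run_demo() -> str:
    access_key = os.environ.get("PICOVOICE_ACCESS_KEY")  # Picovoice Console AccessKey
    model_path = os.environ.get("PICOLLM_MODEL_PATH")    # path to a downloaded model file
    if not (access_key and model_path):
        # Skip gracefully when credentials or model are not configured.
        return "skipped: set PICOVOICE_ACCESS_KEY and PICOLLM_MODEL_PATH"
    import picollm  # imported lazily so the script runs without the package
    pllm = picollm.create(access_key=access_key, model_path=model_path)
    try:
        res = pllm.generate("What does X-bit quantization mean?")
        return res.completion  # generated text
    finally:
        pllm.release()  # free native resources

if __name__ == "__main__":
    print(run_demo())
```

Without the environment variables set, the script prints the "skipped" message instead of attempting inference.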

Found an issue?

Report bugs or request features on the picollm issue tracker:

Open GitHub Issues