OnnxStream

by vitoplantamura

Lightweight inference library for ONNX files, written in C++. It can run Stable Diffusion XL 1.0 on a Raspberry Pi Zero 2 (or in 298 MB of RAM), as well as Mistral 7B on desktops and servers. Supports ARM, x86, WASM, and RISC-V; accelerated by XNNPACK. Python, C#, and JS (WASM) bindings available.

Stars: 2.1k
Forks: 92
Watchers: 2.1k
Language: C++
License: Other
SrcLog Score: 100
Cost to Build: $2.92M
Market Value: $13.03M

Growth over time: stars, forks, and watchers tracked across 2 data points (2025-04-01 → 2026-04-01).

How to clone OnnxStream

Clone via HTTPS

git clone https://github.com/vitoplantamura/OnnxStream.git

Clone via SSH

git clone git@github.com:vitoplantamura/OnnxStream.git
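The HTTPS and SSH remotes point at the same repository, and the SSH form can be derived mechanically from the HTTPS one. A minimal sketch, assuming only GitHub's standard URL layout (no network access, pure string manipulation):

```shell
# The HTTPS clone URL shown above.
https_url="https://github.com/vitoplantamura/OnnxStream.git"

# Strip the "https://github.com/" prefix to get the owner/repo path,
# then prepend GitHub's SSH user/host form.
repo_path="${https_url#https://github.com/}"
ssh_url="git@github.com:${repo_path}"

echo "$ssh_url"   # → git@github.com:vitoplantamura/OnnxStream.git
```

Either URL works with `git clone`; SSH is typically used when pushing with key-based authentication, HTTPS when read-only access is enough.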

Download ZIP

Download master.zip

Found an issue?

Report bugs or request features on the OnnxStream issue tracker:

Open GitHub Issues