yzma

hybridgroup/yzma

Go with your own intelligence - Go applications that directly integrate llama.cpp for local inference using hardware acceleration.

Stars: 373
Forks: 14
Watchers: 373
Language: Go
License: Other
SrcLog Score: 100
Cost to Build: $37.6K
Market Value: $150.3K

Growth over time

1 data point · 2026-04-08 → 2026-04-08 (tracking stars, forks, and watchers)

How to clone yzma

Clone via HTTPS

git clone https://github.com/hybridgroup/yzma.git

Clone via SSH

git clone git@github.com:hybridgroup/yzma.git

Download ZIP

Download master.zip
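If git is not available, the archive can also be fetched directly. A minimal sketch, assuming GitHub's standard branch-archive URL pattern for the master branch (the exact download link is not stated on this page):

```shell
# Assumption: GitHub serves a ZIP of any public branch at
# https://github.com/<owner>/<repo>/archive/refs/heads/<branch>.zip
repo="hybridgroup/yzma"
url="https://github.com/${repo}/archive/refs/heads/master.zip"
echo "$url"

# Network steps, left commented out:
# curl -L -o yzma-master.zip "$url"
# unzip yzma-master.zip    # extracts into a yzma-master/ directory
```

This avoids needing a git installation or SSH keys, at the cost of losing the commit history.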

Found an issue?

Report bugs or request features on the yzma issue tracker:

Open GitHub Issues: https://github.com/hybridgroup/yzma/issues