Llama-2-Open-Source-LLM-CPU-Inference

kennethleungty

Running Llama 2 and other Open-Source LLMs on CPU Inference Locally for Document Q&A
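The description points at the standard document Q&A pattern: retrieve the passages most relevant to a question, then feed them to a locally run LLM. As a rough, hypothetical illustration of the retrieval half only (this is not the repository's actual code, which is not shown on this page), a toy keyword-overlap retriever might look like:

```python
def score(question: str, passage: str) -> int:
    """Count how many question words also appear in the passage (toy relevance score)."""
    q_words = set(question.lower().split())
    p_words = set(passage.lower().split())
    return len(q_words & p_words)

def retrieve(question: str, passages: list[str], k: int = 1) -> list[str]:
    """Return the k passages with the highest overlap score."""
    return sorted(passages, key=lambda p: score(question, p), reverse=True)[:k]

# Hypothetical document snippets for illustration
docs = [
    "Llama 2 is an open-source large language model released by Meta.",
    "CPU inference avoids the need for a dedicated GPU.",
    "Quantized model weights reduce memory use on commodity hardware.",
]
print(retrieve("Why run inference on a CPU?", docs, k=1)[0])
```

Real document Q&A systems replace the keyword overlap with embedding similarity and pass the retrieved passages to the LLM as context; this sketch only shows the shape of the retrieval step.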

976 Stars · 207 Forks · 976 Watchers
Language: Python
License: MIT
SrcLog Score: 100
Cost to Build: $186.1K
Market Value: $478.2K

Growth over time: Stars, Forks, and Watchers tracked across 6 data points, 2025-08-12 to 2026-04-24.

How to clone Llama-2-Open-Source-LLM-CPU-Inference

Clone via HTTPS

git clone https://github.com/kennethleungty/Llama-2-Open-Source-LLM-CPU-Inference.git

Clone via SSH

git clone git@github.com:kennethleungty/Llama-2-Open-Source-LLM-CPU-Inference.git

Download ZIP

Download master.zip

Found an issue?

Report bugs or request features on the Llama-2-Open-Source-LLM-CPU-Inference issue tracker:

Open GitHub Issues