Running Llama 2 and other Open-Source LLMs on CPU Inference Locally for Document Q&A
What is the kennethleungty/Llama-2-Open-Source-LLM-CPU-Inference GitHub project? Its description reads: "Running Llama 2 and other Open-Source LLMs on CPU Inference Locally for Document Q&A", and it is written in Python. Explain what it does, its main use cases, key features, and who would benefit from using it.
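As the description suggests, a document-Q&A pipeline like this one retrieves the passages of a document most relevant to a question and feeds them to a locally running LLM as context. The sketch below illustrates only the retrieval idea in dependency-free Python; the chunking and word-overlap scoring are simplifications I am assuming for illustration (a project like this would typically use embeddings and a vector store instead), and none of the function names come from the repository itself.

```python
# Illustrative sketch (assumed, not the project's code): split a document
# into chunks, score each chunk against the question by simple word
# overlap, and return the best chunks to use as context in an LLM prompt.

def chunk_text(text: str, size: int = 40) -> list[str]:
    """Split text into chunks of roughly `size` words each."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def score(question: str, chunk: str) -> int:
    """Count chunk words that also appear in the question (case-insensitive)."""
    q_words = set(question.lower().split())
    return sum(1 for w in chunk.lower().split() if w in q_words)

def retrieve(question: str, chunks: list[str], k: int = 2) -> list[str]:
    """Return the k highest-scoring chunks for the question."""
    return sorted(chunks, key=lambda c: score(question, c), reverse=True)[:k]

doc = (
    "Llama 2 is a family of open-source large language models. "
    "Quantized GGML models can run on CPU without a GPU. "
    "Document question answering retrieves relevant passages first."
)
chunks = chunk_text(doc, size=10)
question = "Can Llama 2 run on CPU?"
context = retrieve(question, chunks, k=1)
# The retrieved chunk would then be placed into the prompt sent to the
# local model, e.g. f"Context: {context[0]}\nQuestion: {question}"
```

In the real project, the scoring step would be replaced by semantic similarity search over embedded chunks, but the retrieve-then-prompt shape of the pipeline is the same.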
Clone via HTTPS
Clone via SSH
Download ZIP (master.zip)

Report bugs or request features on the Llama-2-Open-Source-LLM-CPU-Inference issue tracker: Open GitHub Issues