local_llama

jlonge4

This repo showcases how to run a model locally and offline, free of OpenAI dependencies.

262 stars · 44 forks · 262 watchers
Language: Python
License: Apache-2.0
Estimated cost to build: $2.6K
Estimated market value: $6.5K

Growth over time: 1 data point (2025-03-04), tracking stars, forks, and watchers.

How to clone local_llama

Clone via HTTPS

git clone https://github.com/jlonge4/local_llama.git

Clone via SSH

git clone git@github.com:jlonge4/local_llama.git

Download ZIP

Download master.zip
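As an alternative to the git commands above, the ZIP snapshot can be fetched directly from GitHub's archive endpoint. A minimal sketch, assuming the default branch is named "master" (taken from the "Download master.zip" link above; many newer repositories use "main" instead):

```shell
# Derive the ZIP snapshot URL from the repository slug.
REPO="jlonge4/local_llama"
BRANCH="master"
ZIP_URL="https://github.com/${REPO}/archive/refs/heads/${BRANCH}.zip"
echo "$ZIP_URL"

# Fetch and unpack (requires network access); GitHub names the
# extracted directory <repo>-<branch>:
#   curl -L -o local_llama.zip "$ZIP_URL"
#   unzip local_llama.zip && cd "local_llama-${BRANCH}"
```

Unlike a clone, the ZIP snapshot carries no git history, so it cannot be updated with `git pull` later.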

Found an issue?

Report bugs or request features on the local_llama issue tracker:

Open GitHub Issues