ollama-self-hosted


AiratTop

A simple Docker Compose setup to self-host Ollama and Open WebUI. Run your own private LLMs with GPU acceleration (NVIDIA/AMD) and complete data privacy. Easy to integrate with other services like n8n.
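To give a sense of what such a stack looks like, here is a minimal compose sketch. This is a hypothetical illustration, not the repo's actual file: the service names, image tags, port mapping, and volume name are assumptions, and the GPU reservation shown is the NVIDIA variant (an AMD setup would typically use the `ollama/ollama:rocm` image and pass through `/dev/kfd` and `/dev/dri` instead).

```yaml
# Hypothetical docker-compose.yml sketch -- names, ports, and volumes
# are assumptions, not copied from the AiratTop/ollama-self-hosted repo.
services:
  ollama:
    image: ollama/ollama:latest
    volumes:
      - ollama_data:/root/.ollama        # persist downloaded models
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia             # requires NVIDIA Container Toolkit
              count: all
              capabilities: [gpu]

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"                      # UI served on http://localhost:3000
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    depends_on:
      - ollama

volumes:
  ollama_data:
```

With a file like this in place, `docker compose up -d` brings both services up, and models can be pulled either from the Open WebUI interface or with `docker compose exec ollama ollama pull <model>`.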

2 Stars
0 Forks
2 Watchers
Language: Shell
License: MIT
SrcLog Score: 53.9
Cost to Build: $645
Market Value: $500

Growth over time (stars, forks, watchers): 3 data points, 2026-04-10 → 2026-04-25


How to clone ollama-self-hosted

Clone via HTTPS

git clone https://github.com/AiratTop/ollama-self-hosted.git

Clone via SSH

git clone git@github.com:AiratTop/ollama-self-hosted.git

Download ZIP

Download master.zip

Found an issue?

Report bugs or request features on the ollama-self-hosted issue tracker:

Open GitHub Issues