Go manage your Ollama models
Agentic coding rules, templates, and more
An MCP server that provides LLMs with the latest stable package versions when coding
An MCP server that provides LLMs with efficient access to package documentation across multiple programming languages
My LLM Templates (Ollama Modelfiles & Tabby Templates + Presets)
Continuous integration for the Linux kernel, built within Docker
A lightweight frontend for self-hosted Firecrawl instances