
inference

Replace OpenAI GPT with another LLM in your app by changing a single line of code. Xinference gives you the freedom to use any LLM you need. With Xinference, you're empowered to run inference with any open-source language models, speech recognition models, and multimodal models, whether in the cloud, on-premises, or even on your laptop.
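For example, because Xinference exposes an OpenAI-compatible API, an app that already uses the official openai Python client can be pointed at a locally running Xinference server by changing only the base URL. The snippet below is a minimal sketch, not the project's official example: it assumes a Xinference server is already running on localhost:9997 with a chat model launched, and the model name is a placeholder.

from openai import OpenAI

# Point the existing OpenAI client at the local Xinference endpoint
# instead of api.openai.com -- this is the "single line" change.
client = OpenAI(base_url="http://localhost:9997/v1", api_key="not-used")

response = client.chat.completions.create(
    model="my-launched-model",  # placeholder: the name/UID of the model you launched in Xinference
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)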

How to download and setup inference

Open a terminal and run:
git clone https://github.com/xorbitsai/inference.git
git clone creates a local copy of the inference repository. You pass git clone a repository URL; it supports several network protocols and corresponding URL formats (the HTTPS form is shown above, the SSH form below).

You can also download inference as a zip archive: https://github.com/xorbitsai/inference/archive/master.zip
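As a sketch, the archive can also be fetched and unpacked from the command line, assuming curl and unzip are available:

# download the archive, following GitHub's redirect
curl -L -o inference.zip https://github.com/xorbitsai/inference/archive/master.zip
# unpack it (creates a directory named inference-master)
unzip inference.zip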

Or clone inference over SSH:
git clone git@github.com:xorbitsai/inference.git
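After downloading, a typical setup is to install the package from source and start a local server. The commands below are a minimal sketch assuming a working Python environment; the extras you need (for example for specific GPU backends) may differ, so check the project documentation for the recommended install options.

# install inference from the cloned source (editable install)
cd inference
pip install -e .
# start a local Xinference server; the host and port shown here are assumptions, adjust as needed
xinference-local --host 0.0.0.0 --port 9997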

If you run into problems with inference

You can open an issue on the inference issue tracker: https://github.com/xorbitsai/inference/issues