inference
Replace OpenAI GPT with another LLM in your app by changing a single line of code. Xinference gives you the freedom to use any LLM you need. With Xinference, you're empowered to run inference with any open-source language models, speech recognition models, and multimodal models, whether in the cloud, on-premises, or even on your laptop.
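For example, if your app already uses the official OpenAI Python client, the change can be as small as pointing the client at a running Xinference endpoint. The snippet below is a minimal sketch: the endpoint http://localhost:9997/v1 and the model name are assumptions that depend on how you start Xinference and which model you launch, so adjust them to your setup.

# Minimal sketch: redirect an OpenAI-based app to a locally served model.
# Assumes a Xinference server is running with its OpenAI-compatible API
# at http://localhost:9997/v1 and that a chat model has already been launched.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:9997/v1",  # point the client at Xinference instead of api.openai.com
    api_key="not-used-by-local-server",   # placeholder; a local server typically does not check it
)

response = client.chat.completions.create(
    model="my-launched-model",            # hypothetical name; use the model you launched in Xinference
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)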
How to download and set up inference
Open a terminal and run the following command:
git clone https://github.com/xorbitsai/inference.git
git clone creates a local copy (clone) of the inference repository.
You pass git clone a repository URL; it supports several network protocols and corresponding URL formats.
You can also download inference as a zip archive: https://github.com/xorbitsai/inference/archive/master.zip
Or clone inference over SSH:
git clone git@github.com:xorbitsai/inference.git
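After cloning, you can set inference up from the source tree. The commands below are a sketch assuming a recent Python with pip; the editable install and the xinference-local startup command follow the project's documented usage, but exact commands, extras, and flags may differ between versions, so check the repository's README.

cd inference
pip install -e .                              # editable install from the cloned source
xinference-local --host 0.0.0.0 --port 9997   # start a local server (host/port flags assumed from the docs)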
If you have problems with inference,
you may open an issue on the inference issue tracker: https://github.com/xorbitsai/inference/issues
Similar to inference repositories
Here you can see inference alternatives and analogs:
tensorflow, keras, scikit-learn, TensorFlow-Examples, pytorch, face_recognition, CNTK, data-science-ipython-notebooks, Qix, handong1587.github.io, telegram-list, netdata, deeplearning4j, mlcourse.ai, stats, Winds, machine-learning-curriculum, caffe, tesseract, machine-learning-for-software-engineers, awesome-deep-learning-papers, incubator-mxnet, lectures, cs-video-courses, julia, Screenshot-to-code, spaCy, cheatsheets-ai, awesome-deep-learning, python-machine-learning-book