RWKV-LM
RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (i.e., training is parallelizable), so it combines the best of RNNs and transformers: great performance, fast inference, low VRAM usage, fast training, "infinite" context length, and free sentence embeddings.
How to download and setup RWKV-LM
Open a terminal and run:
git clone https://github.com/BlinkDL/RWKV-LM.git
git clone creates a local copy of the RWKV-LM repository.
You pass git clone a repository URL; it supports several network protocols and their corresponding URL formats.
Alternatively, you can download RWKV-LM as a zip file: https://github.com/BlinkDL/RWKV-LM/archive/master.zip
Or simply clone RWKV-LM over SSH:
git clone git@github.com:BlinkDL/RWKV-LM.git
If you have problems with RWKV-LM
You can open an issue on the RWKV-LM issue tracker: https://github.com/BlinkDL/RWKV-LM/issues
Similar to RWKV-LM repositories
Here you can browse RWKV-LM alternatives and analogs:
tensorflow, keras, TensorFlow-Examples, pytorch, CNTK, data-science-ipython-notebooks, Qix, handong1587.github.io, telegram-list, machine-learning-curriculum, caffe, machine-learning-for-software-engineers, awesome-deep-learning-papers, incubator-mxnet, lectures, cs-video-courses, Screenshot-to-code, spaCy, cheatsheets-ai, awesome-deep-learning, handson-ml, ML-From-Scratch, dive-into-machine-learning, tfjs, tflearn, awesome-datascience, openpose, tfjs-core, Machine-Learning-Tutorials, EffectiveTensorflow