TurboTransformers
A fast and user-friendly runtime for transformer inference (BERT, ALBERT, GPT-2, decoders, etc.) on CPU and GPU.
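Once installed, inference is driven from Python by converting a standard PyTorch/Hugging Face model into a TurboTransformers model. Below is a minimal sketch of CPU BERT inference; it assumes the turbo_transformers Python package and its BertModel.from_torch conversion helper as shown in the project's examples, so check the repository's README for the exact API of your version.

import torch
import transformers
import turbo_transformers

# Load a stock Hugging Face BERT model, then convert it into a
# TurboTransformers model (from_torch is assumed here, per the project's examples).
torch_model = transformers.BertModel.from_pretrained("bert-base-uncased")
torch_model.eval()
tt_model = turbo_transformers.BertModel.from_torch(torch_model)

# Run inference on a dummy batch of token ids; the outputs are expected to
# mirror the original torch BertModel (sequence output and pooled output).
input_ids = torch.randint(0, torch_model.config.vocab_size, (1, 32), dtype=torch.long)
with torch.no_grad():
    outputs = tt_model(input_ids)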
How to download and set up TurboTransformers
Open a terminal and run the following command:
git clone https://github.com/Tencent/TurboTransformers.git
git clone creates a local copy of the TurboTransformers repository. You pass git clone a repository URL; it supports several network protocols and corresponding URL formats.
Alternatively, you can download TurboTransformers as a zip file from https://github.com/Tencent/TurboTransformers/archive/master.zip (a Python sketch for fetching this archive follows the SSH option below).
Or clone TurboTransformers over SSH:
git@github.com:Tencent/TurboTransformers.git
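If git is not available, the master.zip archive linked above can also be fetched and unpacked programmatically. A minimal Python sketch using only the standard library (the URL is the same one given above; the extracted folder name is whatever GitHub puts in the archive, typically TurboTransformers-master):

import io
import urllib.request
import zipfile

# Download the master branch of TurboTransformers as a zip archive and
# extract it into the current directory.
url = "https://github.com/Tencent/TurboTransformers/archive/master.zip"
with urllib.request.urlopen(url) as response:
    zipfile.ZipFile(io.BytesIO(response.read())).extractall(".")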
If you run into problems with TurboTransformers, you can open an issue on the project's issue tracker: https://github.com/Tencent/TurboTransformers/issues
Similar to TurboTransformers repositories
Here you can see TurboTransformers alternatives and analogs:
lectures spaCy HanLP gensim tensorflow_cookbook tensorflow-nlp Awesome-pytorch-list spacy-models TagUI Repo-2017 stanford-tensorflow-tutorials awesome-nlp franc nlp_tasks nltk TextBlob CoreNLP allennlp mycroft-core practical-pytorch prose ltp libpostal sling DeepNLP-models-Pytorch attention-is-all-you-need-pytorch kaggle-CrowdFlower hubot-natural chat KGQA-Based-On-medicine