native-sparse-attention-triton
An efficient Triton implementation of Native Sparse Attention.
How to download and set up native-sparse-attention-triton
Open a terminal and run the following command:
git clone https://github.com/XunhaoLai/native-sparse-attention-triton.git
git clone creates a local copy (clone) of the native-sparse-attention-triton repository. You pass git clone a repository URL; it supports several network protocols and corresponding URL formats.
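As a sketch of the two most common URL formats, the snippet below defines both and shows a clone into a custom directory; the directory name nsa-triton is an arbitrary choice for illustration, not something the project prescribes:

```shell
# HTTPS URL format: works without any key setup
HTTPS_URL="https://github.com/XunhaoLai/native-sparse-attention-triton.git"

# SSH URL format: requires an SSH key registered with your GitHub account
SSH_URL="git@github.com:XunhaoLai/native-sparse-attention-triton.git"

# Clone over HTTPS into a custom directory (uncomment to actually run):
# git clone "$HTTPS_URL" nsa-triton

# Derive the repository name from the URL (strip the path and the .git suffix):
repo="${HTTPS_URL##*/}"   # last path component: native-sparse-attention-triton.git
repo="${repo%.git}"       # drop the .git suffix
echo "$repo"
```

Without a target directory argument, git clone uses the repository name itself, which is what the derivation above computes.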
Alternatively, you can download native-sparse-attention-triton as a zip archive: https://github.com/XunhaoLai/native-sparse-attention-triton/archive/master.zip
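If you prefer the zip archive, here is a minimal fetch-and-extract sketch, assuming curl and unzip are available; GitHub names such archives <repo>-<branch>, so this one extracts to a native-sparse-attention-triton-master/ directory:

```shell
ZIP_URL="https://github.com/XunhaoLai/native-sparse-attention-triton/archive/master.zip"

# Download and extract (uncomment to actually run; needs network access):
# curl -L -o native-sparse-attention-triton.zip "$ZIP_URL"
# unzip native-sparse-attention-triton.zip

# Compute the directory name the archive extracts to (<repo>-<branch>):
rest="${ZIP_URL#https://github.com/XunhaoLai/}"   # repo/archive/branch.zip
repo="${rest%%/*}"                                # part before the first slash
branch="${ZIP_URL##*/}"                           # last path component: master.zip
branch="${branch%.zip}"                           # drop the .zip suffix
echo "${repo}-${branch}"
```

Note that a zip download gives you a plain source tree without the .git directory, so you cannot pull later updates; for that, use git clone instead.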
Or simply clone native-sparse-attention-triton over SSH:
git@github.com:XunhaoLai/native-sparse-attention-triton.git
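SSH clones require a key registered with your GitHub account; you can verify access before cloning. A sketch, where ssh -T is the standard git-over-SSH connectivity check:

```shell
# Verify that GitHub accepts your SSH key (prints a greeting on success):
# ssh -T git@github.com

# Then clone over SSH (uncomment to actually run):
# git clone git@github.com:XunhaoLai/native-sparse-attention-triton.git

# The SSH URL has the scp-like form user@host:path; split it for illustration:
SSH_URL="git@github.com:XunhaoLai/native-sparse-attention-triton.git"
host="${SSH_URL#*@}"       # drop the user@ prefix
host="${host%%:*}"         # keep the part before the colon
repo_path="${SSH_URL#*:}"  # keep the part after the colon
echo "$host $repo_path"
```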
If you run into problems with native-sparse-attention-triton, you can open an issue on the project's issue tracker: https://github.com/XunhaoLai/native-sparse-attention-triton/issues

Repositories similar to native-sparse-attention-triton
Here are some native-sparse-attention-triton alternatives and analogs:
natural-language-processing, lectures, spaCy, HanLP, gensim, MatchZoo, tensorflow-nlp, Awesome-pytorch-list, spacy-models, Repo-2017, stanford-tensorflow-tutorials, awesome-nlp, nlp_tasks, nltk, pattern, TextBlob, CoreNLP, allennlp, mycroft-core, practical-pytorch, textract, languagetool, MITIE, machine_learning_examples, prose, arXivTimes, ltp, libpostal, sling, DeepNLP-models-Pytorch