memory-efficient-attention-pytorch
Implementation of memory-efficient multi-head attention, as proposed in the paper "Self-attention Does Not Need O(n²) Memory".
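The key idea from the paper is that softmax attention can be computed over the key/value sequence in chunks, carrying a running maximum and normalizer, so the full n-by-n score matrix never has to be materialized at once. Below is a minimal single-head sketch of that idea in plain PyTorch; the function name chunked_attention and the chunk size are illustrative, not part of this library's API, and the library additionally chunks queries and supports multiple heads:

import torch

def chunked_attention(q, k, v, k_chunk_size = 1024):
    # q, k, v: (batch, seq_len, dim); keys/values are processed in chunks,
    # so peak memory scales with seq_len * k_chunk_size rather than seq_len².
    q = q * q.shape[-1] ** -0.5

    out = torch.zeros_like(q)                          # running unnormalized output
    row_max = q.new_full(q.shape[:-1], float('-inf'))  # running max score per query
    row_sum = q.new_zeros(q.shape[:-1])                # running softmax denominator

    for k_chunk, v_chunk in zip(k.split(k_chunk_size, dim = 1), v.split(k_chunk_size, dim = 1)):
        scores = torch.einsum('b i d, b j d -> b i j', q, k_chunk)

        # numerically stable streaming softmax: rescale the previous accumulators
        # whenever a new, larger row maximum is seen
        new_max = torch.maximum(row_max, scores.amax(dim = -1))
        correction = torch.exp(row_max - new_max)      # equals 0 on the first chunk
        exp_scores = torch.exp(scores - new_max.unsqueeze(-1))

        out = out * correction.unsqueeze(-1) + torch.einsum('b i j, b j d -> b i d', exp_scores, v_chunk)
        row_sum = row_sum * correction + exp_scores.sum(dim = -1)
        row_max = new_max

    return out / row_sum.unsqueeze(-1)

# agrees with standard attention up to floating-point error
q = k = v = torch.randn(2, 4096, 64)
ref = torch.softmax((q @ k.transpose(-2, -1)) * 64 ** -0.5, dim = -1) @ v
assert torch.allclose(chunked_attention(q, k, v, k_chunk_size = 512), ref, atol = 1e-4)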
How to download and set up memory-efficient-attention-pytorch
Open a terminal and run the following command:
git clone https://github.com/lucidrains/memory-efficient-attention-pytorch.git
git clone creates a local copy of the memory-efficient-attention-pytorch repository. You pass git clone a repository URL; it supports several network protocols and their corresponding URL formats.
Alternatively, you can download memory-efficient-attention-pytorch as a zip file: https://github.com/lucidrains/memory-efficient-attention-pytorch/archive/master.zip
Or clone memory-efficient-attention-pytorch over SSH:
git clone [email protected]:lucidrains/memory-efficient-attention-pytorch.git
If you run into problems with memory-efficient-attention-pytorch, you can open an issue on the project's GitHub issue tracker: https://github.com/lucidrains/memory-efficient-attention-pytorch/issues
Similar to memory-efficient-attention-pytorch repositories
Here are some alternatives and analogs to memory-efficient-attention-pytorch:
deeplearning4j, machine-learning-for-software-engineers, incubator-mxnet, spaCy, cheatsheets-ai, gun, php-ml, TensorLayer, awesome-artificial-intelligence, AlgoWiki, papers-I-read, EmojiIntelligence, PyGame-Learning-Environment, deep-trading-agent, caffe2, AirSim, pipeline, diffbot-php-client, mycroft-core, iOS_ML, warriorjs, nd4j, optaplanner, high-school-guide-to-machine-learning, Dragonfire, auto_ml, gophernotes, deeplearning4j-examples, DeepPavlov, polyaxon