
memory-efficient-attention-pytorch

Implementation of memory-efficient multi-head attention, as proposed in the paper "Self-attention Does Not Need O(n²) Memory".
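The core idea of the paper is to process keys and values in chunks while keeping a running softmax maximum and normalizer, so the full n×n score matrix is never materialized. Below is a minimal pure-Python sketch of that idea for a single head (no √d scaling; the function names and chunk size are illustrative, not the repository's API):

```python
import math

def naive_attention(q, k, v):
    """Standard attention: holds all scores for a query at once."""
    out = []
    for qi in q:
        scores = [sum(a * b for a, b in zip(qi, kj)) for kj in k]
        m = max(scores)  # subtract the max for numerical stability
        weights = [math.exp(s - m) for s in scores]
        z = sum(weights)
        out.append([sum(w * vj[d] for w, vj in zip(weights, v)) / z
                    for d in range(len(v[0]))])
    return out

def chunked_attention(q, k, v, chunk_size=2):
    """Memory-efficient attention: streams over key/value chunks,
    keeping only a running max, normalizer, and value accumulator."""
    dim = len(v[0])
    out = []
    for qi in q:
        m = float("-inf")      # running max of scores seen so far
        z = 0.0                # running softmax normalizer
        acc = [0.0] * dim      # running sum of exp(score) * value
        for start in range(0, len(k), chunk_size):
            kc = k[start:start + chunk_size]
            vc = v[start:start + chunk_size]
            scores = [sum(a * b for a, b in zip(qi, kj)) for kj in kc]
            m_new = max(m, max(scores))
            # rescale earlier partial sums to the new running max
            scale = math.exp(m - m_new) if m != float("-inf") else 0.0
            z *= scale
            acc = [a * scale for a in acc]
            for s, vj in zip(scores, vc):
                w = math.exp(s - m_new)
                z += w
                acc = [a + w * x for a, x in zip(acc, vj)]
            m = m_new
        out.append([a / z for a in acc])
    return out
```

Both functions return the same result up to floating-point error; the difference is that the chunked version only ever holds `chunk_size` scores in memory per query. The repository applies the same rescaling trick to bucketed PyTorch tensors.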

How to download and set up memory-efficient-attention-pytorch

Open a terminal and run:
git clone https://github.com/lucidrains/memory-efficient-attention-pytorch.git
git clone creates a local copy of the memory-efficient-attention-pytorch repository. It takes a repository URL and supports several network protocols and their corresponding URL formats.

Alternatively, download memory-efficient-attention-pytorch as a ZIP archive: https://github.com/lucidrains/memory-efficient-attention-pytorch/archive/master.zip

Or clone memory-efficient-attention-pytorch over SSH:
git@github.com:lucidrains/memory-efficient-attention-pytorch.git

If you run into problems with memory-efficient-attention-pytorch

You can open an issue on the repository's issue tracker: https://github.com/lucidrains/memory-efficient-attention-pytorch/issues