invariant-point-attention

Implementation of Invariant Point Attention, used for coordinate refinement in the structure module of AlphaFold2, as a standalone PyTorch module.

How to download and set up invariant-point-attention

Open a terminal and run:
git clone https://github.com/lucidrains/invariant-point-attention.git
git clone creates a local copy of the invariant-point-attention repository. It takes a repository URL and supports several network protocols and their corresponding URL formats.

Alternatively, you can download the repository as a zip file: https://github.com/lucidrains/invariant-point-attention/archive/master.zip

Or clone invariant-point-attention over SSH:
git clone git@github.com:lucidrains/invariant-point-attention.git

If you run into problems with invariant-point-attention

You can open an issue on the project's issue tracker: https://github.com/lucidrains/invariant-point-attention/issues