lucidrains/invariant-point-attention

Implementation of Invariant Point Attention (IPA), used for coordinate refinement in the structure module of AlphaFold2, as a standalone PyTorch module.
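
IPA attends jointly over per-residue scalar features and 3D points carried in each residue's local rigid frame, so its output is invariant to global rotations and translations of the structure. Below is a minimal usage sketch; the class name, constructor arguments, and forward signature are illustrative assumptions about the package's API rather than confirmed facts, so consult the repository README for the authoritative interface.

import torch
from invariant_point_attention import InvariantPointAttention  # assumed public class name

# Hypothetical hyperparameters; the real constructor signature may differ.
attn = InvariantPointAttention(
    dim = 64,              # single-representation (per-residue) feature dimension
    heads = 8,             # number of attention heads
    scalar_key_dim = 16,   # per-head scalar query/key dimension
    scalar_value_dim = 16, # per-head scalar value dimension
    point_key_dim = 4,     # per-head count of 3D point queries/keys
    point_value_dim = 4,   # per-head count of 3D point values
)

batch, seq, dim = 1, 256, 64
single_repr   = torch.randn(batch, seq, dim)       # per-residue features
pairwise_repr = torch.randn(batch, seq, seq, dim)  # pair features used as an attention bias
mask          = torch.ones(batch, seq).bool()      # valid-residue mask

# Per-residue rigid frames: identity rotations and zero translations.
rotations    = torch.eye(3).repeat(batch, seq, 1, 1)  # (batch, seq, 3, 3)
translations = torch.zeros(batch, seq, 3)             # (batch, seq, 3)

out = attn(
    single_repr,
    pairwise_repr = pairwise_repr,
    rotations = rotations,
    translations = translations,
    mask = mask,
)  # expected shape: (batch, seq, dim)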

Stars: 116
Forks: 7
Watchers: 116
Language: Python
License: MIT
Cost to Build: $12.4K
Market Value: $19.9K

Growth over time

6 data points from 2021-08-01 to 2023-03-01, tracking stars, forks, and watchers.

Ask AI about invariant-point-attention

What is the lucidrains/invariant-point-attention GitHub project? Description: "Implementation of Invariant Point Attention (IPA), used for coordinate refinement in the structure module of AlphaFold2, as a standalone PyTorch module". Written in Python. Explain what it does, its main use cases, key features, and who would benefit from using it.

How to clone invariant-point-attention

Clone via HTTPS

git clone https://github.com/lucidrains/invariant-point-attention.git

Clone via SSH

git clone git@github.com:lucidrains/invariant-point-attention.git

Download ZIP

Download master.zip
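
Cloning is only needed if you want the source. Assuming the module is also published on PyPI under the same name (an assumption this page does not confirm), it can be installed directly:

pip install invariant-point-attention  # assumes the PyPI package name matches the repo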

Found an issue?

Report bugs or request features on the invariant-point-attention issue tracker:

Open GitHub Issues: https://github.com/lucidrains/invariant-point-attention/issues