lucidrains/molecule-attention-transformer

PyTorch reimplementation of the Molecule Attention Transformer, which uses a transformer to tackle the graph-like structure of molecules.

Stars: 36
Forks: 4
Watchers: 36
Language: Python
License: MIT
Cost to Build: $4.1K
Market Value: $5.1K

Growth over time

1 data point (stars, forks, and watchers), snapshot 2021-11-21.

How to clone molecule-attention-transformer

Clone via HTTPS

git clone https://github.com/lucidrains/molecule-attention-transformer.git

Clone via SSH

git clone git@github.com:lucidrains/molecule-attention-transformer.git

Download ZIP

Download master.zip

Found an issue?

Report bugs or request features on the molecule-attention-transformer issue tracker:

Open GitHub Issues