molecule-attention-transformer

PyTorch reimplementation of Molecule Attention Transformer, which uses a transformer architecture to tackle the graph-like structure of molecules.

How to download and set up molecule-attention-transformer

Open a terminal and run the following command:
git clone https://github.com/lucidrains/molecule-attention-transformer.git
git clone creates a local copy of the molecule-attention-transformer repository. You pass git clone a repository URL; it supports several network protocols and corresponding URL formats. A full clone-and-setup sequence is sketched below.
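
For example, a minimal clone-and-setup sequence might look like the following. The editable pip install step is an assumption: it presumes the repository ships a standard setup.py or pyproject.toml, so check the project README for the recommended install command.

# clone over HTTPS and move into the project directory
git clone https://github.com/lucidrains/molecule-attention-transformer.git
cd molecule-attention-transformer

# install the package and its dependencies into the active Python environment
# (assumes a standard Python package definition is present in the repo)
pip install -e .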

Alternatively, you can download a zip archive of molecule-attention-transformer: https://github.com/lucidrains/molecule-attention-transformer/archive/master.zip
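
As a sketch, fetching and unpacking that archive from the command line could look like this (it assumes curl and unzip are available on your system):

# download the master-branch archive
curl -L -o molecule-attention-transformer.zip https://github.com/lucidrains/molecule-attention-transformer/archive/master.zip

# extract it; the resulting folder is typically named molecule-attention-transformer-master
unzip molecule-attention-transformer.zip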

Or simply clone molecule-attention-transformer with SSH
git@github.com:lucidrains/molecule-attention-transformer.git
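
The full command would be as follows (this assumes an SSH key is already registered with your GitHub account):

# clone over SSH instead of HTTPS
git clone git@github.com:lucidrains/molecule-attention-transformer.git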

If you have problems with molecule-attention-transformer

You can open an issue on the molecule-attention-transformer issue tracker here: https://github.com/lucidrains/molecule-attention-transformer/issues