
transformer-in-transformer

Implementation of Transformer in Transformer, pixel-level attention paired with patch-level attention for image classification, in PyTorch.

How to download and set up transformer-in-transformer

Open a terminal and run:
git clone https://github.com/lucidrains/transformer-in-transformer.git
git clone creates a local copy of the transformer-in-transformer repository. You pass git clone a repository URL; it supports several network protocols and corresponding URL formats, including HTTPS and SSH.

Alternatively, you can download transformer-in-transformer as a zip file: https://github.com/lucidrains/transformer-in-transformer/archive/master.zip
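
If you prefer to fetch and unpack the archive programmatically, here is a minimal sketch using only the Python standard library (the local file name is arbitrary):

import urllib.request
import zipfile

# Download the master branch of the repository as a zip archive
url = "https://github.com/lucidrains/transformer-in-transformer/archive/master.zip"
urllib.request.urlretrieve(url, "transformer-in-transformer.zip")

# Extract the archive into the current directory
with zipfile.ZipFile("transformer-in-transformer.zip") as archive:
    archive.extractall()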

Or simply clone transformer-in-transformer over SSH:
git clone git@github.com:lucidrains/transformer-in-transformer.git
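
Once the repository is cloned (or the package is installed), it can be used along these lines. This is a minimal sketch: the TNT class and its keyword arguments are assumptions based on the repository README, so check the source for the authoritative API.

import torch
from transformer_in_transformer import TNT

# Build the model; hyperparameter names below are assumptions, see the repo
tnt = TNT(
    image_size = 256,    # input image height/width
    patch_dim = 512,     # dimension of the patch (outer) tokens
    pixel_dim = 24,      # dimension of the pixel (inner) tokens
    patch_size = 16,     # side length of each patch
    pixel_size = 4,      # side length of each pixel grouping within a patch
    depth = 6,           # number of TNT blocks
    num_classes = 1000   # size of the classification head
)

img = torch.randn(1, 3, 256, 256)   # dummy batch: one 256x256 RGB image
logits = tnt(img)                   # expected shape: (1, 1000)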

If you run into problems with transformer-in-transformer

You can open an issue on the project's GitHub issue tracker: https://github.com/lucidrains/transformer-in-transformer/issues