Fast-Transformer
A TensorFlow implementation of the Transformer variant proposed in the paper Fastformer: Additive Attention Can Be All You Need
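To give a feel for the idea behind the paper, here is a simplified NumPy sketch of Fastformer-style additive attention: instead of pairwise query-key scores, each step summarizes the whole sequence into a single global vector, keeping the cost linear in sequence length. This is an illustrative sketch only, not the repository's actual TensorFlow code; the function name and the scoring vectors `w_q`/`w_k` are assumptions for the example.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array.
    e = np.exp(x - x.max())
    return e / e.sum()

def fastformer_additive_attention(Q, K, V, w_q, w_k):
    """Illustrative single-head additive attention (not the repo's API).

    Q, K, V: (n, d) projected query/key/value matrices.
    w_q, w_k: (d,) learned scoring vectors (hypothetical names).
    """
    d = Q.shape[1]
    # Summarize all queries into one global query vector -- O(n), not O(n^2).
    alpha = softmax(Q @ w_q / np.sqrt(d))   # (n,) attention weights
    q_global = alpha @ Q                    # (d,) global query
    # Mix the global query into each key element-wise, then summarize again.
    P = q_global * K                        # (n, d)
    beta = softmax(P @ w_k / np.sqrt(d))    # (n,) attention weights
    k_global = beta @ P                     # (d,) global key
    # Modulate the values with the global key; a final linear layer would
    # normally follow in the full model.
    return k_global * V                     # (n, d)
```

The full Fastformer adds multi-head projections, an output transformation, and residual connections; see the paper and the repository for the complete architecture.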
How to download and set up Fast-Transformer
Open a terminal and run:
git clone https://github.com/Rishit-dagli/Fast-Transformer.git
git clone creates a local copy of the Fast-Transformer repository.
You pass git clone a repository URL; it supports several network protocols and their corresponding URL formats.
Alternatively, you can download Fast-Transformer as a zip file: https://github.com/Rishit-dagli/Fast-Transformer/archive/master.zip
Or clone Fast-Transformer over SSH:
git@github.com:Rishit-dagli/Fast-Transformer.git
If you run into problems with Fast-Transformer
You can open an issue on the Fast-Transformer issue tracker: https://github.com/Rishit-dagli/Fast-Transformer/issues
Repositories similar to Fast-Transformer
Here are some Fast-Transformer alternatives and analogs:
data-science-ipython-notebooks, deeplearning4j, machine-learning-for-software-engineers, incubator-mxnet, Screenshot-to-code, spaCy, cheatsheets-ai, gun, php-ml, TensorLayer, awesome-artificial-intelligence, onnx, AlgoWiki, papers-I-read, sketch-code, are-you-fake-news, EmojiIntelligence, tqdm, dspp-keras, deepo, faceai, Repo-2017, lambda-packs, PyGame-Learning-Environment, deep-trading-agent, caffe2, AirSim, Mask_RCNN, keras-js, horovod