An exploration of using transformers to predict binding between DNA and transcription factors
Implementation of Memformer, a Memory-augmented Transformer, in Pytorch
Implementation of N-Grammer, augmenting Transformers with latent n-grams, in Pytorch
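The core idea behind n-gram augmentation can be sketched without any framework: hash each pair of consecutive token ids into a fixed number of bigram buckets, look up a learned bigram embedding, and add it to the unigram embedding. This is a minimal illustration in pure Python; the table sizes, the hash function, and the names (`bigram_id`, `embed`) are all hypothetical, and the actual N-Grammer paper clusters latent representations rather than hashing raw token ids.

```python
import random

VOCAB, BIGRAM_BUCKETS, DIM = 100, 257, 4  # illustrative sizes

random.seed(0)
uni_emb = [[random.random() for _ in range(DIM)] for _ in range(VOCAB)]
bi_emb = [[random.random() for _ in range(DIM)] for _ in range(BIGRAM_BUCKETS)]

def bigram_id(prev_id, cur_id):
    # simple multiplicative hash of a token pair into a fixed bucket range
    return (prev_id * 31 + cur_id) % BIGRAM_BUCKETS

def embed(tokens):
    # unigram embedding, plus a hashed bigram embedding from position 1 onward
    out = []
    for i, t in enumerate(tokens):
        e = list(uni_emb[t])
        if i > 0:
            b = bi_emb[bigram_id(tokens[i - 1], t)]
            e = [x + y for x, y in zip(e, b)]
        out.append(e)
    return out
```

The bigram table is a fixed size regardless of vocabulary, so collisions are accepted in exchange for a bounded parameter count.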
A GPT, made only of MLPs, in Jax
Implementation of OmniNet, Omnidirectional Representations from Transformers, in Pytorch
Implementation of Multistream Transformers in Pytorch
Axial Positional Embedding for Pytorch
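The parameter saving behind axial positional embeddings can be shown with plain Python: a flat position in a sequence of length H * W is decomposed into a (row, column) pair, and its embedding is the sum of one row vector and one column vector, so only H + W vectors are stored instead of H * W. The shapes and names below are illustrative, not the library's API.

```python
import random

H, W, DIM = 4, 8, 3  # hypothetical axial shape: sequence length H * W

random.seed(0)
row_emb = [[random.random() for _ in range(DIM)] for _ in range(H)]
col_emb = [[random.random() for _ in range(DIM)] for _ in range(W)]

def axial_pos_embedding(pos):
    """Embedding for flat position `pos`: sum of its row and column vectors."""
    r, c = divmod(pos, W)
    return [a + b for a, b in zip(row_emb[r], col_emb[c])]

# one distinct embedding per position, from only H + W stored vectors
seq_embs = [axial_pos_embedding(p) for p in range(H * W)]
```

For a 64k-token sequence factored as 256 x 256, this stores 512 vectors instead of 65,536.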
Implementation of COCO-LM, Correcting and Contrasting Text Sequences for Language Model Pretraining, in Pytorch
Pytorch reimplementation of Molecule Attention Transformer, which uses a transformer to tackle the graph-like structure of molecules