The Enterprise-Grade, Production-Ready Multi-Agent Orchestration Framework. Website: https://swarms.ai
Plug-and-Play Implementation of Tree of Thoughts: Deliberate Problem Solving with Large Language Models that Elevates Model Reasoning by at least 70%
Implementation of "BitNet: Scaling 1-bit Transformers for Large Language Models" in PyTorch
Implementation of Alpha Fold 3 from the paper: "Accurate structure prediction of biomolecular interactions with AlphaFold3" in PyTorch
Implementation of plug-and-play Attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens"
Build high-performance AI models with modular building blocks
Democratization of RT-2 "RT-2: New model translates vision and language into action"
A novel implementation fusing ViT with Mamba into a fast, agile, high-performance Multi-Modal Model. Powered by Zeta, the simplest AI framework ever.
The open source implementation of Gemini, the model that will "eclipse ChatGPT" by Google
Implementation of the ScreenAI model from the paper: "A Vision-Language Model for UI and Infographics Understanding"
Effortless plug-and-play Optimizer to cut model training costs by 50%. A new optimizer that is 2x faster than Adam on LLMs.
Implementation of "PaLM-E: An Embodied Multimodal Language Model"
Integrating Mamba/SSMs with Transformer for Enhanced Long Context and High-Quality Sequence Modeling
Pytorch implementation of the models RT-1-X and RT-2-X from the paper: "Open X-Embodiment: Robotic Learning Datasets and RT-X Models"
Swarming algorithms like PSO, Ant Colony, Sakana, and more in PyTorch 😊
The World's First AI-Enabled Multi-Modality Native Search Engine
Implementation of "PaLM2-VAdapter" from the multi-modal model paper: "PaLM2-VAdapter: Progressively Aligned Language Model Makes a Strong Vision-language Adapter"
Implementation of M2PT in PyTorch from the paper: "Multimodal Pathway: Improve Transformers with Irrelevant Data from Other Modalities"
This collection brings together the highest-signal research papers in modern AI, from the invention of the Transformer to the frontier work of 2024–2025, into a single, curated map of the field.
An implementation of Tree-Attention in PyTorch because it's in JAX for some reason
This repository presents a production-grade implementation of a transformer-based text embedding model inspired by OpenAI's text-embedding-ada-002.
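One of the entries above mentions swarming algorithms such as PSO (particle swarm optimization). As a minimal sketch of what that family of algorithms does, here is a plain-Python PSO loop minimizing a user-supplied objective over a box; the function name `pso` and all hyperparameter defaults are illustrative choices, not the API of the repository itself.

```python
import random

def pso(f, dim, n_particles=30, iters=200, bounds=(-5.0, 5.0), seed=0):
    """Minimal particle swarm optimization (illustrative sketch):
    minimize f over a box [lo, hi]^dim."""
    rng = random.Random(seed)
    lo, hi = bounds
    # Random initial positions, zero initial velocities.
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]            # each particle's best-seen position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5              # inertia, cognitive, social weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Velocity update: pulled toward personal and global bests.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Position update, clamped to the search box.
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best, best_val = pso(lambda x: sum(xi * xi for xi in x), dim=3)
```

On a simple convex objective like the sphere function above, the swarm converges to near-zero loss within a couple hundred iterations; the repository's PyTorch versions vectorize these same updates over tensors.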