TransformerEngine

NVIDIA

A library for accelerating Transformer models on NVIDIA GPUs, including using 8-bit floating point (FP8) precision on Hopper, Ada and Blackwell GPUs, to provide better performance with lower memory utilization in both training and inference.

2.6k Stars
470 Forks
2.6k Watchers
Language: Python
License: Apache-2.0
Cost to Build: $571.3K
Market Value: $2.51M

Growth over time: 2 data points (Stars, Forks, Watchers), 2025-04-05 → 2025-08-03

How to clone TransformerEngine

Clone via HTTPS

git clone https://github.com/NVIDIA/TransformerEngine.git

Clone via SSH

git clone git@github.com:NVIDIA/TransformerEngine.git

Download ZIP

Download master.zip

Found an issue?

Report bugs or request features on the TransformerEngine issue tracker:

https://github.com/NVIDIA/TransformerEngine/issues