MII makes low-latency and high-throughput inference possible, powered by DeepSpeed.
What is the deepspeedai/DeepSpeed-MII GitHub project? Description: "MII makes low-latency and high-throughput inference possible, powered by DeepSpeed." Written in Python. Explain what it does, its main use cases, key features, and who would benefit from using it.
Clone via HTTPS
Clone via SSH
Download ZIP (master.zip)

Report bugs or request features on the DeepSpeed-MII issue tracker:
Open GitHub Issues