White paper & reproducible benchmark suite for LLM inference optimization on AMD MI300X using ROCm 6.1
What is the MayurVijayPatil/amd-llm-rocm GitHub project? Description: "White paper & reproducible benchmark suite for LLM inference optimization on AMD MI300X using ROCm 6.1". Written in Jupyter Notebook. Explain what it does, its main use cases, key features, and who would benefit from using it.
Clone the repository via HTTPS or SSH, or download it as a ZIP archive (master.zip).
Report bugs or request features on the amd-llm-rocm issue tracker (GitHub Issues).
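The access options above can be sketched as concrete commands. This is a minimal sketch assuming GitHub's standard URL scheme for the `MayurVijayPatil/amd-llm-rocm` owner/repo name given in the question; the exact default branch name (`master`) is taken from the ZIP file name above.

```shell
#!/bin/sh
# Owner/repo as named in the question; URLs below are derived from
# GitHub's standard URL scheme (an assumption, not taken from the page).
REPO="MayurVijayPatil/amd-llm-rocm"

HTTPS_URL="https://github.com/${REPO}.git"                          # Clone via HTTPS
SSH_URL="git@github.com:${REPO}.git"                                # Clone via SSH
ZIP_URL="https://github.com/${REPO}/archive/refs/heads/master.zip"  # Download master.zip

# Print the corresponding commands rather than running them,
# so the script is safe to execute without network access.
echo "git clone ${HTTPS_URL}"
echo "git clone ${SSH_URL}"
echo "curl -L -o master.zip ${ZIP_URL}"
```

Running the script prints the three commands; pick the one matching your setup (SSH requires a configured GitHub SSH key).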