chimezie/mlx-tuning-fork

Very basic framework for composable parameterized large language model (Q)LoRA / (Q)Dora fine-tuning using mlx, mlx_lm, and OgbujiPT.
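
mlx-tuning-fork builds on the LoRA tooling in mlx_lm. For orientation, a plain (Q)LoRA training run using mlx_lm alone looks roughly like this; the model name is a placeholder, and --data expects a directory containing train.jsonl/valid.jsonl files per mlx_lm's LoRA documentation:

python -m mlx_lm.lora \
    --model mlx-community/Mistral-7B-Instruct-v0.3-4bit \
    --train \
    --data ./data \
    --iters 600

mlx-tuning-fork layers its composable, parameterized configuration on top of this kind of run; see the repository's README for its actual interface and options.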

40 Stars
3 Forks
40 Watchers
Python Language
MIT License

How to clone mlx-tuning-fork

Clone via HTTPS

git clone https://github.com/chimezie/mlx-tuning-fork.git

Clone via SSH

git clone git@github.com:chimezie/mlx-tuning-fork.git

Download ZIP

Download master.zip: https://github.com/chimezie/mlx-tuning-fork/archive/refs/heads/master.zip
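
Install after cloning

Once cloned, a local pip install should work, assuming the repository ships standard Python packaging metadata (as a pip-installable Python project normally would):

cd mlx-tuning-fork
pip install .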

Found an issue?

Report bugs or request features on the mlx-tuning-fork issue tracker:

Open GitHub Issues: https://github.com/chimezie/mlx-tuning-fork/issues