-
The question is in the title: is there QLoRA support for Mixtral by any chance? I set the model to Mixtral-8x7B-Instruct-v0.1 and tried it out in my Jupyter Notebook, but got this error:
I assume from this that it's not supported :'(
-
Can you share the command? Mixtral should work.
Oh, that's possible. I would definitely use separate directories for each model you convert. You can specify the output directory for convert with `--mlx-path`.
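For instance, a conversion call might look like the sketch below. It assumes the mlx-examples `convert.py` script and its `--hf-path` and `-q` (quantize) flags; check `python convert.py --help` for the exact options in your checkout, since they have changed between versions.

```shell
# Hypothetical sketch: convert Mixtral to MLX format into its own directory.
# --hf-path and -q are assumptions from mlx-examples; verify against --help.
python convert.py \
    --hf-path mistralai/Mixtral-8x7B-Instruct-v0.1 \
    --mlx-path ./mlx_models/mixtral-8x7b-instruct \
    -q
```

Keeping each converted model in its own `--mlx-path` directory avoids one model's weights and config overwriting another's.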