Quantized version
#2 by t1u1 - opened
Can you please create a quantized version of this? 13GB is too large for consumer h/w
Thanks
@t1u1 I am uploading the quantized models in GGUF here: https://huggingface.co/MaziyarPanahi/ChatMusician-GGUF
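For anyone who wants to try one of those GGUF files locally, here is a minimal sketch using llama-cpp-python (not from this thread; the filename and parameters below are placeholders, so substitute the actual quantization level you download from the MaziyarPanahi/ChatMusician-GGUF repo):

```python
# Minimal sketch: loading a GGUF quantized ChatMusician file with llama-cpp-python.
# The model_path below is a placeholder; replace it with the file you downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="ChatMusician.Q4_K_M.gguf",  # placeholder filename
    n_ctx=2048,                             # context window; tune for your hardware
)

# Simple completion-style call; adjust the prompt format to whatever the model card recommends.
out = llm("Compose a short folk melody in ABC notation.", max_tokens=256)
print(out["choices"][0]["text"])
```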
Thanks for your interest. We will try to release a quantized model, probably in the coming weeks.