Do you happen to have a non-GPTQ version of this model so I could convert it to ggml for use with llama.cpp?
#1
by spanielrassler - opened
This is a GPTQ model, correct? Or am I wrong? Although I know GPTQ models can be converted to ggml, according to @TheBloke perplexity suffers noticeably when that conversion is done.
I don't really understand what's happening 'behind the scenes' with these models, so sorry if this doesn't make sense :) Long story short, I want to try this model in llama.cpp, and I need a ggml model to do so. I can do the conversion myself from .bin or .py files, but that's about it.
Here you go! :) The fp16 model:
https://huggingface.co/Monero/WizardLM-Uncensored-SuperCOT-StoryTelling-30b/ Looking forward to seeing the GGML versions!
Thanks!! I'll do my best. I'm not @TheBloke, so I'm not very good at this yet, but at least I should be able to put something up :)
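For anyone following along, the conversion itself is roughly two steps with llama.cpp's tooling: convert the fp16 HF checkpoint to ggml, then quantize. This is only a sketch, assuming a local llama.cpp checkout with its `quantize` binary built, and the fp16 weights already downloaded; the output filenames here are just illustrative:

```
# From inside the llama.cpp directory, with the fp16 model cloned locally.
# Step 1: convert the HF fp16 checkpoint to a ggml f16 file.
python convert.py ./WizardLM-Uncensored-SuperCOT-StoryTelling-30b \
  --outtype f16 --outfile wizardlm-supercot-30b.ggml.f16.bin

# Step 2: quantize (q4_0 shown; other quant types trade size for quality).
./quantize wizardlm-supercot-30b.ggml.f16.bin \
  wizardlm-supercot-30b.ggml.q4_0.bin q4_0
```

The resulting `.bin` can then be loaded directly with llama.cpp's `main` via `-m`.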
spanielrassler changed discussion status to closed