Possible to run with llama.cpp now?
#1 · opened by BoscoTheDog
Since llama.cpp recently added BitNet support, is it perhaps possible to run this model via llama.cpp by changing the config settings?
I haven't tried converting this model with llama.cpp. But if they support it, you should be able to run it after making the necessary changes.
Thanks. I've tried changing the config.json file to make it look like a BitnetForCausalLM model, but couldn't get it to work (yet), unfortunately. Still, good to know that in theory it could work.
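For anyone who wants to try the same thing, here is a minimal sketch of that config edit, assuming the model is checked out locally (`./my-bitnet-model` below is a hypothetical path) and that llama.cpp's HF converter dispatches on the `architectures` field in config.json:

```python
import json
from pathlib import Path

# Hypothetical local checkout of the model; adjust to your own path.
model_dir = Path("./my-bitnet-model")
config_path = model_dir / "config.json"

# Load the existing Hugging Face config.
config = json.loads(config_path.read_text())

# llama.cpp's convert script selects its model class based on the
# "architectures" field, so rewrite it to the BitNet architecture name.
# (Assumption: the rest of the config is already BitNet-compatible.)
config["architectures"] = ["BitnetForCausalLM"]

config_path.write_text(json.dumps(config, indent=2))
print("Patched", config_path)
```

After that, conversion would go through llama.cpp's usual HF-to-GGUF script (convert_hf_to_gguf.py in recent checkouts). Whether inference then actually works depends on the tensor names and quantization format matching what the BitNet support expects, which is presumably where this attempt got stuck.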
BoscoTheDog changed discussion status to closed.