How is this different from TheBloke's quants?
#1 opened by Samvanity
How is this different from TheBloke's quants?
Thank you!
We did not compare our models with TheBloke's. The motivation for building and maintaining our own quants is to serve our open-source project LlamaEdge. Unlike other GGUF models, the "Prompt type" and/or "Reverse prompt" fields are defined on the model card page (see also second-state/OpenChat-3.5-GGUF). These settings let us improve the run script so that users can launch LLMs automatically, without having to supply the prompt template themselves. Thanks!
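
For illustration only, here is a minimal Python sketch of how a launcher script *could* read those model card fields and turn them into a LlamaEdge run command. The field parsing, file names, and exact flags (`--prompt-template`, `--reverse-prompt`) are assumptions for this example and may differ from the actual second-state script:

```python
# Hypothetical sketch: read "Prompt type" / "Reverse prompt" from a model card
# and build a LlamaEdge (llama-chat.wasm) invocation. Not the official script.
import re

def read_model_card_fields(readme_text: str) -> dict:
    """Extract 'Prompt type' and 'Reverse prompt' values from model card text."""
    fields = {}
    for key in ("Prompt type", "Reverse prompt"):
        # Assumes the card lists fields like:  Prompt type: `openchat`
        match = re.search(rf"{key}\s*:\s*`?([^`\n]+)`?", readme_text)
        if match:
            fields[key] = match.group(1).strip()
    return fields

def build_run_command(model_file: str, fields: dict) -> list[str]:
    """Assemble a wasmedge command using the fields from the model card."""
    cmd = [
        "wasmedge",
        "--dir", ".:.",
        "--nn-preload", f"default:GGML:AUTO:{model_file}",
        "llama-chat.wasm",
        "--prompt-template", fields.get("Prompt type", "chatml"),
    ]
    if "Reverse prompt" in fields:
        cmd += ["--reverse-prompt", fields["Reverse prompt"]]
    return cmd

if __name__ == "__main__":
    card = open("README.md", encoding="utf-8").read()
    command = build_run_command("openchat_3.5.Q5_K_M.gguf",
                                read_model_card_fields(card))
    # Print the command instead of executing it, so the sketch stays harmless.
    print("Would run:", " ".join(command))
```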