Wrong config?

#1 by Nephilim - opened

All this model gives me is weird tokens as output; it doesn't follow the input.
[screenshot: image.png]

Works fine for me; make sure your llama.cpp is up to date:

$ ./main -m ./WestLake-7B-v2-laser-truthy-dpo-Q4_K_M.gguf -i -p "Write a story about llamas"
[...]
 Write a story about llamas and the wild west.

In the heart of the Wild West, there existed a hidden valley untouched by time or mankind. This unique terrain was shrouded in mystery, with only rumours echoing through whispers around the campfires at night. The local cowboys would tell stories of a land where llamas outnumbered people ten to one. They spoke of a magical place with vibrant grasslands, clear blue lakes and majestic snowcapped mountains that seemed to stretch into infinity.
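
If your build is older, updating a source checkout is roughly this (a sketch assuming the classic Makefile build; newer revisions of llama.cpp use CMake instead):

$ cd llama.cpp
$ git pull
$ make clean && make
# or, for a CMake-based checkout:
$ cmake -B build && cmake --build build --config Release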

Here is the generation result. I'm using text-generation-webui on the latest version, so I think llama.cpp is on the latest version too.
[screenshot: image.png]
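
(Note: text-generation-webui loads GGUF files through the llama-cpp-python binding, so the bundled llama.cpp can lag behind the upstream repo; with the webui's Python environment active, this shows which binding version is installed:)

$ pip show llama-cpp-python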

Something is wrong with your setup or with the model file. Try downloading the model again to make sure it isn't corrupted. Your screenshot says 20 tokens were generated, yet the output shown is far more than 20 tokens. Also make sure other GGUF models work with your setup.
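
To check for corruption, one option is to compare the file's SHA-256 with the checksum shown on the model's "Files and versions" page (the filename below is just the Q4_K_M example from above):

$ sha256sum WestLake-7B-v2-laser-truthy-dpo-Q4_K_M.gguf
# compare the result against the SHA256 listed for the file on Hugging Face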

The 20 tokens are from a generation after that one. Yes, other GGUF models work well with the same config, even the same template. I'll try downloading it again.
