I'm running this model with llama.cpp and passing it a prompt, but nothing is returned. Any suggestions?
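For reference, this is the kind of invocation I mean (the model path and parameters below are placeholders, not my exact command):

```shell
# Hypothetical example: run a GGUF model with llama.cpp's CLI.
# "model.gguf" is a placeholder path; -n caps the number of tokens to generate.
# Older llama.cpp builds name this binary ./main instead of ./llama-cli.
./llama-cli -m model.gguf -p "Hello, how are you?" -n 128
```

If nothing is printed at all (not even the model load log), the problem is likely the model path or a failed load rather than the prompt itself.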