The tokenizer_config.json is not correct?

#2
by CISCai - opened

It just seems to be DeepSeek's; the prompt template and special tokens don't match the example code in your README.md.
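For context, a quick way to see which template actually ships with the tokenizer (the repo ID below is an assumption for illustration, substitute the actual model repo):

```python
from transformers import AutoTokenizer

# Repo ID assumed for illustration; swap in the actual model repo.
tokenizer = AutoTokenizer.from_pretrained("gorilla-llm/gorilla-openfunctions-v2")

# Prints the Jinja chat template bundled in tokenizer_config.json;
# at the time of this report it appeared to be DeepSeek's template.
print(tokenizer.chat_template)
```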

Gorilla LLM (UC Berkeley) org

Yes, thanks for catching this! We really appreciate it, and sorry for the late response. We found that different packages apply different chat templates, so we haven't updated the official chat_template; instead, we recommend developers use our official get_prompt(.) function as the ground truth to avoid inconsistencies. We'll update the system prompt in the chat_template from DeepSeek to Gorilla LLM.
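For anyone hitting this in the meantime, a minimal sanity check is to diff the tokenizer's built-in template against the README's get_prompt(.) output. This is a sketch: the repo ID is assumed, and the get_prompt stub below is a placeholder, not the official implementation (copy the real one from the README):

```python
from transformers import AutoTokenizer

MODEL_ID = "gorilla-llm/gorilla-openfunctions-v2"  # assumed repo ID
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)

query = "What's the weather like in Boston?"
messages = [{"role": "user", "content": query}]

# Prompt rendered by the chat_template shipped in tokenizer_config.json.
templated = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

# Placeholder stub: replace with the official get_prompt(.) from the README.
def get_prompt(user_query, functions=None):
    raise NotImplementedError("Copy the official get_prompt from the README")

# Once get_prompt is the official implementation, the two strings should
# match exactly; any diff means the chat_template is not the ground truth.
# print(templated == get_prompt(query))
```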

We'll close this issue. Let us know if you have additional questions or concerns and we'll reopen this thread. Thanks again!

CharlieJi changed discussion status to closed
