Chat template?

#1
by BoscoTheDog - opened

Just to be sure: does this model use the Mistral chat template? The model card doesn't mention it, but does imply it.

i.e. `<|im_start|>` / `<|im_end|>`

H2O.ai org

The model uses a custom chat template, which is described in the tokenizer_config.json (https://huggingface.co/h2oai/h2o-danube2-1.8b-chat/blob/main/tokenizer_config.json#L33). This is automatically picked up when using the transformers pipeline.

For more information about chat templates, visit https://huggingface.co/docs/transformers/main/en/chat_templating
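For anyone who wants to verify the rendered prompt themselves, here is a minimal Python sketch (assuming the `transformers` library and the repo id from the link above) that loads the template from tokenizer_config.json and renders a single turn:

```python
from transformers import AutoTokenizer

# Loads the custom chat template shipped in tokenizer_config.json
tokenizer = AutoTokenizer.from_pretrained("h2oai/h2o-danube2-1.8b-chat")

messages = [{"role": "user", "content": "Why is drinking water so healthy?"}]

# tokenize=False returns the rendered prompt string;
# add_generation_prompt=True appends the <|answer|> tag so the model starts replying.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)  # expected: <|prompt|>Why is drinking water so healthy?</s><|answer|>
```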

H2O.ai org

Added an example to the model card: `<|prompt|>Why is drinking water so healthy?</s><|answer|>`

psinger changed discussion status to closed

Thanks, now I know where to look for that stuff!

Unfortunately I can't use the transformers pipeline to manage this, since I'm trying to incorporate Danube in a 100% browser-based project. So I have to build the templating engine myself. That's why a clear example of what a finished template should look like would be so helpful.

For example, see https://github.com/ngxson/wllama

Whoa, psinger read my mind :-) Thanks!

Is it all on one line? So even if there are multiple questions and answers it remains on one line?

H2O.ai org

yeah one line:

```
<|prompt|>Why is drinking water so healthy?</s><|answer|>It is healthy...</s><|prompt|>ok but....</s><|answer|>
```

etc
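For the browser case above, where the transformers pipeline isn't available, a renderer for this format is only a few lines. This is a hypothetical, dependency-free Python sketch of the multi-turn format shown above (user turns wrapped in `<|prompt|>...</s>`, assistant turns in `<|answer|>...</s>`, no system turn or BOS handling), easy to port to JavaScript:

```python
# Hypothetical helper name; renders the Danube2 chat format shown above.
def render_danube2_prompt(messages):
    """messages: list of {"role": "user" | "assistant", "content": str}."""
    out = []
    for m in messages:
        tag = "<|prompt|>" if m["role"] == "user" else "<|answer|>"
        out.append(f"{tag}{m['content']}</s>")
    # End with the answer tag so the model generates the next reply.
    out.append("<|answer|>")
    return "".join(out)

print(render_danube2_prompt([
    {"role": "user", "content": "Why is drinking water so healthy?"},
    {"role": "assistant", "content": "It is healthy..."},
    {"role": "user", "content": "ok but...."},
]))
# <|prompt|>Why is drinking water so healthy?</s><|answer|>It is healthy...</s><|prompt|>ok but....</s><|answer|>
```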

My compliments on the model by the way. Now that the prompt is working, it's doing really well for its size (Q5), and has become a favourite for me. I hope you will make a version with an even bigger context.
