Chat templates for this fine-tuned model

#1
by jiangchengchengNLP - opened

If you want to serve this model with vLLM or Transformers, you should be careful with the chat template, because it is a fine-tuned model.
Actually, we can use GLM's template to achieve the same effect as the original LLM.

Here is an example:
{%- set counter = namespace(index=1) -%}
{%- for message in messages -%}
{%- if message['role'] == 'user' -%}
{{- '[Round ' + counter.index|string + ']\n\n问:' + message['content'] -}}
{%- set counter.index = counter.index + 1 -%}
{%- endif -%}
{%- if message['role'] == 'assistant' -%}
{{- '\n\n答:' + message['content'] -}}
{%- if (loop.last and add_generation_prompt) or not loop.last -%}
{{- '\n\n' -}}
{%- endif -%}
{%- endif -%}
{%- endfor -%}

{%- if add_generation_prompt and messages[-1]['role'] != 'assistant' -%}
{{- '\n\n答:' -}}
{%- endif -%}

The template above comes from: https://github.com/vllm-project/vllm/blob/main/examples/template_chatglm2.jinja

See this post for more detail: https://huggingface.co/blog/zh/chat-templates
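To sanity-check a template like the one above before wiring it into vLLM or Transformers, you can render it directly with jinja2 (this is essentially what both libraries do internally). A minimal sketch, assuming jinja2 is installed; the conversation content is made up:

```python
from jinja2 import Environment

# The GLM-style chat template from the post, kept verbatim.
# A raw string keeps the '\n' escapes intact for the Jinja lexer.
GLM_TEMPLATE = r"""
{%- set counter = namespace(index=1) -%}
{%- for message in messages -%}
{%- if message['role'] == 'user' -%}
{{- '[Round ' + counter.index|string + ']\n\n问:' + message['content'] -}}
{%- set counter.index = counter.index + 1 -%}
{%- endif -%}
{%- if message['role'] == 'assistant' -%}
{{- '\n\n答:' + message['content'] -}}
{%- if (loop.last and add_generation_prompt) or not loop.last -%}
{{- '\n\n' -}}
{%- endif -%}
{%- endif -%}
{%- endfor -%}
{%- if add_generation_prompt and messages[-1]['role'] != 'assistant' -%}
{{- '\n\n答:' -}}
{%- endif -%}"""

template = Environment().from_string(GLM_TEMPLATE)

# A made-up two-round conversation to exercise the round counter.
messages = [
    {"role": "user", "content": "你好"},
    {"role": "assistant", "content": "你好"},
    {"role": "user", "content": "头痛怎么办"},
]
prompt = template.render(messages=messages, add_generation_prompt=True)
print(prompt)
# [Round 1]
#
# 问:你好
#
# 答:你好
#
# [Round 2]
#
# 问:头痛怎么办
#
# 答:
```

Rendering with `add_generation_prompt=True` appends the trailing `答:` so the model continues as the assistant; that is the prompt string you should see vLLM build per request.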

FreedomAI org
edited Jul 29

@jiangchengchengNLP
Hello, the prompt for HuatuoGPT2 differs slightly from that of GLM. You can use the following chat template:

{%- for message in messages -%}
    {%- if (message['role'] == 'user') != (loop.index0 % 2 == 0) -%}
        {{ raise_exception('Conversation roles must alternate user/assistant/user/assistant/...') }}
    {%- endif -%}
    
    {%- if message['role'] == 'user' -%}
        {{ '<问>:' + message['content'] + '\n' }}

    {%- elif message['role'] == 'assistant' -%}
        {{ '<答>:' + message['content'] + '\n' }}
    {%- endif -%}
{%- endfor -%}
{%- if add_generation_prompt -%}
    {{- '<答>:' -}}
{% endif %}
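You can check this template the same way with a local jinja2 render. One detail: `raise_exception` is a helper that transformers injects into the template environment, so we have to supply our own stand-in here. A sketch, with made-up conversation content:

```python
from jinja2 import Environment
from jinja2.exceptions import TemplateError

# The HuatuoGPT2 chat template from the reply above, kept verbatim.
HUATUO_TEMPLATE = r"""
{%- for message in messages -%}
    {%- if (message['role'] == 'user') != (loop.index0 % 2 == 0) -%}
        {{ raise_exception('Conversation roles must alternate user/assistant/user/assistant/...') }}
    {%- endif -%}
    {%- if message['role'] == 'user' -%}
        {{ '<问>:' + message['content'] + '\n' }}
    {%- elif message['role'] == 'assistant' -%}
        {{ '<答>:' + message['content'] + '\n' }}
    {%- endif -%}
{%- endfor -%}
{%- if add_generation_prompt -%}
    {{- '<答>:' -}}
{% endif %}"""

def raise_exception(message):
    # Stand-in for the helper transformers provides to chat templates.
    raise TemplateError(message)

env = Environment()
env.globals["raise_exception"] = raise_exception
template = env.from_string(HUATUO_TEMPLATE)

prompt = template.render(
    messages=[
        {"role": "user", "content": "最近总是失眠"},
        {"role": "assistant", "content": "建议规律作息"},
        {"role": "user", "content": "谢谢"},
    ],
    add_generation_prompt=True,
)
print(prompt)

# Two user turns in a row trip the alternation check.
alternation_enforced = False
try:
    template.render(
        messages=[{"role": "user", "content": "a"},
                  {"role": "user", "content": "b"}],
        add_generation_prompt=False,
    )
except TemplateError:
    alternation_enforced = True
```

In real use you would not render by hand: `tokenizer.apply_chat_template(messages, chat_template=HUATUO_TEMPLATE, add_generation_prompt=True, tokenize=False)` accepts the template string directly, and vLLM's OpenAI-compatible server can load it from a `.jinja` file via its `--chat-template` option.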

Great, this will be very helpful for exploring the model.
