System Prompt
Hi, I love Mistral. I have a use case where I want to use a system prompt during a conversation, but I saw that Mistral does not accept {"role": "system"}. Is there any way to do this?
Try the following (the imports and the model/tokenizer setup at the top are assumed; point them at whichever Mistral checkpoint you actually use):

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-Instruct-v0.1"  # example checkpoint, swap in your own
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)
device = "cuda" if torch.cuda.is_available() else "cpu"

sys_prompt = "You are a helpful assistant who always provides an explanation. Think like you are answering a five year old."
prompt = "What kind of task would test someone's ability to perform physical reasoning?"

# Wrap the system prompt in ChatML-style markers and pack everything into a single user turn.
prefix = "<|im_start|>"
suffix = "<|im_end|>\n"
sys_format = prefix + "system\n" + sys_prompt + suffix
user_format = prefix + "user\n" + prompt + suffix
assistant_format = prefix + "assistant\n"
input_text = sys_format + user_format + assistant_format

messages = [
    {"role": "user", "content": input_text},
]

encodeds = tokenizer.apply_chat_template(messages, return_tensors="pt")
model_inputs = encodeds.to(device)
model.to(device)

generated_ids = model.generate(model_inputs, max_new_tokens=1000, do_sample=True)
decoded = tokenizer.batch_decode(generated_ids)
print(decoded[0])
Use this format:
<s>[INST] System Prompt + Instruction [/INST] Model answer</s>[INST] Follow-up instruction [/INST]
From their official site.
https://docs.mistral.ai/usage/guardrailing
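To make that template concrete, here is a minimal sketch in plain Python of how you might stitch a system prompt, past exchanges, and a new instruction into that format (the function and variable names, and the messages in the example, are just for illustration):

def build_mistral_prompt(system_prompt, turns):
    # turns: list of (user_message, model_answer) pairs; pass None as the answer
    # of the last pair to leave the prompt open for the model to complete.
    # The system prompt is prepended to the first instruction only.
    prompt = "<s>"
    for i, (user_message, model_answer) in enumerate(turns):
        instruction = f"{system_prompt} {user_message}" if i == 0 else user_message
        prompt += f"[INST] {instruction} [/INST]"
        if model_answer is not None:
            prompt += f" {model_answer}</s>"
    return prompt

# Example with made-up messages:
print(build_mistral_prompt(
    "Explain everything like I am five years old.",
    [("What is physical reasoning?", "It is thinking about how objects move and interact."),
     ("Give me an example task.", None)],
))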
If you want it for Gradio, I wrote a formatting function; remove/add parts as needed.
def format_chat_prompt_mistral(message: str, chat_history, instructions: str) -> str:
    if len(chat_history) == 0:
        # If chat_history is empty, return the instructions plus the current message
        prompt = f"<s>[INST] {instructions} Hi [/INST] Hello! How can I help you?</s>[INST] {message} [/INST]"
        print("sending this prompt\n==============\n", prompt, "\n---------\n")
        return prompt
    else:
        # Start the chat history text with the first user message and the instructions
        user_message, bot_message = chat_history[0]
        chat_history_text = f"<s>[INST] {instructions} {user_message} [/INST] {bot_message}</s>"
        # Append the rest of the chat history
        chat_history_text += "".join(f"[INST] {user_message} [/INST] {bot_message}</s>" for user_message, bot_message in chat_history[1:])
        # Finally append the current message, left open for the model to answer
        prompt = chat_history_text + f"[INST] {message} [/INST]"
        print("sending this prompt\n==============\n", prompt, "\n---------\n")
        return prompt
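For example, with a short history in Gradio's [(user, bot), ...] format (the messages below are purely illustrative), a call could look like this:

history = [
    ("What is physical reasoning?", "It is thinking about how objects move and interact."),
]
prompt = format_chat_prompt_mistral(
    "Give me an example task.",
    history,
    "Explain everything like I am five years old.",
)
# prompt now ends with "[INST] Give me an example task. [/INST]", ready to be sent to the model.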
I collected the official chat templates in this repo. You can use them with the apply_chat_template method.
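A minimal sketch of how one of those templates could be plugged in, assuming a ChatML-style template string (the checkpoint name and the template below are only placeholders; use the actual template from the repo):

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-Instruct-v0.1")  # example checkpoint

# Stand-in for a template taken from the repo; the real ones may differ.
chatml_template = (
    "{% for message in messages %}"
    "{{ '<|im_start|>' + message['role'] + '\\n' + message['content'] + '<|im_end|>\\n' }}"
    "{% endfor %}"
    "{% if add_generation_prompt %}{{ '<|im_start|>assistant\\n' }}{% endif %}"
)
tokenizer.chat_template = chatml_template  # overrides the tokenizer's default template

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What kind of task would test physical reasoning?"},
]
text = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(text)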