Update README.md
I was facing this error:

ValueError: Input length of input_ids is 20, but max_length is set to 20. This can lead to unexpected behavior. You should consider increasing max_length or, better yet, setting max_new_tokens.
So to fix this, I have added an explicit max_length to the example:

chatbot = pipeline("text-generation", model="mistralai/Mistral-Nemo-Instruct-2407", max_length=30)
@Xenova, kindly check this too.
Looks good! Although I would suggest using max_new_tokens=128 (or something similar).
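The difference between the two parameters explains the error above: max_length bounds the prompt plus the generated text, while max_new_tokens bounds only the generated text. A quick arithmetic sketch (no model download needed; the numbers mirror the error message) makes the point:

```python
# max_length counts prompt tokens AND generated tokens together,
# so a 20-token prompt with max_length=20 leaves no room to generate.
prompt_len = 20
max_length = 20
room_with_max_length = max_length - prompt_len  # 0 tokens left to generate

# max_new_tokens is a budget for generated tokens only,
# independent of how long the prompt is.
max_new_tokens = 128
room_with_max_new_tokens = max_new_tokens  # always 128, regardless of prompt

print(room_with_max_length)      # -> 0
print(room_with_max_new_tokens)  # -> 128
```

This is why max_new_tokens=128 avoids the ValueError even for long prompts, whereas max_length=30 would fail again as soon as a prompt reaches 30 tokens.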
Hello,
I am new, but does setting max_new_tokens=128 mean that if the token length of the input string is more than 128 (measured with len(self.tokenizer(input_text)['input_ids'])), the message should be divided into chunks?

If yes, would that mean sending the chatbot the array of divided messages?

self.chatbot([my_arr_ay_of_inputs_divided], max_new_tokens=128)