`model_kwargs` are not used by the model: ['token_type_ids']
Hello.
I am trying to use your model instead of meta-llama; is it possible? I got the following error message:

The following `model_kwargs` are not used by the model: ['token_type_ids'] (note: typos in the generate arguments will also show up in this list)
Depending on which kwargs the tokenizer passes into your call, some will be unused by the model, and unused args make `model.generate()` raise exactly this error.
"It's better to specify them explicitly in the model call and leave out the unused one:
replace **inputs with input_ids=inputs['input_ids'], attention_mask=inputs['attention_mask']
the extra arg is returned by the tokenizer."
Source: @steremma, https://huggingface.co/OpenAssistant/falcon-40b-sft-mix-1226/discussions/2#649ab1ad807a6d1fea274772
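To make that advice concrete, here is a minimal sketch of the workaround. The model name is the one from this thread; the prompt and generation settings are placeholder assumptions:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "OpenAssistant/falcon-40b-sft-mix-1226"  # model from this thread
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

inputs = tokenizer("Hello, who are you?", return_tensors="pt")  # placeholder prompt
# `inputs` also contains 'token_type_ids', which this model does not accept,
# so model.generate(**inputs) raises the error above. Pass only the tensors
# the model actually uses:
outputs = model.generate(
    input_ids=inputs["input_ids"],
    attention_mask=inputs["attention_mask"],
    max_new_tokens=50,  # placeholder generation setting
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Equivalently, since the tokenizer output behaves like a dict, you can drop the offending key before unpacking: `inputs.pop("token_type_ids", None)` and then call `model.generate(**inputs, max_new_tokens=50)`.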
This also happens because you most probably have an old version of transformers; it was fixed in https://github.com/huggingface/transformers/pull/24042, so upgrading with `pip install -U transformers` should resolve it as well.
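If you are unsure which version you have, you can check it before upgrading (this only prints the installed version; which release first shipped the fix is not stated in this thread):

```python
import transformers

# Print the installed version; if it predates the fix in PR #24042,
# upgrade with `pip install -U transformers`.
print(transformers.__version__)
```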