ValueError: The following `model_kwargs` are not used by the model: ['token_type_ids'] (note: typos in the generate arguments will also show up in this list)
#2 opened by Imran1
I fine-tuned the model using PEFT, but when I load the model for inference I get the following error:
ValueError: The following `model_kwargs` are not used by the model: ['token_type_ids'] (note: typos in the generate arguments will also show up in this list)
I got the same error
I solved the error, but the model's performance is very bad. I don't know why every model shows this behavior.
Set `return_token_type_ids=False` in the `tokenizer()` call.
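To see what that flag changes, here is a minimal sketch using a hypothetical stand-in function that mimics only the relevant part of a tokenizer's interface (a real `transformers` tokenizer is not required for the illustration):

```python
# Hypothetical stand-in mimicking a tokenizer's output dict (illustration only).
def fake_tokenizer(text, return_token_type_ids=True):
    ids = list(range(len(text.split())))
    enc = {"input_ids": ids, "attention_mask": [1] * len(ids)}
    if return_token_type_ids:
        # This is the key that generate() rejects for models like Falcon.
        enc["token_type_ids"] = [0] * len(ids)
    return enc

with_ids = fake_tokenizer("hello world")
without_ids = fake_tokenizer("hello world", return_token_type_ids=False)
print("token_type_ids" in with_ids)     # True
print("token_type_ids" in without_ids)  # False
```

With the flag set to False, the encoding no longer contains `token_type_ids`, so nothing unexpected is forwarded to `generate`.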
If you, for some reason, don't have control over the `generate` call but can pass down or modify the tokenizer, you can patch `_call_one` on the tokenizer:
from functools import wraps

# Keep a reference to the original method before patching it.
org_call_one = tokenizer._call_one

@wraps(org_call_one)
def _call_one_wrapped(*args, **kwargs):
    # Override the flag on every call so token_type_ids are never returned.
    kwargs['return_token_type_ids'] = False
    return org_call_one(*args, **kwargs)

tokenizer._call_one = _call_one_wrapped
This forces the tokenizer to never return token type ids.
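As a self-contained demonstration of the patch, the sketch below applies the same wrapping to a hypothetical stand-in class that routes its `__call__` through `_call_one`, the way Hugging Face tokenizers do (the class itself is invented for illustration):

```python
from functools import wraps

# Hypothetical stand-in with the same _call_one indirection as a tokenizer.
class FakeTokenizer:
    def _call_one(self, text, return_token_type_ids=True, **kwargs):
        ids = list(range(len(text.split())))
        enc = {"input_ids": ids}
        if return_token_type_ids:
            enc["token_type_ids"] = [0] * len(ids)
        return enc

    def __call__(self, text, **kwargs):
        return self._call_one(text, **kwargs)

tokenizer = FakeTokenizer()

# The patch from the thread, applied verbatim.
org_call_one = tokenizer._call_one

@wraps(org_call_one)
def _call_one_wrapped(*args, **kwargs):
    kwargs["return_token_type_ids"] = False
    return org_call_one(*args, **kwargs)

tokenizer._call_one = _call_one_wrapped

enc = tokenizer("hello world")
print("token_type_ids" in enc)  # False: the patch stripped the key
```

Because `org_call_one` is captured as a bound method before the instance attribute is replaced, the wrapper can still reach the original behavior while forcing the flag off on every call.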
FalconLLM changed discussion status to closed