Example code does not work
I tried running the example code in Colab and got this error:
```
ValueError                                Traceback (most recent call last)
in <cell line: 10>()
      8 model.cuda()
      9 inputs = tokenizer("import torch\nimport torch.nn as nn", return_tensors="pt").to("cuda")
---> 10 tokens = model.generate(
     11     **inputs,
     12     max_new_tokens=48,

2 frames
/usr/local/lib/python3.10/dist-packages/transformers/generation/utils.py in _validate_model_kwargs(self, model_kwargs)
   1153
   1154         if unused_model_args:
-> 1155             raise ValueError(
   1156                 f"The following model_kwargs are not used by the model: {unused_model_args} (note: typos in the"
   1157                 " generate arguments will also show up in this list)"

ValueError: The following model_kwargs are not used by the model: ['token_type_ids'] (note: typos in the generate arguments will also show up in this list)
```
This is what ChatGPT had to say:
The error you're seeing suggests that the model you are using for generation does not make use of the argument `token_type_ids`. This can happen, for example, if you are using a model architecture or configuration that doesn't require or support segment embeddings, which is the purpose of `token_type_ids`.
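As a quick illustration of where the stray argument comes from (a minimal sketch; `bert-base-uncased` is just an example of a checkpoint whose tokenizer emits `token_type_ids` by default, not the model from the issue):

```python
from transformers import AutoTokenizer

# Some tokenizers (e.g. BERT-style ones) return token_type_ids by default.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
inputs = tokenizer("import torch\nimport torch.nn as nn", return_tensors="pt")

# If 'token_type_ids' shows up here but the model's forward() does not
# accept it, model.generate(**inputs) raises the ValueError above.
print(inputs.keys())  # dict_keys(['input_ids', 'token_type_ids', 'attention_mask'])
```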
Here's how to resolve this:
- When tokenizing your inputs, ensure that you're not producing `token_type_ids` if they're not used by your model. You can do this by passing `return_token_type_ids=False` in your tokenizer call.
For example, if you're using the Hugging Face Transformers library:

```python
inputs = tokenizer(
    "import torch\nimport torch.nn as nn",
    return_tensors="pt",
    return_token_type_ids=False,  # don't emit token_type_ids
).to("cuda")
```
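An alternative that works regardless of tokenizer settings (a sketch, assuming `inputs` is the `BatchEncoding` returned by the call above) is to drop the key before calling `generate`:

```python
# BatchEncoding is dict-like, so the unused key can simply be removed.
inputs.pop("token_type_ids", None)

tokens = model.generate(**inputs, max_new_tokens=48)
```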
- If you're sure your model does use `token_type_ids` but the error persists, make sure you're using the right model and tokenizer pair. Some model architectures or configurations might not use `token_type_ids`.
- Double-check that the model you're using is compatible with the `generate` method. Not all models support generation. For example, if you're trying to generate with a standard BERT model, you'll encounter problems, as BERT isn't designed for sequence-to-sequence tasks out of the box.
- Ensure that you're using the latest version of the library. Sometimes bugs or incompatibilities can be resolved by simply updating the library. If you're using Google Colab:
```
!pip install transformers --upgrade
```
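Putting the pieces together, here's a minimal end-to-end sketch of the fix. The checkpoint name is a hypothetical stand-in, since the issue doesn't say which model the example code uses:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "your-org/your-model"  # hypothetical: substitute the example's checkpoint

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)
model.cuda()

# return_token_type_ids=False keeps the tokenizer from emitting the
# argument that model.generate() rejects.
inputs = tokenizer(
    "import torch\nimport torch.nn as nn",
    return_tensors="pt",
    return_token_type_ids=False,
).to("cuda")

tokens = model.generate(**inputs, max_new_tokens=48)
print(tokenizer.decode(tokens[0]))
```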
Lastly, it's always a good idea to check the official documentation or GitHub issues of the library you're using to see if others have encountered a similar problem and if there's a known solution or workaround.