Question answering task with falcon model fails with "TypeError: forward() got an unexpected keyword argument 'token_type_ids'"
#75 · opened by karolzak13
Hey all!
I really wanted to try out one of the Falcon models on a question answering task, so I followed this tutorial in the docs: Question answering (huggingface.co), since the top of that page says it's applicable to Falcon models.
So I reused the code, switched the distilbert model to falcon-7b, and now my code fails with this error message:
TypeError: forward() got an unexpected keyword argument 'token_type_ids'
My code looks like this:
from transformers import AutoModelForQuestionAnswering
model_name = "tiiuae/falcon-7b"
model = AutoModelForQuestionAnswering.from_pretrained(
    pretrained_model_name_or_path=model_name,
    cache_dir="/mnt/tmp",
    trust_remote_code=True,
)
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained(
    pretrained_model_name_or_path=model_name,
    cache_dir="/mnt/tmp",
    trust_remote_code=True,
    return_tensors="pt",
)
from transformers import pipeline
question_answerer = pipeline(
    "question-answering",
    model=model,
    tokenizer=tokenizer,
)
question_answerer(question=question, context=context)
The code fails at the very last step, when I run inference through the pipeline. Any ideas?
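For what it's worth, one workaround I've seen suggested for this class of error is to tokenize manually and pop the unsupported key before calling the model (whether that is enough for Falcon here is an assumption on my part). A minimal sketch, using a plain dict with made-up values in place of the real BatchEncoding, which also supports dict-style .pop:

```python
# Stand-in for the tokenizer's output; values are hypothetical.
inputs = {
    "input_ids": [[101, 2040, 102]],
    "attention_mask": [[1, 1, 1]],
    "token_type_ids": [[0, 0, 0]],
}

# Drop the key that falcon's forward() does not accept.
inputs.pop("token_type_ids", None)

print(sorted(inputs))  # -> ['attention_mask', 'input_ids']
```

With the real objects this would be followed by model(**inputs), but that bypasses the pipeline's post-processing.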
I'm not sure you can use the HF pipeline for this kind of extractive QA with Falcon, since Falcon is a decoder-only model and isn't expected to produce the start/end logits the question-answering pipeline relies on. Do you get any errors when you run:
model = AutoModelForQuestionAnswering.from_pretrained(
    pretrained_model_name_or_path=model_name,
    cache_dir="/mnt/tmp",
    trust_remote_code=True,
)
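If that loads without errors but the pipeline still fails, an alternative that fits a decoder-only model is to phrase QA generatively through the text-generation pipeline. A sketch, where the prompt format and generation parameters are my own assumptions rather than an official recipe:

```python
def build_prompt(question: str, context: str) -> str:
    """Format context + question as a plain-text prompt for a causal LM."""
    return f"Context: {context}\nQuestion: {question}\nAnswer:"

# Hypothetical usage (commented out: falcon-7b is a large download and
# needs substantial GPU memory):
# from transformers import pipeline
# generator = pipeline(
#     "text-generation",
#     model="tiiuae/falcon-7b",
#     trust_remote_code=True,
#     device_map="auto",
# )
# result = generator(
#     build_prompt(question, context),
#     max_new_tokens=50,
#     return_full_text=False,
# )
# print(result[0]["generated_text"].strip())

print(build_prompt("Who wrote it?", "Ada wrote the memo."))
```

Note that the base falcon-7b model isn't instruction-tuned, so answer quality with this kind of prompt may vary.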