different pad tokens across tokenizers result in errors

#190
by Clement - opened

Hello,

The two tokenizers in this repository have differing pad tokens, as the snippet below illustrates:
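A minimal sketch to reproduce the mismatch. I am assuming this discussion concerns the SDXL base checkpoint with the usual `tokenizer` / `tokenizer_2` subfolders; adjust the repo id if that assumption is wrong:

```python
from transformers import CLIPTokenizer

# Assumed repo id; swap in the checkpoint this discussion belongs to.
repo = "stabilityai/stable-diffusion-xl-base-1.0"

tok1 = CLIPTokenizer.from_pretrained(repo, subfolder="tokenizer")
tok2 = CLIPTokenizer.from_pretrained(repo, subfolder="tokenizer_2")

# The pad tokens (and their ids) differ between the two tokenizers.
print(tok1.pad_token, tok1.pad_token_id)  # '<|endoftext|>', 49407
print(tok2.pad_token, tok2.pad_token_id)  # '!', 0
```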

In some cases, such as using the compel library with certain prompts, this mismatch breaks generation.
See this related issue on compel: https://github.com/damian0815/compel/issues/94
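To illustrate why the mismatch matters: once both prompts are padded to the same length, the trailing token ids differ between the two tokenizers, so any code that expects the padded sequences to line up can fail. A short sketch, continuing from the snippet above:

```python
# Pad the same prompt to the model max length (77 for CLIP tokenizers)
# with each tokenizer and compare the trailing ids.
ids1 = tok1("a photo of a cat", padding="max_length",
            max_length=tok1.model_max_length).input_ids
ids2 = tok2("a photo of a cat", padding="max_length",
            max_length=tok2.model_max_length).input_ids

print(ids1[-3:])  # e.g. [49407, 49407, 49407] -- padded with '<|endoftext|>'
print(ids2[-3:])  # e.g. [0, 0, 0]             -- padded with '!'
```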

I am aware of this related issue on transformers, but its resolution does not seem clear: https://github.com/huggingface/transformers/issues/24925

Can someone clarify why the pad tokens differ, whether (and how) they can be made identical, or whether there is another fix?
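For what it's worth, one workaround that seems plausible (my own assumption, not an official fix) is to override the second tokenizer's pad token at load time so both pad with the same id:

```python
# Possible workaround sketch: make tokenizer_2 pad with the same token as
# tokenizer. The two CLIP tokenizers appear to share a vocabulary, so the
# string maps to the same id; whether text_encoder_2 handles this pad id
# gracefully is exactly the question above.
tok2.pad_token = tok1.pad_token
print(tok2.pad_token, tok2.pad_token_id)  # now '<|endoftext|>', 49407
```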

Thank you,
